Privacy Pt 2: Python Politics: the Economics of Knowledge Accumulation

[Image: clothed man arrested by naked guys]
“A constricting snake like a boa or a python kills its prey by suffocation. It uses the momentum of its strike to throw coils around its victim’s body. Then, it squeezes. Every time the prey exhales, the snake squeezes a little more tightly. Soon, the victim can’t breathe any more.” ref

Essentially, then, constrictors kill not by crushing but by taking in the slack and not giving it back. I view this as relevant to the topic at hand.

History: the Internet Wants to Be Free.

We seem to have generated a cultural lock-in that demands a free Internet. This is the result of a combination of factors: successful initiatives, ideological leanings (sometimes based on the semantic ambiguity between free as unregulated and free as without cost), and significant self-delusion.

There is a component of this freedom that is real. The freedom exists in a commons collectively built and shared. It exists in the group effort to build standards and protocols to connect us and to share and display information. It exists in the collection of individual visions, not always mutually compatible, that maintain and extend this effort. It exists in our ability to design technical hacks to undermine systems of control.

Wikipedia is a good exemplar of this Internet.

Beyond that, it’s not free at all.

Our vision is embodied in components, all of which have both capital and maintenance costs.

The initial freedom was built on simple design hacks and the low bandwidth required to move text around. It sat on top of large computing projects in academia and government and survived on crumbs snatched off the table. You might have to pay a bit for your hook to the infrastructure, but content was provided mostly without cost. This was the world of bulletin boards, IRC, Usenet groups, email, and text-based MUDs.

This model has been extended with increasingly elaborate ‘free’ content and services: Google, YouTube, Flickr, Facebook, Pandora, Spotify, Twitter, and so on. Meanwhile, temporary successes in the paid model, e.g. AOL, generally seem to erode back to free.

This apparent continuity masks a simple fact: the old model didn’t scale. We benefit from increasingly costly products. A new funding source…advertising, of course…by and large pays the bill.

Google is the exemplar, here.

Which brings us to the common wisdom: if the product appears free, then you’re the product.

The true coin of this transaction is information about us individually and collectively.

The economics of targeted advertising is conceptually pretty simple. Information is gathered on you to give you advertising you’re likely to respond to. Or perhaps to provide a more tailored service that will make you more engaged and hence bring you to the advertising more frequently.
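To make the incentive concrete, here’s a toy sketch of the arithmetic (all names and numbers invented; real ad systems are vastly more elaborate). The more attributes a profile exposes, the better the estimated match to an ad, and the more an impression is worth:

    # Toy model of targeted-advertising economics (all numbers invented).
    # A richer profile gives a better match estimate, which raises the
    # expected click-through rate and so the value of the impression.

    def match_score(profile, ad_keywords):
        """Fraction of the ad's keywords found in the profile."""
        hits = sum(1 for kw in ad_keywords if kw in profile)
        return hits / len(ad_keywords)

    def expected_revenue(profile, ad_keywords, bid_per_click=0.50,
                         base_ctr=0.001, ctr_lift=0.02):
        """Expected revenue per impression: click-through rate times bid."""
        ctr = base_ctr + ctr_lift * match_score(profile, ad_keywords)
        return ctr * bid_per_click

    ad = {"camping", "hiking", "minnesota"}
    sparse = {"hiking"}
    rich = {"hiking", "camping", "minnesota", "kayaks", "rei"}

    print(expected_revenue(sparse, ad))  # ~0.0038 per impression
    print(expected_revenue(rich, ad))    # ~0.0105 per impression

Multiply that difference by billions of impressions and the pressure to sharpen the picture needs no further explanation.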

With Google, this was an add-on that grew from humble beginnings. Now the business plans of start-ups are often designed specifically to elicit information. Products are built around data mining and targeted advertising.

We gain significant benefit from this model and here we, perhaps, become complicit.

This march toward greater public visibility has generated other elements of common wisdom that are more suspect.

You often hear the following: “Privacy is dead. Things are different than they were in the past. We need to get over it and adjust to new realities. Privacy is already fairly illusory, so upset is unjustified; just look at the targeted direct mail of the past.” And so on.

To my mind, an appeal to the hopelessness of implementing change, regardless of that change’s desirability, somehow lacks punch. We live in a world that’s changing all the time. We live in a world in which an improved technical solution or compelling idea can turn things upside down in considerably less than a decade. And we are inventing that technology every day.

There is a danger that I think it is our responsibility to recognize and attempt to address.

My argument is —

  1. There’s an unremitting economic pressure to create a higher and higher resolution picture of you, what you do, and what you might consume.
  2. We’ll hit a point (and are already there in some locations) where you’ll stand out as suspicious simply by not being transparent. Against this fine-grained picture, the blurry individuals stand out as likely criminals or revolutionaries (see the sketch after this list).
  3. The most likely to succeed in hiding from scrutiny behind a false front is the deliberate criminal with a carefully constructed public identity. The most likely to fail is the accidental revolutionary…the well-meaning public citizen who is compelled to move into some sort of rebellion.
  4. Here’s the NYRB on Chinese dissident Fang Lizhi:

Fang’s path through life observed a pattern that is common to China’s dissidents: a person begins with socialist ideals, feels bitter when the rulers betray the ideals, resorts to outspoken criticism, and ends in prison or exile. Liu Binyan, Wang Ruowang, Su Xiaokang, Hu Ping, Zheng Yi, Liu Xiaobo, and many others have followed this pattern.
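Point 2, by the way, requires nothing clever. Here is a minimal sketch (invented data, arbitrary threshold) of the query that falls out of a high-resolution dataset: rank everyone by how much data they shed, and flag the quietest tail as anomalous.

    # Toy "blurriness" detector (invented data and thresholds).
    # Given per-person counts of observed data points, flag anyone whose
    # visibility falls in the bottom tail of the population.

    def flag_blurry(data_points, fraction=0.05):
        """Return the people whose observed-data counts fall in the
        lowest `fraction` of the population."""
        ranked = sorted(data_points, key=data_points.get)
        cutoff = max(1, int(len(ranked) * fraction))
        return ranked[:cutoff]

    observed = {
        "alice": 3411,   # card swipes, posts, location pings, ...
        "bob": 2987,
        "carol": 3120,
        "dmitri": 12,    # sparse profile: flagged
    }

    print(flag_blurry(observed))  # ['dmitri']

The blurry individual is not hidden; he is highlighted.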

Let’s return to a police action in Minnesota cited in the PS to Part 1.

I suspect that these raids proved highly effective in disrupting the protest, and that they would have been less so had the naive folks raided actually been up to criminal activity and taken steps to keep backup materials and tools hidden away somewhere. In other words, if they had been the sort of people who justified a disruptive police action, then that action would likely have been ineffective.

Is there an Engineer in the house? How about an ESL Instructor?

We are building, brick by brick, a consolidation of power through an increasingly fine-grained consolidation of data. We live in a world where a major portion of the US security apparatus has disappeared behind a curtain and is accountable, apparently, mostly to itself. And a world where individual political rights to speak, meet, and organize are not widely acknowledged. That intersection creates a problem that is not, I think, insignificant. The constraint on oppressive regimes has often been reach, and we, in our work on information technologies, are eliminating some significant constraints.

What’s the solution? Is there one?

Privacy Pt 1: Youth Wants To Know
Privacy Pt 3: Is there an Engineer in the house?
