Privacy Pt 1: Youth Wants To Know

Discussion of privacy on the Internet, and, in particular, in social media, hit a local maximum a short time ago in the nym wars on Google+, against a background of increasing public concern. Pronouncements by pundits and officials, some informed and most ignorant, roll steadily on. None of that had pushed me to add my two cents to the mix.

What finally took me over the edge was hearing yet another 20-year-old explain that his generation doesn’t have the same concern for privacy that we old folks so quaintly maintain. The implication is that, because they’re from a younger generation, the young are better adapted to current conditions.

There is another possible explanation, however. It follows.

First, let me get my geezer persona on.

Allen Ginsberg's Exorcism of the Pentagon

Now then, youngsters, I’m from a generation that routinely showed up naked in public and engaged in such low-profile activities as chanting to levitate the Pentagon. This was dropped onto an America barely emerging from the ’50s and massively less receptive to living publicly. Nudity was performance art aimed at letting some sunlight into the cloister.

Allow me to assure you that my concerns are not an attachment to some arbitrary, bygone, meaningless convention like, for example, not ending a sentence with a preposition. They are, instead, the result of years of observation and balanced analysis…primarily of the government. More on that later.

You kids may have a different concept of the value of privacy, but that concept is not different from the one I held as a kid. Experience, one of the few advantages of aging, has made me wiser.

I’d argue that privacy concerns come down not to how you feel about living in public, but to how you feel about powerful institutions – governmental organizations, primarily, but any institution large enough to have significant information-gathering resources – having easy access to your life’s details.

I think providing that access is dangerous.

Rather than get into point/counter-point or personal biography, I’d like to perform a thought experiment.

Draw an axis of government benignity. Start in, say, Amsterdam and end in Pyongyang. Route it through six or eight cities you think span the spectrum of openness. (Let’s pick Berlin, Beijing, Lagos, Sao Paulo, Moscow, San Francisco, Tehran, Istanbul, and London for the purposes of this experiment.) Put them on the axis wherever you think they belong. Now, imagine yourself living in Amsterdam. Assume you and your tribe share thoughts, interesting books and websites, music, purchases, shops, and restaurants via a social medium or two. Sometimes you flirt; sometimes you trade ideas and opinions; sometimes you blow off steam.

(Exclude for the purposes of this experiment the information streams created by your cell phone and credit card.)

Move the slider along until having all that information public becomes a significant threat to your well-being…until you cross a line where your behavior is now suspect and where, simultaneously, you have revealed whom you associate with and much of your daily routine. If you get comfortably all the way to Pyongyang, you need to get a life. Commonly, at some point midway along the line, you’re in danger of loss of income, harassment, discrimination, imprisonment, or disappearance for any of multiple reasons.

Of course, you say, you’re not in Pyongyang but near Amsterdam on the line…safe perhaps in San Francisco with me, and these concerns aren’t appropriate. Or are they?

At issue? These media are primarily built here but deployed globally. What seems benign locally becomes a tool for the police state with a slight shift of context. Regardless of local comfort, our systems are better measured against a global yardstick.

Specifically, as the creators, early adopters, critics, and evangelists for these systems, we have a responsibility to take predictable consequences quite seriously as we design, create, implement, and sell. We are building history, brick by brick, and broad currents flow through us and our shapings.

We need to take that seriously.

At least now and then.

It will be of general benefit if we lay down cover for our brothers and sisters who are genuinely under siege. Perhaps they want to practice Qigong in Beijing or discuss their Armenian heritage in Istanbul…or draw political cartoons in Amsterdam.

The issues aren’t necessarily easy to engineer. We need to wrestle, for example, with the interface between criminality and dissent, and the interplay of persona and authenticity. But it’s up to us to do that rather than ignore complexity and the likely paths into the future. I have some ideas that might help in this discussion. Probably you do, too.

I’m urging us to design and discuss privacy in a wider scope–spatially, temporally, morally–and remove it from a discussion of personal preference.

In Brian Eno’s terms, we need to think in a broader here and a longer now.

Privacy Pt 2: Python Politics: the Economics of Knowledge Accumulation
Privacy Pt 3: Is there an Engineer in the house?

PS – A few notes in passing.

The US is not necessarily a safe haven.

Consider http://www.nytimes.com/2007/12/23/washington/23habeas.html?ref=us in light of the recent National Defense Authorization Act. Or police raids to preemptively undermine protests at the 2008 Republican convention: http://www.salon.com/2008/08/30/police_raids/

You may not be a Judi Bari (they arrest you and search everywhere you hang out after someone bombs your car) or organizing widespread protest in Minneapolis (they show up the week before and take all your stuff), but note: systems integration is changing the game for law enforcement everywhere. It might not be J. Edgar Hoover you need to worry about but some local middle-aged, portly DA with too-tight shoes, a sullen teenager, a nasty mortgage, and a depressed spouse, who’s pissed off because some 20-year-old is driving a car that costs his yearly salary.

Finally, here’s a bit of gratuitous Chuang Tzu – the topic is gathering up treasures for thieves: http://oaks.nvg.org/zhuangzi10-.html

Privacy Pt 2: Python Politics: the Economics of Knowledge Accumulation

clothed man arrested by naked guys
“A constricting snake like a boa or a python kills its prey by suffocation. It uses the momentum of its strike to throw coils around its victim’s body. Then, it squeezes. Every time the prey exhales, the snake squeezes a little more tightly. Soon, the victim can’t breathe any more.” (ref)

Essentially, then, constrictors kill not by crushing but by taking in the slack and not giving it back. I view this as relevant to the topic at hand.

History: the Internet Wants to Be Free.

We seem to have generated a cultural lock-in that demands a free Internet. This is the result of a combination of factors: successful initiatives, ideological leanings (sometimes based on the semantic ambiguity between free as unregulated and free as without cost), and significant self-delusion.

There is a component of this freedom that is real. The freedom exists in a commons collectively built and shared. It exists in the group effort to build standards and protocols to connect us and to share and display information. It exists in the collection of individual visions, not always mutually compatible, that maintain and extend this effort. It exists in our ability to design technical hacks to undermine systems of control.

Wikipedia is a good exemplar of this Internet.

Beyond that, it’s not free at all.

Our vision is embodied in components all of which have both capital and maintenance costs.

The initial freedom was built on simple design hacks and the low bandwidth required to move text around. It sat on top of large computing projects in academia and government and survived on crumbs snatched off the table. You might have to pay a bit for your hook to the infrastructure, but content was provided mostly without cost. This was the world of bulletin boards, IRC, Usenet groups, email, and text-based MUDs.

This model has been extended with increasingly elaborate ‘free’ content and services: Google, YouTube, Flickr, Facebook, Pandora, Spotify, Twitter, and so on. Meanwhile, temporary successes in the paid model, e.g. AOL, generally seem to erode back to free.

This apparent continuity masks a simple fact: the old model didn’t scale. We benefit from increasingly costly products. A new funding source…advertising, of course…by and large pays the bill.

Google is the exemplar, here.

Which brings us to the common wisdom: if the product appears free, then you’re the product.

The true coin of this transaction is information about us individually and collectively.

The economics of targeted advertising is pretty simple conceptually. Information is gathered on you to give you advertising you’re likely to respond to. Or perhaps to provide you a more tailored service that will make you more engaged and hence bring you to the advertising more frequently.

With Google this was an add-on that grew from humble beginnings. Now the business plans of start-ups are often designed specifically to elicit information. Products are built around data mining and targeted advertising.

We gain significant benefit from this model and here we, perhaps, become complicit.

This march toward greater public visibility has generated other elements of common wisdom that are more suspect.

You often hear the following: “Privacy is dead. Things are different than in the past. We need to get over it and adjust to new realities. Privacy is already fairly illusory, therefore upset is unjustified; just look at the targeted direct mail of the past.” And so on.

To my mind, an appeal to the hopelessness of implementing change, regardless of that change’s desirability, somehow lacks punch. We live in a world that’s changing all the time. We live in a world in which an improved technical solution or compelling idea can turn things upside down in considerably less than a decade. And we are inventing that technology every day.

There is a danger that I think it is our responsibility to recognize and attempt to address.

My argument is —

  1. There’s an unremitting economic pressure to create a higher and higher resolution picture of you, what you do, and what you might potentially consume.
  2. We’ll hit a point (and are already there in some locations) where you’ll stand out as suspicious by not being transparent. Given this fine-grained picture, not being highly visible will become increasingly suspicious. The blurry individuals then stand out as very likely criminals or revolutionaries.
  3. The most likely to succeed in hiding from scrutiny behind a false front is the deliberate criminal with a carefully constructed public identity. The most likely to fail is the accidental revolutionary…the well-meaning public citizen who is compelled to move into some sort of rebellion.
  4. Here’s the NYRB on Chinese dissident Fang Lizhi:

Fang’s path through life observed a pattern that is common to China’s dissidents: a person begins with socialist ideals, feels bitter when the rulers betray the ideals, resorts to outspoken criticism, and ends in prison or exile. Liu Binyan, Wang Ruowang, Su Xiaokang, Hu Ping, Zheng Yi, Liu Xiaobo, and many others have followed this pattern.

Let’s return to a police action in Minnesota cited in the PS to Part 1.

I suspect that these raids proved highly effective in disrupting the protest, and that they would have been less so had the naive folks raided actually been up to criminal activity and taken steps to keep backup materials and tools hidden away somewhere. In other words, had they been the sort of people who justified a disruptive police action, that action would likely have been ineffective.

Is there an Engineer in the house? How about an ESL Instructor?

We are building, brick by brick, a consolidation of power through an increasingly fine-grained consolidation of data. We live in a world where a major portion of the US security apparatus has disappeared behind a curtain and is accountable, apparently, mostly to itself. And a world where individual political rights to speak, meet, and organize are not widely acknowledged. The result of that intersection creates a problem that is not, I think, insignificant. The constraint on oppressive regimes has often been reach, and we, in our work on information technologies, are eliminating some significant constraints.

What’s the solution? Is there one?

Privacy Pt 1: Youth Wants To Know
Privacy Pt 3: Is there an Engineer in the house?