Source: The New York Times
Some years ago an engineer at Google told me why Google wasn’t collecting information linked to people’s names. “We don’t want the name. The name is noise.” There was enough information in Google’s large database of search queries, location, and online behavior, he said, that you could tell a lot about somebody through indirect means.
The point was that actually finding out people’s names isn’t necessary for sending them targeted ads. It can probably lead to trouble, as Google’s own adventures in Wi-Fi snooping show. Even without your name, increasingly, everything about you is out there. The question of whether and how to guard your privacy in an online world we are building up every day has become increasingly urgent.
“Privacy is a source of tremendous tension and anxiety in Big Data,” says Danah Boyd, a senior researcher at Microsoft Research. Speaking last week at a conference on Big Data at the University of California, Berkeley, she said, “It’s a general anxiety that you can’t pinpoint, this odd moment of creepiness.” She asked, “Is this moving toward a society that we want to build?”
If conventional understanding chafes at the idea that our names are mere noise, consider the challenge in Ms. Boyd’s point about the self in a highly networked society. Take personal genetic data. “If I give away data to 23andMe, I’m giving away some of my brother’s data, my mother’s data, my future kid’s data.” For that matter, “Who owns the e-mail chain between you and me?”
Privacy is not a universal or timeless quality. It is redefined by who one is talking to, or by the expectations of the larger society.
In some countries, a woman’s ankle is a private matter; in some times and places, sexual orientations away from the norm are deeply private, or publicly celebrated. Privacy, Ms. Boyd notes, is not the same as security or anonymity. It is an ability to have control over one’s definition within an environment that is fully understood.
Something, arguably, no one has anymore.
“Defaults around how we interact have changed,” she said. “A conversation in the hallway is private by default, public by effort. Online, our interactions become public by default, private by effort.”
There are other ways in which we can lose control of our privacy now. By triangulating different sets of data (you are suddenly asking lots of people on LinkedIn for endorsements of you as a worker, and on Foursquare you seem to be checking in at midday near a competitor’s location), people can now conclude things about you (you’re probably interviewing for a job there) that are radically different from what either set of public information shows on its own.
What is to be done? Ms. Boyd has made a specialty of studying young people’s behavior on the Internet. She says they are now often seeking power over their environment through misdirection, such as continually creating and deleting Facebook accounts, or through steganography, a cryptographic term for hiding things in plain sight by obscuring their true meaning. “Someone writes, ‘I’m sick and tired of all this,’ and it gets ‘liked’ by 32 people,” she said. “When I started doing my fieldwork I could tell you what people were talking about. Now I can’t.”
That is a placeholder solution, and Ms. Boyd sees only one certainty for which we should prepare. “Regulation is coming,” she says. “You may not like it, you may close your eyes and hold your nose, but it is coming.”
The issue is what the regulation looks like, and how well it is considered. “Technologists need to re-engage with regulators,” she says. “We need to get to a model where we really understand usage.”
Right now, even among the highest geek circles, “we have very low levels of computational literacy, data literacy, media literacy, and all of these are contributing to the fears.”