Monday, January 11, 2010

Why Facebook is Wrong: Privacy Is Still Important

Facebook founder Mark Zuckerberg told a live audience this weekend that the world has changed, that it has become more public and less private, and that the site's controversial new defaults, along with the settings that can no longer be made private, reflect how Facebook would work if he were to create it today. Not everyone agrees with his move or his justification.



Has society become less private or is it Facebook that's pushing people in that direction? Is privacy online just an illusion anyway? Below are some thoughts, based primarily on the pro-privacy reactions to Zuckerberg's statements from many of our readers this weekend. Though there is a lot to be said for analysis of public data (more on that later), I believe that Facebook is making a big mistake by moving away from its origins based on privacy for user data.



In Facebook's early days, and for the vast majority of the site's life, its primary differentiator was that your data was visible only to other users whose friend requests you had approved. As of mid-December, Facebook users were no longer allowed to hide certain information from the web at large, including their profile photos, their list of friends, and their interests in the form of the fan pages they follow. Text, photo, and video updates shared on the site have always been private by default (visible to friends only), but if you had never changed your privacy settings before last month, Facebook suggested you switch them to make those updates publicly visible to everyone. That became the new default.



Here are three reasons why making some of this data public by requirement and some public by default is the wrong thing to do and why society is not in fact changing the way that Zuckerberg claims it is.



Evolving Preferences Don't Justify Elimination of Choice



Mark Zuckerberg might be right: people probably are becoming more comfortable telling the world at large about more and different parts of their lives. But why does that mean it's OK to take away people's choices and force them to make some of their information public all the time? That just doesn't make sense.



Privacy is a fundamental human right and while that may seem less true when we're operating on corporate turf like Facebook, Facebook used to be based on privacy. Why give it up so easily? (Isn't it a cause for concern that so much of our civic interaction now goes on through this and other corporate channels?)



It's very hard to believe that the hundreds of millions of mainstream Facebook users want to throw their privacy out the window - and if Facebook believes they do, why not ask them clearly?



Privacy Doesn't Just Mean Secrecy



This summer we wrote about the academic research of University of Massachusetts-Amherst Legal Studies student Chris Peterson, who argues that an accurate and contemporary understanding of privacy is based more on the integrity of context than on absolute secrecy. Peterson tackles the contemporary reality of privacy on Facebook in a very readable draft thesis paper titled Saving Face: The Privacy Architecture of Facebook (PDF).



Peterson argues that the idea that anything published ought to be understood as intended for public distribution is an antiquated notion from the era when publishing was expensive and required a lot of effort. The opposite is true today: publishing is free and easy, so information at many different levels of appropriateness for public eyes is being published. Why not support that?



"There was of course no way of knowing whether you were being watched at any given moment... It was even conceivable that they watched everybody all the time.

But at any rate they could plug into your wire whenever they wanted to. You had to live - did live, from habit that became instinct - in the assumption that every sound you made was overheard, and except in darkness, every movement scrutinized." - George Orwell, 1984

Instead of what Facebook is doing, Peterson says that a more appropriate understanding of privacy today is based on context. We expect our communication to go on in an appropriate context (no drinking in church or praying in the bar) and we expect to understand how our communication will be distributed.



If a college friend took photos of you drinking in a bar and showed them off to people in church, you might feel your privacy has been violated in both appropriateness and distribution. The bar is a public place, though, and not completely secret. Thus the need for a more sophisticated understanding of privacy that is more than mere secrecy.



By pushing your personal information and conversations, in the form of activity updates, fully into the public, Facebook is eliminating any integrity of context that these conversations would naturally have. Posted updates can be directed only to limited lists of Facebook contacts, like college buddies or work friends, but that option is buried under more public default options, and much of a user's activity on the site is not subject to that kind of control.



Facebook founder Mark Zuckerberg used to say that people would share more information if they felt comfortable knowing it would only be visible to people they trusted. He told me in an interview two years ago that users couldn't take their data off of the site, even if they wanted to, because privacy control "is the vector around which Facebook operates." Now, apparently, he's changed his mind. This weekend I argued that his justification for the new stance is not credible.



Many People Need Control Over Personal Information



Do people no longer need to keep access to some of their personal information online limited just to trusted friends? Facebook seems to be arguing that they don't.



There is a long list of people who clearly do, though, including people who've escaped abusive relationships, people with marginalized religious beliefs or sexual preferences, people who fear losing their jobs, and people who've been pushed around by bullies throughout their lives. That list adds up to a very large portion of the world, in fact. The Ivy League elites who run Facebook might see no reason to control access to their own personal information, but that's because they are far less socially vulnerable than most people and thus have far less need for that control.



Consider this comment left by one of our readers in response to Zuckerberg's statement.

"As a person who is being stalked for being an innocent bystander in a child custody case, I can tell you that losing my choices over what is searchable or not is huge. I have nothing to hide nor be ashamed of but the loss of choice for my privacy has hit home in a poignant manner."



Stories like that are far more common than you might think and removing user control over what's public removes the ability for millions of people to safely participate on Facebook.



More than millions - tens or even hundreds of millions of people around the world have reason to limit the visibility of their personal information on the open web while still wanting to share that information with trusted contacts. Facebook became a huge success on that premise and ought to be able to continue to thrive without doing a 180-degree turn on privacy.



Coming soon: The positive side of Facebook data made public.







