If we want to protect privacy, we should be more clear about why it is important.
Our privacy is now at risk in unprecedented ways, but, sadly, the legal system is lagging behind the pace of innovation. Indeed, the last major privacy law, the Electronic Communications Privacy Act, was passed in 1986! While an update to the law -- spurred on by the Petraeus scandal -- is in the works, it aims only to add more protection to electronic communications like email. That still does not shield our privacy from the other, possibly nefarious, ways our data can be collected and put to use. Some legislators would much rather not have legal restrictions that could, as Rep. Marsha Blackburn stated in an op-ed, "threaten the lifeblood of the Internet: data." Consider Rep. Blackburn's remarks during an April 2010 Congressional hearing: "[A]nd what happens when you follow the European privacy model and take information out of the information economy? ... Revenues fall, innovation stalls and you lose out to innovators who choose to work elsewhere."
Even though the practices of many companies such as Facebook are legal, there is something disconcerting about them. Privacy should have a deeper purpose than the one ascribed to it by those who treat it as a currency to be traded for innovation -- which, in many circumstances, actually seems to mean corporate interests. To protect our privacy, we need a better understanding of its purpose and why it is valuable.
That's where Georgetown University law professor Julie E. Cohen comes in. In a forthcoming article for the Harvard Law Review, she lays out a strong argument that addresses the titular concern "What Privacy Is For." Her approach is fresh, and as technology critic Evgeny Morozov rightly tweeted, she wrote "the best paper on privacy theory you'll get to read this year." (He was referring to 2012.)
At bottom, Cohen's argument criticizes the dominant position held by theorists and legislators who treat privacy as just an instrument used to advance some other principle or value, such as liberty, inaccessibility, or control. Framed this way, privacy is relegated to one of many defenses we have from things like another person's prying eyes, or Facebook's recent attempts to ramp up its use of facial-recognition software and collect further data about us without our explicit consent. As long as the principle in question can be protected through some other method, or if privacy gets in the way of a different desirable goal like innovation, it is no longer useful and can be disregarded.
Cohen doesn't think we should treat privacy as a dispensable instrument. To the contrary, she argues privacy is irreducible to a "fixed condition or attribute (such as seclusion or control) whose boundaries can be crisply delineated by the application of deductive logic. Privacy is shorthand for breathing room to engage in the process of ... self-development."
What Cohen means is that since life and contexts are always changing, privacy cannot be reductively conceived as one specific type of thing. It is better understood as an important buffer that gives us space to develop an identity that is somewhat separate from the surveillance, judgment, and values of our society and culture. Privacy is crucial for helping us manage all of these pressures -- pressures that shape the type of person we are -- and for "creating spaces for play and the work of self-[development]." Cohen argues that this self-development allows us to discover what type of society we want and what we should do to get there, both factors that are key to living a fulfilled life.
Woodrow Hartzog and Evan Selinger make similar arguments in a recent article on the value of "obscurity." When structural constraints prevent unwanted parties from getting to your data, obscurity protections are in play. These protections go beyond preventing companies from exploiting our information for their financial gain. They safeguard democratic societies by furthering "autonomy, self-fulfillment, socialization, and relative freedom from the abuse of power."
In light of these considerations, what's really at stake in a feature like Facebook's rumored location-tracking app? You might think it is a good idea to willfully hand over your data in exchange for personalized coupons or promotions, or to broadcast your location to friends. But consumption -- perusing a store and buying stuff -- and quiet, alone time are both important parts of how we define ourselves. If how we do these things becomes subject to ever-present monitoring, it can, even if unconsciously, change our behaviors and self-perception.
In this sense, we will be developing an identity formed not in privacy but under surveillance; we must decide whether we really want to live in a society that treats every action as a data point to be analyzed and traded like currency. The more we allow constant tracking, the more difficult it becomes to change the way these technologies encroach on our lives.
Privacy is not just something we enjoy. It is something that is necessary for us to: develop who we are; form an identity that is not dictated by the social conditions that directly or indirectly influence our thinking, decisions, and behaviors; and decide what type of society we want to live in. Whether we like it or not, constant data collection about everything we do -- like the kind conducted by Facebook and an increasing number of other companies -- shapes and produces our actions. We are different people when under surveillance than we are when enjoying some privacy. And Cohen's argument illuminates how the breathing room provided by privacy is essential to being a complete, fulfilled person.