Happy Mother’s Day

I feel like celebrating…that’s progress.

This day has been hard for the last few years. I’m happy to report this year feels different. Lighter. But because I am otherwise occupied this beautiful Sunday, this semester’s term paper will do double duty here on the blog. The culmination of my ethical education is below.


Is Privacy Still Possible?

Recent advances in technology make it possible to track every moment of one’s day digitally, down to each individual heartbeat if desired. In turn, questions about the bounds of privacy have emerged with an added wrinkle. The concept of privacy, nearly impossible to pin down with a concise definition given its subjective nature, has previously been defined by declaring instead what is public. Both the public and private spheres have a fair amount of nuance, and by clearly declaring the “public domain” we could more easily discern the private one. Now, with social media, health trackers, GPS, security cameras, and a host of other life-measuring technology, a clear demarcation between what is private and public has become nonexistent. Your phone, containing a public marketplace, follows you all the way inside your house and can theoretically transmit whether you are standing in your kitchen or your bedroom to any interested party trying to sell frying pans and bedsheets. Some call this new era the surveillance economy: corporations collect profits from the data derived from our private lives. While the law catches up on deciding if and how corporations and governments can monitor our daily data, I am left wondering whether the concept of a right to privacy exists as it used to. Can we still claim a right to privacy when the space between our public and private spheres has been fundamentally erased? Under these emerging circumstances, if we still want to claim that right, we’ll need to define privacy differently.
In her theory of contextual integrity, Helen Nissenbaum defines “a right to privacy [as] neither a right to secrecy nor a right to control but a right to appropriate flow of information.” (2010) If we are willing participants in the age of the internet (googling who voices an animated character from our seat in the theater, sharing an image of our lunch on social media, and tracking our daily jog via GPS), is it reasonable to expect our privacy should still be protected?  

A degree of privacy is needed for individuals to control who can access their physical bodies, abilities, and sensitive personal information. All are essential to maintaining human dignity, a moral necessity according to Kantian ethics. People have the right to control how and whether they present themselves to others. It should be up to the individual to determine how she wants to be, think, and act. (Sax, 2018) Powerful predictive algorithms monitoring user behavior make it possible for platforms to sway their users’ thoughts and actions in one direction or another (e.g., you scroll past a link for a sleep aid at 2 a.m.), interfering with this basic freedom. Additionally, a public/private distinction is needed to determine an appropriate dominion for government authority. People tend to alter their behavior if they are aware they are being monitored, and the possibility of democratic self-determination cannot exist without protection of privacy. (Roessler & DeCew, 2023) Eagerness to share our lives publicly, combined with often-confusing and ever-changing privacy controls, has led to apathy among users. People care about protecting their data privacy but have an inadequate understanding of how to do it, and so they do nothing (Hargittai and Marwick, 2016) while more and more of the private realm is relinquished.

In many modern cases, users willingly enter information about themselves, through content uploads or interactions, into the servers that run their software. It is reasonable to expect that companies should not earn profits by selling personal data to third parties, but it is less clear that they shouldn’t earn profits by honing their own algorithms. If a company can use the data users willingly create on its platform or app to improve results for customers buying ad space, that could be considered part of the business model. In his economic critique of privacy, Richard Posner claims privacy should only be protected when a violation would reduce its value. (Roessler & DeCew, 2023) Corporations could argue they add value to the user experience through personalization that benefits users’ wellbeing. The counterargument is whether privacy’s value is greater than a really cute pair of sandals. The prevailing argument for informational privacy is that everyone has the right to be let alone. (Warren and Brandeis, 1890) Users should be able to expect a certain level of autonomy online without the threat of manipulation. Perhaps the balance lies in how obviously advertisements or other behavioral adjustments are signaled to the user.

Not everyone defines privacy as the appropriate flow of information. Communitarian thought centers privacy on specific practices relevant to the community and is less concerned with the personal privacy of the individual. (Roessler & DeCew, 2023) This type of thinking helps make the case for government surveillance of public exchanges that may incite hate or violence toward specific faiths, in the interest of public safety. Reductionists, meanwhile, claim privacy violations are defined by an imposition of emotional distress or damage to property interests (Roessler & DeCew, 2023), meaning innocuous data collection or surveillance would be fair game, as the user is unlikely to be emotionally harmed by targeted advertising.

Ruth Gavison says an individual enjoys perfect privacy when he is completely inaccessible to others (1980), but modern technology accesses us almost everywhere we go. The private sphere has in some ways absorbed the public one, and so defining privacy by the flow of information between the two is, I think, a smart way to go about it. “There are laws that govern health records, educational records, and even video rentals, but no laws that specifically protect social media profiles or health data collected by wearables or fitness apps…the current patchwork of privacy regulations in the United States is inadequate and remains one step behind technological development.” (Hargittai and Marwick, 2016) Often the claims about the value of privacy are normative. In practice, it is paradoxical to expect a government or corporation to be responsible for the privacy of data surrendered through acceptance of a use agreement (agreements so convoluted they are ethical dilemmas in themselves) and simultaneously expect that data to be shielded from the view of said government or corporation. The needed legislation should specify which actions are allowed concerning collected data and require protections against any threat to data privacy from bad actors. Users should also be offered the option of an algorithmically neutral experience, allowing them to maintain a free, autonomous life.

“The value of privacy should be understood in terms of its contribution to society.” (Solove, 2008) The reason we value privacy is that it allows for intimacy in our personal relationships, which “essentially depend on the ability to share certain information which we would not share with those not in the relationship.” (Roessler & DeCew, 2023) With all the data of our personal lives online, we have made intimacy less possible in reality. As a relational species, we gain self-consciousness only through a process of mutual recognition. (Gheaus, 2022) Our intimate personal relationships are necessary for collective human flourishing. Considered this way, privacy becomes a community good, acknowledging benefits to both the individual and the collective and creating an incentive to preserve privacy wherever possible. Nissenbaum’s theory of contextual integrity breaks informational data down by five parameters (sender, recipient, data subject, transmission principle, and type of information) intended to remove some of the ambiguity surrounding data privacy. (Grodzinsky, 2011) With such a system in place, data could be more easily managed or obscured depending on its level of privacy sensitivity; for example, data from your heart rate monitor would be shared with your physician but not with advertisers. Users could rely on these parameters to moderate the firehose flow of their data. Until such a system is a requirement for data collection, users remain on the hook for navigating data privacy on their own, from inside an environment designed to obfuscate.

As technological monitoring becomes even more ubiquitous, across both large-scale environments (smart cities) and individual specificity (facial recognition and biometrics), the threat to our privacy rights will also grow. Without serious consideration of, and restrictions protecting, the sovereign needs of both groups and individuals, violations of privacy via our data will continue to undermine our values and autonomy in the online space, to the detriment of our society.


Bibliography
  • Gavison, Ruth, 1980, “Privacy and the Limits of Law”, Yale Law Journal, 89(3): 421–471.
  • Gheaus, Anca, “Personal Relationship Goods”, The Stanford Encyclopedia of Philosophy (Winter 2022 Edition), Edward N. Zalta & Uri Nodelman (eds.), URL = <https://plato.stanford.edu/archives/win2022/entries/personal-relationship-goods/>.
  • Grodzinsky, F. and H. Tavani, 2011, “Privacy in ‘the Cloud’: Applying Nissenbaum’s Theory of Contextual Integrity”, ACM SIGCAS Computers and Society, 41(1): 38–47.
  • Hargittai, Eszter and Alice Marwick, 2016, “‘What Can I Really Do?’ Explaining the Privacy Paradox with Online Apathy”, International Journal of Communication, 10: 3737–3757 (article 21). 
  • Nissenbaum, Helen, Privacy in Context: Technology, Policy, and the Integrity of Social Life, (Palo Alto, CA: Stanford University Press, 2010).
  • Roessler, Beate and Judith DeCew, “Privacy”, The Stanford Encyclopedia of Philosophy (Winter 2023 Edition), Edward N. Zalta & Uri Nodelman (eds.), URL = <https://plato.stanford.edu/archives/win2023/entries/privacy/>.
  • Sax, Marijn, 2018, “Privacy from an Ethical Perspective”, in van der Sloot and de Groot 2018: 143–172. doi:10.1515/9789048540136-006
  • Solove, Daniel J., 2008, Understanding Privacy, Cambridge, MA: Harvard University Press.
  • van den Hoven, Jeroen, Martijn Blaauw, Wolter Pieters, and Martijn Warnier, “Privacy and Information Technology”, The Stanford Encyclopedia of Philosophy (Winter 2024 Edition), Edward N. Zalta & Uri Nodelman (eds.), URL = <https://plato.stanford.edu/archives/win2024/entries/it-privacy/>.
  • Warren, Samuel D. and Louis D. Brandeis, 1890, “The Right to Privacy”, Harvard Law Review, 4(5): 193–220.