easyDNS is pleased to sponsor Jesse Hirsh’s “Future Fibre / Future Tools” segments of his new email list, Metaviews
You’re only as strong or smart as the people around you
We tend to use the analogies and frames of the past as a means of understanding the present. However, sometimes these outdated frames no longer fit, and they skew our perception by keeping our focus on the past rather than the present.
Is that currently the case when it comes to our privacy? Are we stuck regarding and protecting privacy through the lens of the individual, when the more powerful and relevant context is at the collective level?
Is our battle to protect our privacy as individuals irrelevant and futile? Is our collective identity responsible for the erosion of our privacy? Similarly, does it provide the means by which we might reclaim it?
The individual certainly has a role, but we take for granted just how dwarfed a single person becomes in the massive oceans of big data and algorithmic media. Instead, we should acknowledge that the secrets of algorithmic deduction are based not on getting to know us as individuals, but on recognizing the context or group dynamics that shape our individual preferences and interests.
Posted to SSRN: Anuj Puri “A Theory of Privacy” https://t.co/ktr8KhFg37
— Privacy+Security Academy (@privsecacademy) September 21, 2020
The abstract to this paper is rather dense, so I’m going to try to parse it:
In the age of Big Data Analytics and COVID-19 Apps, the conventional conception of privacy that focuses excessively on the identification of the individual is inadequate to safeguard the identity and autonomy of the individual. An individual’s autonomy can be impaired and her control over her social identity diminished, even without infringing the anonymity surrounding her personal identity. A century-old individualistic conception of privacy that was designed to safeguard a person from unwarranted social interference is incapable of protecting her autonomy and identity when she is being targeted on the basis of her interdependent social and algorithmic group affiliations.
This is the central premise: that the individual as we know it has been replaced by the networked individual, who does not exist as an island, but rather as part of a larger social fabric that influences their attention and interests.
In order to overcome these limitations, in this paper, I develop a theoretical framework in the form of a triumvirate model of group right to privacy (GRP), which is based on privacy as a social value (Pv). An individual has an interest in protecting her social identity arising out of her participation in social groups. The panoptic sorting of individuals by Big Data Analytics for behavioral targeting purposes gives rise to epistemic bubbles and echo chambers that impede the formation of an individual’s social identity.
Here’s another interesting argument: do the groups we belong to, whether deliberate or not, limit our ability to grow and evolve as individuals?
I construct the formulation of GRP1 to protect an individual’s interest in her social identity and her socially embedded autonomous self. Thereafter, I emphasize an individual’s right to informational self-determination and against algorithmic grouping in GRP2. Lastly, I highlight instances where an organized group may be entitled to privacy in its own right as GRP3.
Also worth noting is the idea that algorithms put us into groups as a means of sorting and ranking. Not only are we often unaware of these classifications, but what if said groupings influence not only how we are treated (by algorithms) but also who we are able to become?
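To make this concrete, here is a minimal, hypothetical sketch of how such algorithmic grouping can work. This is my own illustration, not anything from the paper: a clustering model sorts users into buckets based on behavioural signals, and a new individual is then profiled by the average behaviour of their assigned bucket rather than by anything actually known about them personally. All feature counts and numbers here are invented for the example.

```python
# Toy illustration (not from the paper): algorithmic grouping via clustering.
# Users are sorted into invisible "buckets", and a new person is profiled
# by their group's average behaviour, not by their own individual data.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)

# 200 users described by 5 hypothetical behavioural features
# (e.g., browsing, purchases, engagement signals).
users = rng.normal(size=(200, 5))

# Sort everyone into 4 clusters -- classifications the users never see.
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(users)

# A new user is assigned to a bucket, and the "prediction" about them
# is simply the centre (average profile) of that bucket.
new_user = rng.normal(size=(1, 5))
bucket = kmeans.predict(new_user)[0]
predicted_profile = kmeans.cluster_centers_[bucket]

print(f"Assigned to bucket {bucket}; targeted as if their preferences were:")
print(predicted_profile)
```

The point of the toy example is that the prediction never consults the new user’s actual preferences; it is inherited wholesale from the group, which is precisely why the privacy interest at stake is collective rather than individual.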
I develop a Razian formulation to state that the constant surveillance and monetization of human existence by Big Data Analytics is an infringement of individual autonomy. I highlight that the violation of GRP subjects an individual to behavioral targeting including hyper-targeted political advertising and distorts her weltanschauung.
The question of whether our worldview can be distorted remains contentious, but there’s ample reason to believe it is possible and is actively being attempted. Yet this concept of the GRP, or group right to privacy, is interesting, as it changes the dynamic in which hyper-targeted messaging takes place: both by identifying which buckets we’re being put into and by requiring greater group consent before those collectives can be targeted.
As regards the COVID-19 Apps, I assert that the extraordinary circumstances surrounding the pandemic do not provide an everlasting justification for reducing the identity of an individual to a potential disease carrier. I argue that the ambivalence regarding the existence of surveillance surrounding an individual’s social identity can leave her in a perpetual state of simulated surveillance (simveillance). I further assert that it is in the long-term best interests of the BigTech corporations to respect privacy.
There is, however, a paradox at work with regard to pandemic surveillance. It supports the author’s premise that our privacy is fundamentally collective: data about us as individuals is far less valuable than social data that locates us in relation to others. Contact tracing and exposure notification illustrate this in ways that may not have been as visible before.
In conclusion, I highlight that our privacy is not only interdependent in nature, it is existentially cumulatively interlinked. It increases in force with each successive protection. The privacy challenge posed by COVID-19 Apps has helped us realize that while limited exceptions to privacy may be carved out in grave emergencies, there is no justification for round-the-clock surveillance of an individual’s existence by Big Data Analytics. Similarly, the threat to privacy posed by Big Data Analytics has helped us realize that privacy has been wrongly focusing on the distinguishing aspects of the individual. It is our similarities that are truly worth protecting. In order to protect these similarities, I formulate the concept of mutual or companion privacy, which counter-intuitively states that in the age of Big Data Analytics we have more privacy together rather than individually.
While this may be the longest abstract I’ve read in a long time, it does summarize the arguments found in this paper rather convincingly.
It acknowledges the logic of algorithms and machine learning, which is to situate and understand our activities in the context of others. Our personal information depends upon interaction with others in order to make sense.
As with many aspects of our lives, the pandemic may have permanently altered our perception of the interdependence of our personal information.
Authorities in China, Israel, Russia, the US, and EU have either secured access to consumer cellphone location data, or they are trying to get it.
How powerful is your right to data privacy when we require a collective response to a pandemic? https://t.co/VRQwzNa9T4
— ian bremmer (@ianbremmer) March 25, 2020
Especially in light of the disproportionate impact of the illness on some communities compared to others:
Thinking about privacy as an individual right can obscure its collective protections. Sometimes, you fight for privacy to protect the usual targets of public and private persecution: marginalized communities. https://t.co/ls9gRnXQDP
— Ángel S. Díaz (@AngelSDiaz_) April 17, 2020
And that also means that the responsibility for how we got here rests on our collective shoulders:
On shame: Our collective privacy problem is not your fault. Companies rely on people not absorbing all the legal info given and use manipulative design to wheedle people into spending more time, more money, or providing more data than they intended. https://t.co/gU6jyJD3Z2
— Matthew Burpee (@MatthewBurpee) January 5, 2020
It’s unreasonable to expect individuals to read and comprehend privacy agreements, let alone the larger implications of how their data (and our collective data) is being used.
This is acknowledgment of what many of us have been saying for a long time, and this applies to Apple and Google, too. The individualized "consent" regime is not about choice, but about the defaults and obscurity of the underlying reality of our tech surveillance environment. https://t.co/4WEk4tS1f1
— zeynep tufekci (@zeynep) August 27, 2020
Data privacy is not something that can be effectively regulated at the individual level because it is something akin to air pollution, a public good that requires a collective response. That's why GDPR in Europe doesn't work. From a piece I wrote in 2018. https://t.co/OFWAcpJvkS pic.twitter.com/zPo8gHfioi
— zeynep tufekci (@zeynep) August 27, 2020
Perhaps as we move forward, and better understand or appreciate the collective nature and dynamics of our identities and information, we’ll move towards measures and frameworks that emphasize the collective good in addition to the individual.
I enjoyed talking to @martintisne about his new paper ‘The Data Delusion’ in which he makes the case for stronger laws to address the collective harms of dataprocessing, while current regulations consider individual rights primarily ↘️ https://t.co/fi2FKFkvZr
— Marietje Schaake (@MarietjeSchaake) July 14, 2020
It does strike me that the easiest example of collective privacy and information is the family, and what a family knows about each other. Mind you, this is why individual privacy often matters, as family dynamics can be toxic. However, in times of crisis, it is often the family that people turn to and trust.
Perhaps a benefit of exploring the collective or group-based dynamics of information and privacy is that it will help articulate the role of trusts or collectives as surrogates for, or replacements of, families within digital communities and identities.
Who is your digital pack or hacker collective that has your back? #metaviews