easyDNS is pleased to sponsor Jesse Hirsh's "Future Fibre / Future Tools" segments of his new email list, Metaviews
The framework for little brothers and little sisters in a surveillance society
The problem with the Orwellian concept of Big Brother as the embodiment of centralized, state-based surveillance is that it's too simple. It makes it easy to misunderstand the nuances and complexities of the surveillance state we currently reside in.
For starters, our surveillance society is far more decentralized than Orwell anticipated. In many cases the surveillance is spontaneous and distributed, as happens during “public freak-outs” when observers instinctively take out their mobile devices to film the incident and then upload to their social media.
While there is a centralizing element, whether in the form of Facebook, the cloud, or people yelling out "World Star" as the incident happens, it is, for all practical purposes, driven by the actions of individuals and not by the apparatus of a centralized authoritarian state.
Hence instead of Big Brother, we have a surveillance society of little brothers and little sisters. Surveillance technology is accessible and proliferating.
Yet this emphasis on the grassroots elements of surveillance is also relatively simple and incomplete. Yesterday’s issue on Amazon’s new security drone provides a great example. While it is a surveillance tool for the homeowner, it also shares that surveillance with the larger digital monopoly. The two depend upon each other to engage in the surveillance.
Similarly, the Amazon drone represents the end of the home as private space, although it arrives only after a long process in which the privacy of the home has been eroded.
Which brings us to the concept of Big Mother: something that evokes how the home was subverted by surveillance, as well as a larger framework for the distributed surveillance proliferating throughout society.
Headed down to @cbcnews world to talk about Big Mother and the rise of parental surveillance with Nancy Wilson.
— Jesse Hirsh (@jessehirsh) February 24, 2009
One of the usual steps taken in producing this newsletter is a search on Twitter to see who else is talking or writing about the concept du jour. Always a pleasant surprise to find examples from the past where we’ve covered this ground.
It’s a reminder of the cyclical nature of technology (news) and how these concepts and ideas inevitably require time and repetition to take root. Although eleven years is a long time to come back to something.
Either Big Mother is only now ascending, or, more likely, has remained relatively invisible while among us, a feature and capability of this particular configuration of the surveillance state.
If we were to consider Big Mother as an extension of parental surveillance, then the children who were subject to it are only now starting to come of age. While TikTok does capture some of their perspective, perhaps they've already found other ways to evade surveillance and, as a result, our collective attention.
Alternatively, what if the concept of Big Mother is evolving or maturing, extending from the micro level of the household to the macro level of the state? A means of integrating decentralized surveillance with the power and apparatus of government?
In many respects, this is the story of the rise of algorithms, and the rise of algorithmic government.
My new story "The State Machine" is live on @Slate!
It's an exploration of benevolent (if imperfect) machine governance: "more Big Mother than Big Brother," as @divyastweets says: "attuned to the emotional and physical well-being of its citizens." https://t.co/zmY0nBS9iy
— Yudhanjaya Wijeratne (@yudhanjaya) September 26, 2020
I am thankful, then, that the world we actually live in is not defined by robots’ mastery or servitude. The sky outside is white; cold, yes, but not blackened, not scorched, simply a monsoon season shaking itself down into spring. The wind carries with it the smell of etteriya flowers; little gifts from the cell tower trees, which carry orange jasmine DNA somewhere. There is a flock of little machines tending the one closest to my door—I think last night’s storm wasn’t too kind to it—and as I pass they move aside and point me in the direction of the bagel shop. One of them, very solemnly, holds up a little white flower.
What made it do that? The State Machine, knowing that I have barely stepped out of my flat after the breakup? That delicate symbiosis between machine input and well-intentioned social campaigns, setting forth in hard code a law that people who suffer must be taken care of?
A recurring theme of this newsletter is the kind of symbiosis that is possible between humans and machines (or humans and animals). This short story offers an interesting vision of such symbiosis between human and surveillance driven algorithm.
At University we’re taught how the State Machine and the Legal Atomism movement grew out of the need for bureaucracy to regulate an almost infinite number of interactions between diverse constituents while processing an ever-expanding amount of information. Indeed, an extension of this need, a push for greater efficiency through automation. The ruling class, whatever it happened to be, had to offer enough goods and services to the ruled to keep them happy. So, in the name of maintaining that happy equilibrium: Automate enough processes, do it well enough, and you end up with systems that interact well enough with one another to replace portions of a human bureaucracy. Let the process continue for a while and you end up with the State Machine: a system performing the supreme act of rationalization.
As a kind of alternative (future) history, the narrative takes the perspective of someone who has put in the effort to explore where this “State Machine” came from and how it evolved. In doing so it also evokes what kind of world emerged in response to this automated governance system. A system that is a combination of simulation and video game that depends upon user participation to learn, and thus finds ways to incentivize user participation.
Another chapter. A new Renaissance. And that was how the nascent State Machine ended up being bundled as a decision simulator into a massive aid grant to Sri Lanka, back in the day when countries were still a thing. Partly because its economy was crumbling, and partly because someone sitting in front of a New York skyline wanted to test the system before endorsing it. And, gamelike: What better way than to try it out on a microcosm? Sri Lanka was an island, and it had a smaller population than most cities today.
The protagonist, in exploring history, finds themself in trouble:
My supervisor is furious. Violence is taken very seriously. Thursday is the disciplinary hearing.
Well, hearing is a strong word. The whole process is handled by the State Machine. Out of respect for local standards there is a human jury, but they are anonymous, reviewing only the data; there are no appeals to file, no meetings to attend, only a series of quiet interviews, five minutes each, of everyone judged to be in my social web.
And then they get a different perspective on the State Machine:
People don’t know it, but the social contract around me has changed for a day, enforced by a million smartphones, cameras, login systems, payment gateways, search engines. A mobile medic drops by, stares at my room, treats my wounds, and leaves me with a mandatory dose of painkillers and several “voluntary” doses of mild suppressants. For the first time, the real invisible hand is revealed to me; the State Machine’s many subsystems stepping firmly and politely in my way, marking new boundaries.
This description matches my own hypothesis of the rise of Big Mother:
Here: the V102 bloc, invisible until now. The statist term. There is a time in all our histories when the State Machine, until now an instrument of the state, becomes the state; these dates are marked in stone and memory. But the code tree shows the truth. The states went under long before the formalities were sealed. I can only see a few branches at a time, but at this point various State Machines are interacting with themselves, very much like the automata that they are a part of, converging at a stable pattern, abstracting universal human needs as hyperparameters, weaving their own hegemonic superstructure.
This paragraph seemed particularly profound and relevant to our world:
History is a fabrication to preserve egos and social capital. The reality is that the State Machine swept over us all, turning would-be politicians into toothless, defanged puppets in a ceremonial democracy that everyone pretends to care about while the real work happens underneath.
How did we arrive at what we presently call the State Machine? When did we go from code and academia and failing nations to the all-encompassing, all-knowing, responsive automated government that runs our cities today? The one that can simultaneously understand the changing needs of its citizens, compile the Dynamic Constitution every week, and still spare time to hand out flowers to depressed students at their doorstep?
I've quoted a lot from this short story in setting up the argument for this issue; however, I highly recommend you read the full thing. I've left out the main drama and climax so as not to spoil it.
Yet the larger and concluding argument is that such a system, a/k/a Big Mother, is not only benign but virtuous and beneficial: a sort of deity partially created by, inspired by, and in symbiosis with humans.
While I greatly enjoyed reading this short story, I was left with a sense of dread that it was not just simplistic, but potentially blind to the consequences of its narrative: the consequences of arguing in favour of such a State Machine.
Read my essay, "Under the Gaze of Big Mother," which I wrote in response to the Future Tense story, "The State Machine," by @yudhanjaya!
— S.B. Divya ~ MACHINEHOOD ~ March 2021 (@divyastweets) September 27, 2020
The link above doesn't work for me due to my aggressive ad blocking, but I do like to embed the author's tweet leading to their work so y'all can explore the author if desired. In this case I also include the following tweet/link, both because it works (for me) and because this Twitter account is a great source for this newsletter:
Under the Gaze of Big Mother
Yudhanjaya Wijeratne’s “The State Machine” shows the danger of trusting machines to be free of bias. https://t.co/NXS6XbTSIr
— Tactical Tech (@Info_Activism) September 28, 2020
Wijeratne’s fictional State Machine seems like a fairly benevolent overlord—more Big Mother than Big Brother. It’s attuned to the emotional and physical well-being of its citizens through a distributed network of real-world extensions (the emoji robots). Some of us might call them spies, but they’re only there for our best interests. The State Machine hands out flowers at first, and more strict measures when necessary, but it tries to maintain a “delicate symbiosis between machine input and well-intentioned social campaigns, setting forth in hard code a law that people who suffer must be taken care of.” It expresses neither fascist paternalism nor the blanket kindergarten rules of a nanny state. It’s maternal—nurturing but firm, like the Giving Tree, with an attempt to maintain healthy boundaries. Doesn’t sound so bad, right?
This dream crashes into reality via a basic fact: Biases and emotions are built into software systems by the fallible and illogical people who design them. We see this on a regular basis in the world of machine intelligence today. Whether the system is using supervised learning (where a human shows the machine what is right and wrong) or unsupervised learning (where the machine tries to categorize right from wrong on its own until the human says it’s done), the information provided to the machine must pass through a living gatekeeper. Sometimes that means the data fed to the machine is incomplete, as with recent problems in facial recognition. Sometimes it’s erroneous or lacks context, like the fun examples by Janelle Shane. Sometimes it’s a product of what that human expects or desires to see as a conclusion, as with racial biases reinforced in policing and lending. Artificial intelligence (much like fiction) reveals to us the inherent truths of our lives, not some greater glimpse of reality.
There's no such thing as neutral data or unbiased algorithms. The issue is not whether a system has a bias, but whose agenda it serves. When I use a tool, I want it to serve my agenda, my purposes. Similarly, in a democratic society, I want my elected representative to serve my interests.
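The mechanism Divya describes is easy to see in miniature. Below is a minimal sketch, using deliberately skewed hypothetical data (the neighbourhoods and approval counts are invented for illustration): a "model" that simply learns approval rates from historical loan records will faithfully reproduce whatever bias those records contain, because the data is its only window onto the world.

```python
# Hypothetical, deliberately skewed loan records: (neighbourhood, approved)
historical_records = [
    ("A", True), ("A", True), ("A", True), ("A", False),
    ("B", True), ("B", False), ("B", False), ("B", False),
]

def learn_approval_rates(records):
    """'Train' by measuring the historical approval rate per neighbourhood."""
    totals, approvals = {}, {}
    for group, approved in records:
        totals[group] = totals.get(group, 0) + 1
        approvals[group] = approvals.get(group, 0) + int(approved)
    return {g: approvals[g] / totals[g] for g in totals}

model = learn_approval_rates(historical_records)
print(model)  # {'A': 0.75, 'B': 0.25} -- the skew in the data becomes the "policy"
```

No step in this toy pipeline is malicious, yet the output encodes the agenda of whoever produced the records. The same holds, less visibly, for systems with millions of parameters.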
An artificial intelligence that can truly understand our behavior will be no better than us at dealing with humanity’s challenges. It’s not God in the machine. It’s just another flawed entity, doing its best with a given set of goals and circumstances. Right now we treat A.I.s like children, teaching them right from wrong. It could be that one day they’ll leapfrog us, and the children will become the parents. Most likely, our relationship with them will be as fraught as any intergenerational one. But what happens if parents never age, never grow senile, and never make room for new life? No matter how benevolent the caretaker, won’t that create a stagnant society?
The danger of convenience is the loss that it entails. Our muscles become atrophied when we fail to use them, the same is true with cognitive abilities.
— Ethan Zuckerman (@EthanZ) November 15, 2016
Automation can be empowering and helpful, but we should be careful about what we automate, and about how we depend upon or normalize the powers we seem to gain as a result.
Be wary of home surveillance. Don’t be ‘Big Mother teaching children to accept Big Brother surveillance’ pic.twitter.com/eO1iwLKRwI
— Still Learning (@DonChaney2) October 28, 2017
Children generally aspire to be autonomous: to grow up and gain independence so that they can do what they want to do.
The same is also true for people who need assistance. There’s a dignity that comes from autonomy, and our model for using automation to enable greater assistance should retain a focus on fostering independence and privacy.
What if "Big Mother" is watching you?
"Often, those who need the most support may have the least control over how and when their data is being used." pic.twitter.com/sbmBzLO467
— Azra Ismail (@azraism) October 28, 2019
Here’s another fascinating short story that explores the concept of Big Mother:
Gish Jen writes a short story on baseball, love and heartbreak in the age of surveillance: ”This wasn’t '1984'; Aunt Nettie wasn’t Big Brother. Indeed, some called her Big Mother." https://t.co/fZZLDpWLHG
— The New York Times (@nytimes) January 3, 2020
Finally, my inspiration for the Big Mother concept comes from Laurie Anderson and her Big Science album. In particular, the lyrics from "O Superman":
‘Cause when love is gone, there’s always justice.
And when justice is gone, there’s always force.
And when force is gone, there’s always Mom. Hi Mom!
So hold me, Mom, in your long arms. So hold me,
Mom, in your long arms.
In your automatic arms. Your electronic arms.
In your arms.
So hold me, Mom, in your long arms.
Your petrochemical arms. Your military arms.
In your electronic arms.
Note in the video below how Laurie evokes the Orwellian headshot of Big Brother, but in this case as Big Mother. If the State Machine used Laurie Anderson as its avatar, I might have trouble resisting its rule.
I sent today's issue to the full email list of "free" subscribers, as a bunch of you have joined that list now that Substack makes it easy to find people who follow you on Twitter. Please note I rarely post to this list, as the full Metaviews experience requires a subscription.