Defund Big Tech, Refund Community

easyDNS is pleased to sponsor Jesse Hirsh's "Future Fibre / Future Tools" segments of his new email list, Metaviews

Anti-Trust is not enough

I got an email last week from a long-time fellow traveler, Andrew Clement, pointing me to a new paper that he had participated in producing.

It was the result of an interesting collaboration among a group of scholars and activists.


Together they’ve assembled a really strong argument as to why anti-trust action is not enough and why society is owed a refund, both in return for public investment in technology and the systemic use of our personal information.

The inspiration for the frame they're using is the call to defund the police, combined with the larger recognition that the Internet is and should be a public utility. Here's a quote from their intro paragraph (which I am editing by breaking it into multiple paragraphs):

We are inspired by renewed calls to Defund the Police in the United States, which have reinvigorated vital debate regarding the funding of police departments, who is actually served by them, and what forms of historical injustice are perpetuated by current institutions of policing and incarceration. In the context of the abolitionist movement, to defund means to invite local and regional communities to decide how to redirect the disproportionate funds now invested in enforcement and imprisonment to support alternative, more holistic forms of well being and public safety infrastructure.

In the spirit of that movement, we adapt some of its key concepts to the domain of public/community information and communications (ICT) infrastructures, particularly those now dominated by Big Tech. Our proposal is grounded on a key premise: to redirect Big Tech's excessive revenue flow, we must transform the conditions and funding structures that enable it.

The aim is to free up resources to support a wide range of socially beneficial ends, not least community-based and community-oriented initiatives to develop digital infrastructures that better serve the public interest. While we are not calling for the demise of Big Tech, we are calling for radical reform. This includes abolition of the conditions that create and normalize Big Tech’s disproportionate reach over key ICT infrastructure, and their wide ranging negative consequences for society and the environment. We aim to retain — and expand — the many benefits that people currently derive from digital technologies, while better addressing their individual and collective needs.

As a concept this is really powerful. It reframes our relationship with technology and reconfigures both responsibility and community benefit.

The focus of today's salon will be what this means and what this could look like: answering the call, or challenge, that the authors of this paper have offered us.

By allowing corporations and governments to establish the rules regarding technology we have neglected the possibilities for expanding our own agencies, while many using these technologies are adversely affected by them. Resisting the capture of our information and communication infrastructures and redirecting resources to community-oriented and community-based initiatives becomes both more critical, and increasingly difficult, as technology is embedded more deeply, more thoroughly, and less transparently into our minds and bodies, our homes and cities, and the living environment. Now is the time to radically redirect the future of tech, by reclaiming the purposes of technology development, and redistributing the associated responsibilities and benefits, in the service of our collective and sustainable well being.

The current debate around antitrust provides an opportunity for this reframing to scale, both as a concept, and as a possibility. Precisely because antitrust is not enough, and instead we need to re-imagine what our digital world could be like.

The paper above is worth reading, and provides an interesting range of potential policy and community responses. There are a couple of other arguments that are also relevant for our discussion today.

Since the pandemic began, Apple, Alphabet, Amazon, Facebook, and Microsoft have seen their values increase by well over $1.7 trillion. Is it because these companies are offering technologies we all need or is it because they enjoy a series of monopolies that ensure greater wealth and control during a period of great uncertainty?

With so many people stuck at home, these internet-first companies were of course well-positioned to provide critical services during a pandemic. But they all got there by leveraging the labor of some of the most vulnerable populations in the world, extracting and selling the data of their customers, getting massive tax breaks, and otherwise taking advantage of huge weaknesses in our economic and political systems. With the economy and society falling apart, these massive companies—already monopolies during “normal” times—are becoming monolithic.

What, then, is to be done about these companies and their technologies which, on the one hand, facilitate unprecedented communication and address once intractable logistics challenges, but, on the other hand, contribute to widespread suffering everyday? Can we subordinate these technologies, whether they be algorithms or their data sets, to the ends of making a more fair social order? Put simply: Can we create technology that is owned by the people who use it, and whose main purpose is to help humanity rather than extract wealth for a small class of individuals?

Alternatives provide one scenario. Another, as we continue to discuss, is regulation.

We argue that Facebook and Google should be regulated as public utilities. Private powers who shape the fundamental terms of citizens’ common life should be held accountable to the public good. Online as well as offline, the infrastructure of the public sphere is a critical tool for communication and organization, political expression, and collective decisionmaking. By controlling how this infrastructure is designed and operated, Facebook and Google shape the content and character of our digital public sphere, concentrating not just economic power, but social and political power too. Leading American politicians from both sides of the aisle have begun to recognize this, whether Senator Elizabeth Warren or Representative David Cicilline, Senator Lindsey Graham or President Donald Trump.

Regulating Facebook and Google as public utilities would be a decisive assertion of public power that would strengthen and energize democracy. The public utility concept offers a dynamic and flexible set of regulatory tools to impose public oversight where corporations are affected by a public interest. We show how regulating Facebook and Google as public utilities would offer opportunities for regulatory innovation, experimenting with new mechanisms of decisionmaking that draw on the collective judgement of citizens, reforming sclerotic institutions of representation, and constructing new regulatory authorities to inform the governance of algorithms. Platform regulation is an opportunity to forge democratic unity by experimenting with different ways of asserting public power.

Founder and CEO Mark Zuckerberg famously quipped that “in a lot of ways Facebook is more like a government than a traditional company.” It is time we took this idea seriously. Internet platforms have understood for some time that their algorithmic infrastructure concentrates not only economic power, but social and political power too. The aim of regulating internet platforms as public utilities is to strengthen and energize democracy by reviving one of the most potent ideas of the United States’ founding: democracy requires diverse citizens to act with unity, and that, in turn, requires institutions that assert public control over private power. It is time we apply that idea to the governance of Facebook and Google.

Join us today at 1pm to explore this concept, what it means, and how it might scale.

For more background, here's a segment and chat we had on the subject last week:
