easyDNS is pleased to sponsor Jesse Hirsh‘s “Future Fibre / Future Tools” segments of his new email list, Metaviews.
Fostering a vibrant information ecosystem
In a crisis, our collective commitment to free speech can waver, and we become more likely to desire the suppression or silencing of views we disagree with or deem unacceptable.
I think this is dangerous, and that it threatens our informational immunity by undermining the health and diversity of our information ecosystem. Our critical thinking skills are like any other muscle: use them or lose them. If we’re not exposed to dissenting perspectives, disagreeable viewpoints, or even disinformation, we allow our critical capacities to atrophy, and our cognition becomes complacent or lazy.
Rather, a democratic society depends upon diverse perspectives, and we should regard all speech as a symptom of deeper causes: data we can use to better understand what is going on and how people are doing.
Similarly, laws that govern speech have evolved slowly and carefully. Hate speech laws, for example, are reasonable and relatively limited, setting clear boundaries on what is acceptable and what is not. In particular, advocating violence against people, whether because of their identity or for any other reason, should not be permitted; yet even such speech provides a warning sign of larger currents of hatred or discontent that should be addressed.
However, the issue of harm and speech is complicated. One of the classic examples is whether someone should be allowed to falsely scream “fire” in a crowded theatre, or to exclaim that someone has a bomb while in an airport or on a plane. Doing so puts other people in harm’s way and causes unnecessary panic and chaos, while also distracting emergency response personnel from more important tasks.
Yet this link between harm and speech can also be a slippery slope. In the current pandemic it may be abused as an excuse to suppress or silence speech that should otherwise be expressed, if only to be criticized and used as a learning opportunity.
In general, the principle of free speech specifically involves protecting speech that you might otherwise find offensive or disagreeable. It’s about recognizing that your political opponents, people you dislike, and people you think are foolish have the same right to say what they wish as you do. You don’t have to listen to them, trust them, or believe them. You have the right to make up your own mind.
It’s also worth noting that the concept of free speech largely resides within the relationship between a citizen and the state. Our right to free speech is defined in relation to a government’s ability to silence or suppress that speech.
However, free speech does not necessarily exist within the context of, say, a family or a private enterprise. Your parents probably didn’t permit you to say whatever you wanted, and you may choose your words carefully when speaking with your children. Similarly, you can’t speak freely to your employer, your employees, or your clients and customers.
This is precisely why speech on private social media platforms remains so problematic. We culturally conceive of these platforms as semi-public spaces, the contemporary political equivalent of the town square, which was a public space. However, they are actually private property, and the owners or operators of that property can choose to govern speech on their platforms as they see fit. There is no right to free speech on social media.
Before the current pandemic subsumed us, content moderation policies on platforms had become politicized, and most of the companies leaned towards a relatively hands-off approach which, while arbitrary, allowed most content to remain, with some exceptions. One might argue that this approach was so hands-off that it fostered a culture of vigilantism, in which people attempted to police content themselves by mobilizing outrage and demanding that posts be deleted and accounts cancelled.
Now that we’re in the thick of it, platforms are no longer taking a hands-off approach, and are instead moving more aggressively to remove content deemed unacceptable. In particular this applies to content about the pandemic and the science surrounding it. Anything outside of the poorly defined and frequently fluctuating norm can and will be removed. Unless, of course, you’re the POTUS, in which case the rules no longer apply.
The social companies have removed some, but let many others stand. The result is a community which is emboldened to make claims about unproven cures and treatments. Poison control centers are reporting more cases, and people have caused themselves serious harm.
— Sheera Frenkel (@sheeraf) April 30, 2020
The rationale for this more aggressive approach is that it is all done in the interest of preventing harm and protecting public health measures. However, as someone who does believe in free speech, I think it is a mistake, not just because it is ineffective, but also because it sets a dangerous precedent.
On the one hand, there’s the Streisand effect, which suggests that if you try to suppress or censor content on the Internet, you only make that content more popular (and appealing). On the other hand, there’s a real need to engage the public in widespread public health education, and suppressing unpopular or even stupid sentiment denies people the opportunity to learn why that content may be wrong.
Similarly, the process and rationale for removing content are opaque, and reflect an automated justice system that presumes guilt and requires the accused to prove their innocence. That’s a terrible precedent, and a dangerous model of justice that we should not allow these large companies to establish or practice.
Disinformation and conspiracy theories pose a significant threat to democratic institutions and methods; however, censoring or suppressing such activity reinforces why they are a threat, and legitimizes their appeal (and arguments).
We’re not going to get through this crisis via authoritarianism and brute force. Rather, the only path forward is via public buy-in and public health education: helping people understand the science, providing paths by which they can choose to participate in collective preventative measures, and providing means by which they can feed, house, and entertain themselves.
While experts have a role to play in this process, and in the larger debate, they should not be in a position to control it, let alone decide which speech is permitted and which is not. In his latest fantastic article, Ed Yong touches upon the role of experts in our current information ecosystem:
“The coronavirus not only co-opts our cells, but exploits our cognitive biases. Humans construct stories to wrangle meaning from uncertainty and purpose from chaos. We crave simple narratives, but the pandemic offers none,” @edyong209 writes. https://t.co/SAmJUW5SVE
— The Atlantic (@TheAtlantic) April 29, 2020
Bergstrom agrees that experts shouldn’t be dismissive gatekeepers. “There’s a lot of talent out there, and we need all hands on deck,” he says. For example, David Yu, a hockey analyst, created a tool that shows how predictions from the most influential COVID-19 model in the U.S. have changed over time. “Looking at that thing for, like, an hour helped me see things I hadn’t seen for three weeks,” Bergstrom says.
A lack of expertise becomes problematic when it’s combined with extreme overconfidence, and with society’s tendency to reward projected confidence over humility. “When scientists offer caveats instead of absolutes,” Gralinski says, “that uncertainty we’re trained to acknowledge makes it sound like no one knows what’s going on, and creates opportunities for people who present as skeptics.” Science itself isn’t free from that dynamic, either. Through flawed mechanisms like the Nobel Prize, the scientific world elevates individuals for work that is usually done by teams, and perpetuates the myth of the lone genius. Through attention, the media reward voices that are outspoken but not necessarily correct. Those voices are disproportionately male.
The idea that there are no experts is overly glib. The issue is that modern expertise tends to be deep, but narrow. Even within epidemiology, someone who studies infectious diseases knows more about epidemics than, say, someone who studies nutrition. But pandemics demand both depth and breadth of expertise. To work out if widespread testing is crucial for controlling the pandemic, listen to public-health experts; to work out if widespread testing is possible, listen to supply-chain experts. To determine if antibody tests can tell people if they’re immune to the coronavirus, listen to immunologists; to determine if such testing is actually a good idea, listen to ethicists, anthropologists, and historians of science. No one knows it all, and those who claim to should not be trusted.
In a pandemic, the strongest attractor of trust shouldn’t be confidence, but the recognition of one’s limits, the tendency to point at expertise beyond one’s own, and the willingness to work as part of a whole. “One signature a lot of these armchair epidemiologists have is a grand solution to everything,” Bergstrom says. “Usually we only see that coming from enormous research teams from the best schools, or someone’s basement.”
This is great advice, not just for us as individuals attempting to sort through an abundance of (potentially false) information, but for a society that seems reluctant to fully embrace the nuance of public debate. Not only is there no monopoly on expertise, but a wide range of experts is necessary to help us all understand the big picture.
The danger emerges alongside the lack of humility. Yet I see this lack of humility among the zealots who seek to suppress or silence speech. While we should be wary of grand solutions, we should also be wary of attempts to silence disinformation rather than debunk it.
The problem with social media platforms is not that they enable a diversity of perspectives, but rather that they enable the amplification of nonsense. The problem is not people’s ability to post uninformed or misleading arguments, but the platform’s propensity to blindly amplify and spread them.
Rather than trying to silence speech, we should look at how social media platforms work, and pursue governance models that foster platform responsibility without punishing individual users.
Digital monopolies continue to build empires using automated tools rather than responsible ones. They do so via the activity and content created by users, yet they offer those users no protections, only a justice system that presumes guilt rather than innocence.
Free speech has never been easy, and in a pandemic, we should embrace it, rather than fear it. What do you think? What do you have to say?
For more on this subject, check out the podcast that Mark Jeftovic and I are part of: