Can AI prevent future pandemics?

easyDNS is pleased to sponsor Jesse Hirsh's "Future Fibre / Future Tools" segments of his new email list, Metaviews

Or will TikTok cults make pandemics irrelevant?

I'm physically of the belief that the previous frame by which we understood our world is gone, and we're currently in the process of constructing new frames. Plural in the sense that this crisis may mark a split where we no longer think of society, but rather societies.

For me this is largely a physical sensation, as it is something I’m feeling, more than thinking. I imagine the latter will happen, although that may take some effort, time, and reflection.

The concept of frames appeals to me, as it is a powerful way to understand how people see the world, or more accurately how they fit the world into constraints that make it easy for them to understand what it is they wish to know.

As a wearer of spectacles, this metaphor appeals to me, as without corrective lenses, I wouldn’t be able to see much. This physical relationship I have with the world gives me reason to translate that into a cultural understanding, that everyone adopts corrective lenses, of a sort, that allow them to see things that would otherwise remain blurry or indecipherable.

Artificial Intelligence, as a vague and radically inclusive concept, is an example of a frame that was ascendant before this pandemic, and is currently poised to become dominant. As a kind of corrective lens, AI offers a frame that promises certainty, and teases us with a new kind of order to be crafted from the chaos of big data.

The more data we have, the foggier our vision, and the more we find ourselves confused. Yet empowered with the frame of AI, we have the hope to make sense of our world, or as the dream goes, make our world as we wish.

Could AI have prevented this pandemic? Can it prevent future ones?

Dr. Kamran Khan set out to make a “smoke alarm” that would detect disease outbreaks around the world when he created his pandemic-predicting software BlueDot.

Khan and his team of about 50 experts used big data and artificial intelligence to warn the world of a potentially serious viral outbreak three days before the World Health Organization, though they picked up on the signs even earlier.

Waiting for outbreaks to be declared typically takes too long, the University of Toronto professor of medicine and public health says, and the information often takes a long time to make it into the hands of the medical community and the public.

The world is changing, he says, and diseases are emerging with greater frequency and having bigger impacts.

Big data and artificial intelligence can provide a bird’s-eye view of diseases around the globe in real time, letting people move faster to quash new outbreaks.

It’s time we start using them, for the second wave of the COVID-19 pandemic and beyond, Khan says.

We published an issue about BlueDot in the context of infodemiology and disinformation on January 30th. It’s weird reading that issue now, as it feels like a world away. It reminds me that knowing a crisis is on its way does not necessarily change the way that crisis plays out.

Is this not the default stance of pundits and experts? That everything would have been different if only we had listened to them? That the cure to all that ails us is faith in the few people or smart machines that claim to have all the answers?

BlueDot does deserve credit for calling this outbreak before anyone else; however, what impact did that really have? What impact will their prediction of a second wave have? Does it really matter what an educated or technocratic elite think, if the rest of the population, or enough of the population, thinks otherwise?

One of the key insights or lessons that I’ve learned from this crisis is that smart machines are useless and ineffective on their own.

In a world of everyone for themselves, smart machines offer competitive advantages. However, in a world where we're all in this together, smart commons are what matter. When we're all in this together, the base level of knowledge or intelligence is what influences how we fare and what we choose to do.

Could this crisis help us appreciate the role of commons, but also the importance and value of the knowledge commons? Could it shift our perception of social media from being superficial, to actually being essential to our collective central nervous system?

The Covid-19 pandemic has spurred what's often called an infodemic, or the alarming spread of harmful information online. But this narrative misses the ways in which the Covid-19 crisis is simultaneously driving groups and individuals to collaboratively generate huge amounts of useful, public knowledge.

In a race to create and share resources to weather the pandemic’s challenges, communities have ushered in a golden age of a little-known economic concept: the knowledge commons. Popularized by political economist Elinor Ostrom and researcher Charlotte Hess, the term refers to an accessible repository of knowledge, usually focused on specific topics, that is collectively owned and governed by a community for mutual gain. Many knowledge commons, such as Wikipedia, GitHub, and SSRN, existed before the pandemic. But in the last ten weeks, the creation and use of knowledge commons has exploded.

Neighborhoods are creating Slack groups and communities are coming up with mutual aid spreadsheets to coordinate aid and support each other. Neighbors share information about nearby resources like food pantries, stores or restaurants still open, strategies for negotiating or deferring rent, and help to match individuals in need to individuals that can assist with errands or donations.

What might be most transformative about this shift is the participatory nature of the knowledge commons. We had been clinging to the passive relationship that legacy media fosters with their audience, and this pandemic has incentivized many to become more active. I'm cautiously optimistic that this cultural shift will have a lasting effect.

The pandemic response has shown the benefits of a commons-based model of knowledge production and sharing. It has demonstrated the transformative force of free knowledge and collaboration for the common good, as well as its power in enabling people to come together and help one another. Most importantly, it's established a precedent that this level of community self-organization, and this volume of sharing, even among commercial entities, is possible. The bar has been set—now the question is how we can ensure that it stays.

As a society, this is both an important precedent and a valuable development, one that has considerable momentum.

However, what if, as I alluded to earlier, we're no longer a society? Instead we've split into societies that, while potentially resembling each other, are increasingly distinct. I see this as the evolution of filter bubbles or echo chambers, which were merely dress rehearsals for the larger mission of building distinct societies.

A glimpse of what I mean by this can be found in the emerging phenomenon of TikTok cults: social formations that combine active participation, self-organizing communities, and the larger desire for purpose and belonging.

Cults on TikTok aren’t the ideological ones most people are familiar with. Instead, they are open fandoms revolving around a single creator. Much like the “stans” of pop figures and franchises, members of TikTok cults stream songs, buy merch, create news update accounts and fervently defend their leaders in the comment sections of posts. The biggest difference is that TikTok’s cult leaders are not independently famous. They’re upstart creators building a fan base on social media.

Ms. Ong represents a relatively new kind of influencer, one who has seized a time of great isolation and idleness to capture the interest of a rapt user base.

“I made this video where I was speaking into my phone camera like, ‘Hey guys I think we should start a religion,’” she said in a phone interview on Friday. “Then, I was like, ‘Let’s start a cult.’”

TikTok users have been forming cults (of personality) and armies (the nonviolent kind) for months now, borrowing tactics from comment raid groups on other platforms. The Dum Dum gang, for instance, gained a following last year by taking over the comment sections of public figures like Barack Obama and Mark Zuckerberg.

In January, thousands of teenagers on TikTok created a Lego Star Wars army. In April, users took sides in a purple vs. green alien “gang war.” And on May 8, the Step Chickens were born.

The name comes from a video series Ms. Ong shared on TikTok called “CornHub,” in which she parodies pornographic tropes including one where a stepbrother seduces a stepsister. Ms. Ong reenacted the plot wearing a chicken suit; the video racked up 1.1 million views.

As Ms. Ong began amassing followers, she implored them to change their profile pictures to her blue selfie. Their first mission was to raid the comments of Phil Swift, the creator of the widely memed home-repair product Flex Tape. Hundreds of her fans began commenting on Mr. Swift’s videos; within 48 hours, he changed his avatar to Ms. Ong’s face.

While I'm no longer browsing TikTok daily, I am still fascinated by the platform and its resident cultures. However, before reading this article, I had no clue that this activity was taking place. I had seen the Lego Star Wars army stuff, and even had someone insist I change my avatar, but it never occurred to me that it was part of a larger narrative.

This speaks to how subcultures can remain invisible to non-members until said cultures rise to the point of dominance.

This story about the Step Chickens is fascinating, and the following post delves into it in a bit more detail:

If an internet celebrity or content creator doesn't get paid per view, then "fans" aren't worth so much. Revenue sharing platforms like YouTube and Twitch reward influencers for having legions of casual followers who rack up ad impressions. But you need a different strategy on platforms like TikTok, Twitter, Instagram, and Snapchat that don't share ad revenue, where calls to action are limited, and most people just want to like, maybe share, and keep on scrolling. If a creator wants to make sponsored content, they could monetize their reach, but that requires business savvy, brand safety, and creative compromise.

That’s why influencers don’t just want fans. They want a cult. They want loyalists willing to do as they command, withstanding the friction of leaving their favorite feed to take actions that benefit their glorious overlords. While the term ‘cult’ might be a bit insensitive, it appropriately describes the obsessive devotion that communities give to their charismatic leaders. In exchange, they dispense a sense of belonging. Yes, TikTok is keeping kids out of gangs by getting them to join cults.

All of this is a natural extension of the extremist bias that has haunted social media over the last decade.

The logic of the platform was to keep users engaged for as long as possible. This was not an ideological desire per se, but an economic one. The more time on the platform, the more opportunity for revenues via ads. However because there was no ideology or framework for this pursuit, it was left to the algorithms to accomplish. These automated guides evolved to emphasize extremist content as a way of keeping people engaged.

Eventually researchers were able to demonstrate that this kind of automated recommendation leads to extremism; however, that didn't end up stopping the process. Now we're seeing it evolve further into the business model of influencers.

We’re going to see more creators try to prod their followings along this path from fan to cultist:

Consuming – Serendipitously discovering a creator’s content

Following – Subscribing to a creator across channels

Sharing – Re-distributing a creator’s content consistently

Creating – Amplifying a creator via remixes and references

Affiliating – Joining communities of a creator’s allies

Buying – Spending to support a creator or prove allegiance

Transforming – Redefining one’s identity around a creator

A recurring joke I like to make is that the politicians of the future are YouTubers today. However, it's clear that's now outdated. The dictators of the future are TikTok cult leaders today.

What TikTok cults do offer is a new frame: a way of seeing the world as part of an emerging society that is not tied to the old regime. We should not be quick to dismiss these frames, or the societies that form around them. They offer clues as to how our world is changing, and what kind of regimes we should anticipate as a response.
