Assessing the spread of Covid surveillance

easyDNS is pleased to sponsor Jesse Hirsh’s “Future Fibre / Future Tools” segments of his new email list, Metaviews

What is the weight and responsibility of this data?

We had a fantastic salon yesterday, thanks to those of you who were able to attend and participate. Next Salon is Tuesday November 24th at 10am Eastern with animal pre-show at 9:30am.

If we are to accept that all data is health data, then we also ought to entertain the possibility that data transforms us, at the very least in the form of feedback.

The more data we have about our body and our health, the greater the likelihood our behaviour will change, hopefully as a means of improving our health, but that’s not always the case.

Extend this further, and all data impacts our health, as all data can be health data, and in some form feeds back to us and alters our behaviour.

In a pandemic this dynamic is pronounced and obvious. I’m avoiding confined or even indoor spaces, with few exceptions that I engage strategically and with rapid precision. I do this in response to my understanding of available medical research and the risks I face.

However, whether something is obvious or not depends upon context. In our salon this week Neil Andersen raised the issue of increased sedentary behaviour, and the toll it’s taking on people, like teachers, who have to spend a lot of their working day in online meetings/classes. It reminded me I need to stand up and walk around/go outside more frequently.

There’s a certain responsibility that comes with knowledge or feedback. Once you feel the pain you want to do something about it. Once a problem is diagnosed there is a desire to resolve it.

Early on in this pandemic we discussed our dangerous obsession with data, and how it would drive both the comprehension and collective response to this crisis. While this obsession with data has certainly encouraged a dramatic expansion of surveillance practices (and methods) it has not resulted in the successful containment of the virus or the necessary shift in public behaviour and attitudes.

Knowing is not enough. Nor does surveillance result in control.

However knowing does have an impact, and surveillance does encourage a desire to control things that were not previously perceived as controllable. Even if such efforts remain futile and out of reach.

In today’s issue, let’s do a brief scan of some of the pandemic-induced surveillance trends. Specifically, I want to anticipate what the response will be to the data that is being collected post-pandemic, assuming such a period will arrive. Not so much a privacy or data protection response, which is relevant, but a “now that we have it, what are we going to do with it” response that indicates new agency and capability.

Certainly that’s the guise of the current new wave of applications, which justify increased surveillance via a range of rationales, from health to security to the privilege of attending a concert.

For some, like Ticketmaster and the broader event industry, it is an existential issue. They’ve got to find a way to restart their businesses. Yet scale that up across the economy and there’s a strong impetus and motive to use surveillance as a means of managing risk within the pandemic.

In Rochester, Mich., Oakland University is preparing to hand out wearable devices to students that log skin temperature once a minute — or more than 1,400 times per day — in the hopes of pinpointing early signs of the coronavirus.

In Plano, Texas, employees at the headquarters of Rent-A-Center recently started wearing proximity detectors that log their close contacts with one another and can be used to alert them to possible virus exposure.

And in Knoxville, students on the University of Tennessee football team tuck proximity trackers under their shoulder pads during games — allowing the team’s medical director to trace which players may have spent more than 15 minutes near a teammate or an opposing player.

The powerful new surveillance systems, wearable devices that continuously monitor users, are the latest high-tech gadgets to emerge in the battle to hinder the coronavirus. Some sports leagues, factories and nursing homes have already deployed them. Resorts are rushing to adopt them. A few schools are preparing to try them. And the conference industry is eyeing them as a potential tool to help reopen convention centers.
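The 15-minute exposure rule described in the excerpt above amounts to a simple aggregation over a proximity log. Here is a minimal sketch of that idea; the log format, the one-minute ping interval, and the `close_contacts` helper are all hypothetical illustrations, not any vendor’s actual implementation.

```python
from collections import defaultdict

# Hypothetical log: each entry is one ping recording two device IDs seen
# near each other. Assume the wearable pings once per minute.
PING_SECONDS = 60
THRESHOLD_SECONDS = 15 * 60  # the 15-minute exposure threshold

def close_contacts(pings):
    """Sum proximity time per pair of devices and flag pairs over the threshold."""
    totals = defaultdict(int)
    for a, b in pings:
        pair = tuple(sorted((a, b)))  # normalize so (a, b) == (b, a)
        totals[pair] += PING_SECONDS
    return {pair for pair, t in totals.items() if t >= THRESHOLD_SECONDS}

# 16 one-minute pings between players 7 and 23 crosses the threshold;
# 5 pings between players 7 and 11 does not.
pings = [("p7", "p23")] * 16 + [("p11", "p7")] * 5
print(close_contacts(pings))  # {('p23', 'p7')}
```

Even this toy version makes the concern concrete: the same log that answers “who was exposed?” also reconstructs who spent time with whom.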

On the one hand, I can sympathize with the desire to manage risk and find a means by which activities and operations can resume. On the other hand, these are not superficial measures; they represent seismic shifts in the surveillance regimes we’ve previously tolerated and arguably failed to regulate.

Civil rights and privacy experts warn that the spread of such wearable continuous-monitoring devices could lead to new forms of surveillance that outlast the pandemic — ushering into the real world the same kind of extensive tracking that companies like Facebook and Google have instituted online. They also caution that some wearable sensors could enable employers, colleges or law enforcement agencies to reconstruct people’s locations or social networks, chilling their ability to meet and speak freely. And they say these data-mining risks could disproportionately affect certain workers or students, like undocumented immigrants or political activists.

“It’s chilling that these invasive and unproven devices could become a condition for keeping our jobs, attending school or taking part in public life,” said Albert Fox Cahn, executive director of the Surveillance Technology Oversight Project, a nonprofit in Manhattan. “Even worse, there’s nothing to stop police or ICE from requiring schools and employers to hand over this data.”

In particular, I think we underestimate that the real applications of this data have yet to be discovered, and will probably be far more profound than we can currently imagine. The potential for mental health diagnosis is but one area that is already problematic, and with greater data, possibly predatory.

Ideally, data protection and privacy frameworks would be in place that allowed data to be collected for specific purposes and not reused for others. However we don’t live in that world, at least in practical terms, and people have reason to be concerned with how their personal information can and will be used against them.

Especially given the super spy device they already have in their pocket. No surprise that the smartphone has been a massive focal point for pandemic-related (and unrelated) surveillance.

When the notion of enlisting smartphones to help fight the Covid-19 pandemic first surfaced last spring, it sparked a months-long debate: Should apps collect location data, which could help with contact tracing but potentially reveal sensitive information? Or should they take a more limited approach, only measuring Bluetooth-based proximity to other phones? Now, a broad survey of hundreds of Covid-related apps reveals that the answer is all of the above. And that’s made the Covid app ecosystem a kind of wild, sprawling landscape, full of potential privacy pitfalls.

Late last month Jonathan Albright, director of the Digital Forensics Initiative at the Tow Center for Digital Journalism, released the results of his analysis of 493 Covid-related iOS apps across dozens of countries. His study of those apps, which tackle everything from symptom-tracking to telehealth consultations to contact tracing, catalogs the data permissions each one requests. At WIRED’s request, Albright then broke down the data set further to focus specifically on the 359 apps that handle contact tracing, exposure notification, screening, reporting, workplace monitoring, and Covid information from public health authorities around the globe.

The results show that only 47 of that subset of 359 apps use Google and Apple’s more privacy-friendly exposure-notification system, which restricts apps to only Bluetooth data collection. More than six out of seven Covid-focused iOS apps worldwide are free to request whatever privacy permissions they want, with 59 percent asking for a user’s location when in use and 43 percent tracking location at all times. Albright found that 44 percent of Covid apps on iOS asked for access to the phone’s camera, 22 percent of apps asked for access to the user’s microphone, 32 percent asked for access to their photos, and 11 percent asked for access to their contacts.
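The fractions in the excerpt above are easy to sanity-check from the raw counts. A quick back-of-the-envelope in Python, using only the numbers quoted:

```python
total = 359  # Covid-related iOS apps in Albright's focused subset
gaen = 47    # apps using Google and Apple's Bluetooth-only exposure-notification system
non_gaen = total - gaen

# 312 of 359 apps sit outside the privacy-friendly framework: about 86.9%,
# which is indeed "more than six out of seven" (6/7 is roughly 85.7%).
print(f"Outside the framework: {non_gaen} ({non_gaen / total:.1%})")
print(f"Using the framework:   {gaen} ({gaen / total:.1%})")
```

The arithmetic holds up, which is worth noting given how often reported percentages drift from their underlying counts.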

“It’s hard to justify why a lot of these apps would need your constant location, your microphone, your photo library,” Albright says. He warns that even for Covid-tracking apps built by universities or government agencies—often at the local level—that introduces the risk that private data, sometimes linked with health information, could end up out of users’ control. “We have a bunch of different, smaller public entities that are more or less developing their own apps, sometimes with third parties. And we don’t know where the data’s going.”

Here are the ongoing resources that Jonathan is maintaining:

Although it’s not just smartphones. In environments like universities, where certain conditions can be imposed upon subjects, surveillance is expanding to saliva:

Scale that up to a larger community and wastewater surveillance has been helpful. This thread has some relevant detail.

Although in both saliva and wastewater there is a slippery slope in that both offer tremendous surveillance and data opportunities to scan for other things, such as narcotics or other infectious diseases.

We’ve previously discussed the potential for ventilation systems to play a role in managing the pandemic, apparently that also includes a potential for surveillance:

Testing what’s in the air generally seems like a good thing to do overall, but that may be a result of my inability to think of an obviously sinister application.

The larger argument here is that increased surveillance is a misguided means of either asserting control or translating how we used to do things in the pre-pandemic world to the world we now find ourselves in.

This is true in education.

As well as in the workplace.

Creating an overall inflationary effect that normalizes and accelerates the surveillance state.

The changes wrought by Covid-19 risk increasing complacency among policymakers about using controversial surveillance technologies.

But tools implemented in the emergency context of the pandemic should not automatically cross over from public health purposes to policing, national security or political applications, as reportedly happened in Minnesota where authorities used contact tracing applications to track Black Lives Matter protesters.

Arguments that ever more intrusive forms of surveillance are necessary or inevitable even in democracies serve a range of powerful agendas with fundamentally anti-democratic effects.

The proliferation of these technologies risks entrenching dangerous power imbalances all the way up from the private, domestic sphere through the relationship between national governments and their citizens, to international divisions between authoritarian and democratic states.

While it may be easy, given our critical distance, to see the surveillance state in China:

Similar links within the military-industrial surveillance complex exist within the western or Anglo world:

The Twitter thread linked to above is thorough, if not (appropriately) paranoid. Out of scope for today’s issue, but relevant given which companies have the capacity for society-scale surveillance and analytics.

Although before we end, let us note that sometimes human-based surveillance can be replaced by canid surveillance.

However as a dog lover I’m a bit conflicted knowing that dogs can become infected with Covid and while having them sniff out potential infection seems appealing, at what cost? 🙂

Finally here’s a podcast produced by a reputable outfit that borders on such poor production quality that it encourages me to make more media in spite of my consistently craptacular connection.
