When your boss is an algorithm

easyDNS is pleased to sponsor Jesse Hirsh’s “Future Fibre / Future Tools” segments of his new email list, Metaviews

The ongoing battle for algorithmic transparency

 
We live in a black box society where many of the decisions about us are made by algorithms and machine learning models that are opaque, indecipherable, and unaccountable. This poses an existential threat to democracy, fairness, and arguably our ability to make a decent living.

The struggle to understand how these algorithms work, and to make them accountable, has been an uphill battle, as calls for algorithmic transparency have not gained much traction. Part of the resistance has come from technology companies who want to defend their intellectual property and falsely argue that transparency is not technically possible.

While algorithmic transparency has largely remained a concept in public policy circles, the courts are now being asked to wade in. Uber workers in Europe are demanding the right to understand how their automated managers operate, and what they need to know to do their jobs effectively.

A union acting on behalf of Uber drivers in the UK is suing the company in the Netherlands, Uber’s European headquarters, to clarify how it uses data to match riders with drivers.

The App Drivers and Couriers Union (ADCU) claims Uber monitors the performance of drivers by tagging their profiles with information about late arrivals, cancellations, and complaints about attitude and inappropriate behaviour from customers.

Despite multiple requests, the suit says, the company has provided drivers with little to no access to their personal data.

Considerable research has been done on the way in which Uber’s algorithm is designed to control and manipulate drivers.

There are nearly a million active Uber drivers in the United States and Canada, and none of them have human supervisors. It’s better than having a real boss, one driver in the Boston area told me, “except when something goes wrong.”

When something does go wrong, Uber drivers can’t tell the boss or a co-worker. They can call or write to “community support,” but the results can be enraging. Cecily McCall, an African-American driver from Pompano Beach, Fla., told me that a passenger once called her “dumb” and “stupid,” using a racial epithet, so she ended the trip early. She wrote to a support rep to explain why and got what seemed like a robotic response: “We’re sorry to hear about this. We appreciate you taking the time to contact us and share details.”

The rep offered not to match her with that same passenger again. Disgusted, Ms. McCall wrote back, “So that means the next person that picks him up he will do the same while the driver gets deactivated” — fired by the algorithm — because of a low rating or complaint from an angry passenger. “Welcome to America.”

On a basic level, the algorithm controls the driver by directing the route they take while driving, but questions remain as to how deep this influence goes.

This is one of the problems with a black box system: we have no credible idea how it works or what outcomes it is designed to engineer.

The opportunity in this lawsuit is to use provisions in the EU’s GDPR to compel the company to disclose how its algorithm works, since it is based on EU citizen data. The GDPR also includes a “right to explanation,” which gives citizens the right to an explanation when an algorithm makes a decision about them.

Central to the case is the argument, made by the drivers and the UK App Drivers and Couriers Union, that the GDPR entitles them to see the troves of data Uber extracts and accumulates on them, as well as how that data is fed into Uber’s system of algorithmic overseers, which manages and controls drivers more tightly than human bosses could ever hope to.

Under the GDPR, individuals have a right to access personal data held by any company. In this case, drivers argue that Uber’s refusal to share data on time spent logged in obscures their “dead mileage”: miles driven between trips without pay. Such data would be essential to calculating an accurate hourly wage, but would likely reveal how low Uber’s actual pay is. Combined with other data, such as trip ratings, drivers might also get a better idea of why they were deactivated from the platform. Uber has for years come under fire for deactivating drivers for seemingly arbitrary reasons and without due process, because drivers are contractors, not employees.
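To see why dead mileage matters, here is a minimal sketch of the wage arithmetic. All figures are invented for illustration; none come from Uber or the lawsuit:

```python
# Hypothetical illustration of why "dead mileage" data matters for wage math.
# All dollar and hour figures below are invented for the example.

def effective_hourly_wage(trip_earnings, paid_hours, dead_hours):
    """Earnings divided by total time logged in, not just time on trips."""
    total_hours = paid_hours + dead_hours
    return trip_earnings / total_hours

# A driver earns $90 across 4 hours of paid trips...
headline_rate = effective_hourly_wage(90.0, 4.0, 0.0)  # looks like $22.50/hr
# ...but was logged in for 2 more unpaid hours between trips.
actual_rate = effective_hourly_wage(90.0, 4.0, 2.0)    # really $15.00/hr

print(f"headline: ${headline_rate:.2f}/hr, actual: ${actual_rate:.2f}/hr")
```

Without access to logged-in time, only the headline figure is computable, which is precisely the drivers' complaint.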

The case is being heard in Amsterdam which, ironically enough, was once the crown jewel of Uber’s “Double Dutch” tax evasion strategy that created Dutch entities with no employees to transfer revenues to tax havens. Recent regulatory crackdowns have pushed Uber to end Double Dutch.

Uber may be a trail blazer when it comes to algorithmic management, but they’re certainly not alone. Workers deserve to understand the conditions of their work, and the expectations and operations of their management.

The relevance of this particular dispute is reinforced by the changing nature of work during the pandemic. Workplace surveillance has extended into the home, following workers as they perform their duties remotely.

The sheer level of surveillance that has arisen from this pandemic-induced shift to virtual environments means that algorithms will be required to make sense of all that data and monitoring.

This may not immediately translate into algorithmic management, but it does augment human managers with machine learning tools that ought to be transparent (if they’re permitted at all).

After all, it is important to remember what the gig economy or sharing economy is really about, namely the disruption of regulators and the exploitation of labour:

The EU decision, the UK Supreme Court decision, and the US lawsuits all have the potential to undermine a key plank of Uber’s business model: exploitation. Once you strip away the ability of an Uber or Lyft to misclassify drivers or shield data from public eyes, you start to see it for what it is. App-based ride-hailing companies are glorified taxi companies that exploited venture capitalists to gain billions more in funding than their competitors, exploited drivers and consumers with subsidies and psychology tricks to achieve mass adoption, exploited weak labor regulations and consumer protections to cut costs (wages) and increase revenues (fees, prices), then obscure what was going on behind the self-adopted label of “tech company,” an app, and a host of secret algorithms. These legal challenges are a step in the right direction to end that widespread exploitation, reverse its damage, and start rebuilding a (public) transit system that prioritizes those who use it (and operate it) over those who dream of owning it and making billions off of it.

While Uber’s false classification of their workforce as independent contractors is coming undone thanks to the work of labour organizers and the integrity of the judiciary, the next key battle is algorithmic transparency.

Uber offers a glimpse into the future of work: highly automated, and subject to acute algorithmic manipulation.

Uber is not alone, though; an entire sector of companies has emerged, often finding success via the phrase “We’re the Uber of X.”

We’re getting quite used to our algorithmic overlords. We’ve ceded, for the most part, that complex and invisible rulesets determine who will see our missives, travel pics, and RT dunks. More substantially, millions of workers now toil, essentially, for algorithms, whether via Uber, Lyft, Postmates, or the like. And the DoorDash tipping fiasco that unfolded this week highlights how increasingly dangerous this is—both in terms of the worker exploitation that nebulous algorithmic employment allowed for in the first place, and in the fractious and sometimes surprising nature of the fallout.

When DoorDash, which, with 400,000 contract workers is the largest on-demand food delivery service in the nation, faced fresh criticism over its deceptive tipping policies—the app used tips from consumers to pay out the minimum delivery fee it promised its gig workers, called ‘Dashers’, instead of letting them keep the whole tip themselves, essentially putting the tip directly in DoorDash’s coffers—it finally capitulated. After six months of refusing to do so, CEO Tony Xu announced on Twitter he’d be changing the policy.
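The mechanics of that old policy can be sketched in a few lines. The guaranteed minimum, base pay, and tip amounts below are invented for illustration, not DoorDash's actual figures; the point is only that under the old model a tip mostly offset the company's cost rather than reaching the Dasher on top of their pay:

```python
# Sketch of a tip-absorbing payout model vs. a tip-on-top model.
# All dollar amounts are hypothetical, chosen only to show the mechanism.

def old_model(guarantee, base_pay, tip):
    """Tips counted toward the guaranteed minimum; the company only topped
    up the difference (never paying less than its base contribution)."""
    company_pays = max(base_pay, guarantee - tip)
    return company_pays + tip  # what the Dasher actually receives

def tip_on_top_model(guarantee, tip):
    """Post-change model: the tip is added on top of the company's payout."""
    return guarantee + tip

# A $6.85 guarantee, $1.00 base pay, and a $5.00 customer tip:
print(old_model(6.85, 1.00, 5.00))   # the Dasher still gets only the guarantee
print(tip_on_top_model(6.85, 5.00))  # the tip actually increases their pay
```

Note that under the old model the Dasher receives the same total whether the customer tips $0 or $5; only the company's contribution changes.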

I find it fascinating to read these articles from a year ago, as pre-pandemic does seem like a lifetime ago. However, it is worth reflecting on just how much the pandemic has been a boon to companies like DoorDash. Their business model was still being tested pre-pandemic, whereas now, for some, they have become essential services.

Perhaps, as with all essential workers, we should be taking a closer look at their working conditions and their wages, and supporting calls to improve both.

This is why efforts to organize against algorithmic managers are so crucial, in the present and for the future.

Now, there are a lot of workers, from Uber drivers to Dashers, who are organizing for change. Groups like Gig Workers Rising are gaining steam. But the blowback on Dasher forums highlights the steepness of their challenge: By imparting the rules and expectations of the job onto a faceless algorithm—rules that govern pickup rates, bonus terms, and payscale, and rules that are always wildly in flux—on-demand app companies have fundamentally altered how workers perceive and engage with the authorities that manage them.

They’ve steered the focus onto the nature and fairness of the algorithm—which workers must spend their own time and resources dissecting and strategizing against—as opposed to the nature and fairness of the bosses who wrote it and deployed it. The phenomenon jibes with the other vagaries of digital algorithms we’ve accepted as a fact of modern life lived from platform to platform—all of whose expansiveness can make them feel daunting if not impossible to change.

This state of rampant uncertainty and inscrutability is precisely how DoorDash—and nearly every other app-based company that uses an algorithm to connect independent contractors to low-paid gig work—prefers it. On-demand app workers have more or less been made to surrender any expectations of transparency or reliability to the opaque, proprietary algorithms that define how much work they get and how much they ultimately earn. (Talk to any Uber driver long enough—even generally satisfied ones—and you’ll hear gripes about shifting bonus goalposts or mysterious pickup logic.)

It is worth stating, though, that it is not just companies in the gig or sharing economy that are adopting these practices. Algorithms as managers and supervisors are proliferating throughout the workforce, perhaps even more so due to the pandemic.


We can and must learn from these preliminary algorithmic bosses, in particular by recognizing that as automation is adopted for the purpose of managing humans, it must be transparent and accountable.
