
When Your Boss Is an Uber Algorithm

Researchers examine how Uber steers its drivers’ behavior with its automated management system, despite its promise of letting you be “your own boss.”
December 1, 2015

It sends texts with last-minute requests for extra shifts and won’t consider requests for a raise. Yes, the software that manages drivers for Uber has a few things in common with unpopular human bosses.

Researchers say the company’s automated management system, and that of its competitor Lyft, creates new dynamics between workers and their paymasters that should be scrutinized. A paper presented last week by Data & Society, a nonprofit research institute in New York City, recounts ways that Uber’s system steers its drivers’ behavior and suggests that regulators may need to consider them.

Uber calls its drivers “independent contractors” and solicits new ones with the promise of being “your own boss.” Alex Rosenblat, a researcher at Data & Society and coauthor of the new paper, interviewed a handful of drivers, talked less formally with many others from the back of an Uber, and monitored nearly a year of discussions by drivers in online forums. She says the reality of being an Uber contractor is different. “Uber does control a lot of the ways that drivers behave on the job,” she says.

Drivers usually have to meet a human when they first sign up to drive for Uber or Lyft. After that they interact with an automated management system primarily delivered by a mobile app.

When a driver is logged in, the app assigns them pickup requests from people nearby. The system metes out feedback by tracking the proportion of pickups a driver accepts (Lyft and Uber each give drivers 15 seconds to decide), and averaging the rating that passengers give their driver after a ride. Drivers can be suspended for not accepting enough rides, or for low passenger ratings; they are incentivized to work at particular times, or in particular places, by “surge pricing” that temporarily raises fares.
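To make the dynamic concrete, here is a minimal Python sketch of the kind of feedback loop described above. Only the 15-second decision window and the use of acceptance rates and passenger-rating averages come from the reporting; every name and threshold below is a hypothetical assumption for illustration, not Uber's or Lyft's actual implementation.

```python
# Hypothetical sketch of an automated driver-management loop.
# The 15-second window, acceptance-rate tracking, and rating averages are
# reported in the article; the thresholds and names are assumptions.

from dataclasses import dataclass, field

DECISION_WINDOW_SECONDS = 15   # reported window for accepting a pickup request
MIN_ACCEPTANCE_RATE = 0.80     # hypothetical suspension cutoff
MIN_AVERAGE_RATING = 4.6       # hypothetical suspension cutoff

@dataclass
class DriverRecord:
    offers_received: int = 0
    offers_accepted: int = 0
    ratings: list = field(default_factory=list)

    @property
    def acceptance_rate(self) -> float:
        return (self.offers_accepted / self.offers_received
                if self.offers_received else 1.0)

    @property
    def average_rating(self) -> float:
        return sum(self.ratings) / len(self.ratings) if self.ratings else 5.0

def offer_ride(driver: DriverRecord, accepted_within_window: bool) -> None:
    """Record whether the driver accepted a pickup request in time."""
    driver.offers_received += 1
    if accepted_within_window:
        driver.offers_accepted += 1

def record_passenger_rating(driver: DriverRecord, stars: int) -> None:
    """Append the rating a passenger gives after a ride."""
    driver.ratings.append(stars)

def should_suspend(driver: DriverRecord) -> bool:
    """Flag drivers whose metrics fall below the (hypothetical) cutoffs."""
    return (driver.acceptance_rate < MIN_ACCEPTANCE_RATE
            or driver.average_rating < MIN_AVERAGE_RATING)
```

The point of the sketch is simply that once acceptance rates and ratings feed automatic suspension rules, those metrics operate as management directives even though no human boss issues them.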

Some passengers dislike surge pricing, but Uber’s CEO Travis Kalanick has defended it as a straightforward mechanism to reconcile supply and demand (see “In Praise of Efficient Price Gouging”).

Drivers sometimes receive messages urging them to work on a particular day at a particular time because heavy demand is forecast, for example. They also report receiving surveys asking exactly when they plan to drive in the coming days.

Rosenblat says that shows Uber doesn’t just let the market decide. Drivers have to determine how much they trust the company to help them earn more—and many resent or resist Uber’s nudges, she says.

“Uber says it wants to act as a neutral intermediary that connects supply and demand with an automated mechanism for finding the right price,” says Rosenblat. “It’s difficult to argue that you’re a neutral platform if you’re actively trying to manage supply and demand.”

Uber also uses its passenger feedback system as a lever on driver behavior, says Rosenblat. Drivers are highly motivated to keep their score up to avoid being cut off. The company sometimes sends out e-mails with very specific advice about what drivers can do—such as offering snacks or not talking about other business interests—to earn positive ratings.

Rosenblat says regulators should consider whether the influence that systems like Uber’s exert over workers’ behavior could go too far. She says the U.S. Federal Trade Commission has shown an interest in her research. She worked with Luke Stark of New York University, and presented their paper at a conference on labor in the on-demand economy at the Centre for European Policy Studies in Brussels, Belgium, last week.

Min Kyung Lee, a research scientist at Carnegie Mellon University’s Center for Machine Learning, published her own study of Uber and Lyft drivers earlier this year.

She found that much of the time they were happy with the “algorithmic management” that assigned fares and raised rates during busy periods. But drivers also complained that they were sometimes pushed to do things that seemed unreasonable, such as make pickups that weren’t nearby.

Lee and Rosenblat both say they have had friendly interactions with Uber, which has shown an interest in their research. Uber spokesperson Jessica Santillo says the company provides information to drivers to help them grow their income, and that they are free to ignore any suggestions. “Drivers use Uber on their own terms; they control their use of the app,” she says.

Uber is facing a class action lawsuit from people who have driven for the company, one that hinges on how much control it exerts over drivers. The suit, heading to trial next summer, alleges that Uber has enough control that drivers should be considered employees rather than contractors, with the attendant benefits and legal standing.

Rosenblat says that awareness of that looming legal problem has made some drivers more likely to question their relationship with Uber. “As drivers have become more cognizant of the independence they should have as contractors, they’ve started to ask things like, ‘Why can’t I set the base rate for a fare?’” she says. Uber’s minimum fares are as low as $5—before Uber’s cut—in some U.S. cities.

Lee says such tensions could become more widespread as Uber-style companies become more powerful, and automated management perhaps spreads to other domains. “I see a lot of good things in using algorithms like this, but at the same time we need to be careful how we design them,” says Lee.
