77 Mass Ave

A Stethoscope for Cars

Phone app analyzes vehicle sounds to diagnose issues before you break down.
December 19, 2017
Leon Edler

Imagine hopping into a ride-share car, glancing at your smartphone, and telling the driver that the car’s left front tire needs air, its air filter should be replaced next week, and its engine needs two new spark plugs.

Within the next year or two, people may be able to get that kind of diagnostic information in just a few minutes, in their own cars or any car they happen to be in. They wouldn’t need to know anything about the car’s history or to connect to it in any way; the information would be derived from analyzing the car’s sounds and vibrations, as measured by a phone’s microphone and accelerometers.

The technology began as the doctoral thesis of MIT scientist Joshua Siegel, PhD ’16, who worked with mechanical engineering professor Sanjay Sarma. A smartphone app combining the various diagnostic systems the team developed could save the average driver $125 a year and slightly improve overall gas mileage, Siegel says.

With today’s smartphones, Siegel explains, “the sensitivity is so high, you can do a good job [of detecting the relevant signals] without needing any special connection.”

The basic idea is to provide diagnostic information that can warn the driver of upcoming issues or needed routine maintenance, before these conditions lead to breakdowns or blowouts. For example, an engine’s sounds alone can indicate how clogged the air filter is and when to change it. “We’re listening to the car’s breathing, and listening for when it starts to snore,” Siegel says. “As it starts to get clogged, it makes a whistling noise as air is drawn in. Listening to it, you can’t differentiate it from the other engine noise, but your phone can.”
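The intake-whistle idea can be illustrated with a short sketch: compare the audio energy in a narrow high-frequency band against the total energy of the recording. The band edges, variable names, and the synthetic "engine" signal below are illustrative assumptions, not details from the MIT work.

```python
import numpy as np

def whistle_strength(signal, sample_rate, band=(3000.0, 6000.0)):
    """Fraction of spectral energy in a narrow 'whistle' band.

    A clogging intake can add a faint high-pitched tone on top of the
    engine's broadband noise; a ratio that rises over successive
    recordings would suggest a filter gradually blocking up.
    """
    power = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return power[in_band].sum() / power.sum()

# Synthetic demo: engine-like noise, with and without a 4 kHz whistle.
rng = np.random.default_rng(0)
sr = 16_000
t = np.arange(sr) / sr            # one second of audio
noise = rng.normal(size=sr)       # stand-in for broadband engine noise
whistle = 0.5 * np.sin(2 * np.pi * 4000 * t)

clean = whistle_strength(noise, sr)
clogged = whistle_strength(noise + whistle, sr)
print(clogged > clean)            # the added tone raises the in-band ratio
```

In a real app the recording would come from the phone's microphone and the decision threshold would be learned from labeled data rather than hand-picked.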

Many of the diagnostics are derived by using machine learning to compare many recordings of sound and vibration from well-tuned cars with similar ones from cars that have a specific problem. The systems can then extract even very subtle differences. For example, algorithms designed to detect wheel balance problems did so more successfully than expert drivers from a major car company, Siegel says.
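The learn-from-labeled-recordings approach can be sketched minimally: reduce each recording to a coarse spectral fingerprint, average the fingerprints per condition, and classify a new recording by its nearest centroid. The feature choice, labels, and synthetic data here are assumptions for illustration; the team's actual pipeline is not described in the article.

```python
import numpy as np

def spectral_fingerprint(signal, n_bands=16):
    """Coarse log-energy spectrum: mean power in n_bands equal-width bands."""
    power = np.abs(np.fft.rfft(signal)) ** 2
    bands = np.array_split(power, n_bands)
    return np.log([b.mean() for b in bands])

def train_centroids(recordings_by_label):
    """Average fingerprint per condition label (e.g. 'healthy', 'faulty')."""
    return {label: np.mean([spectral_fingerprint(r) for r in recs], axis=0)
            for label, recs in recordings_by_label.items()}

def classify(signal, centroids):
    """Label whose centroid fingerprint is closest to this recording's."""
    fp = spectral_fingerprint(signal)
    return min(centroids, key=lambda lbl: np.linalg.norm(fp - centroids[lbl]))

# Synthetic demo: 'faulty' recordings carry an extra high-frequency tone.
rng = np.random.default_rng(1)
sr = 8_000
t = np.arange(sr) / sr
healthy = [rng.normal(size=sr) for _ in range(5)]
faulty = [rng.normal(size=sr) + np.sin(2 * np.pi * 3000 * t) for _ in range(5)]

centroids = train_centroids({"healthy": healthy, "faulty": faulty})
label = classify(rng.normal(size=sr) + np.sin(2 * np.pi * 3000 * t), centroids)
print(label)
```

A production system would use richer features and a trained model rather than nearest centroids, but the structure is the same: recordings from known-good and known-faulty cars define the classes, and a new recording is matched against them.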

A prototype smartphone app that incorporates all these diagnostic tools is being developed and should be ready for field testing in about six months, Siegel says. He has founded a startup company called Data Driven to commercialize it.


Illustration by Rose Wong
