Plans for self-driving cars have a pitfall: the human brain

Photo by Antonio_Diaz/Getty Images; St. George News

WASHINGTON (AP) — Experts say the development of self-driving cars over the coming decade depends on an unreliable assumption by many automakers: that the humans in them will be ready to step in and take control if the car’s systems fail.

Instead, experience with automation in other modes of transportation like aviation and rail suggests that the strategy will lead to more deaths like that of a Florida Tesla driver in May.

Decades of research shows that people have a difficult time keeping their minds on boring tasks like monitoring systems that rarely fail and hardly ever require them to take action. The human brain continually seeks stimulation. If the mind isn’t engaged, it will wander until it finds something more interesting to think about. The more reliable the system, the more likely it is that attention will wane.

Automakers are in the process of adding increasingly automated systems that effectively drive cars in some or most circumstances, but still require the driver as a backup in case the vehicle encounters a situation unanticipated by its engineers.

Tesla’s Autopilot, for example, can steer itself within a lane and speed up or slow down based on surrounding traffic or on the driver’s set speed. It can change lanes with a flip of its turn signal, automatically apply the brakes, or scan for parking spaces and parallel park on command.

Joshua Brown, a 40-year-old tech company owner from Canton, Ohio, who was an enthusiastic fan of the technology, was killed when neither he nor his Tesla Model S sedan’s Autopilot braked for a truck making a left turn on a highway near Gainesville, according to federal investigators and the automaker.

Kaushik Raghu, senior staff engineer at Audi, is reflected in the passenger side visor mirror while demonstrating an Audi self-driving vehicle on I-395 expressway in Arlington, Virginia, Friday, July 15, 2016 | Associated Press photo by Pablo Martinez Monsivais, St. George News

Tesla warns drivers to keep their hands on the wheel even while Autopilot is driving; otherwise, the vehicle will automatically slow to a stop. A self-driving system Audi plans to introduce in its 2018 A7, which the company says will be the most advanced on the market, monitors drivers’ head and eye movements, and automatically slows the car if the driver’s attention is diverted.

But Brown’s failure to brake means he either didn’t see the truck in his path or saw it too late to respond — an indication he was relying on the automation and his mind was elsewhere, said Missy Cummings, director of Duke University’s Humans and Autonomy Laboratory. The truck driver said he had heard a “Harry Potter” video playing in the car after the crash.

“Drivers in these quasi- and partial modes of automation are a disaster in the making,” Cummings said. “If you have to rely on the human to see something and take action in anything less than several seconds, you are going to have an accident like we saw.”

Operators — an airline pilot, a train engineer or car driver — can lose awareness of their environment when they turn control over to automation, said Rob Molloy, the National Transportation Safety Board’s chief highway crash investigator.

He pointed to the crash of Air France Flight 447 into the Atlantic Ocean while flying from Brazil to France in 2009. A malfunction in equipment used to measure air speed caused the plane’s autopilot to disconnect, catching the pilots by surprise. Confused, they caused an otherwise flyable plane to stall and fall from the sky, killing 228 people.

Planes and trains have had automation “for 20, 30 years and there are still times when they’re like, ‘Wow, we didn’t expect that to happen,'” Molloy said.

Part of the problem is that overconfidence in the technology causes people to think they can check out. Not long after Tesla introduced its Autopilot system, people were posting videos of cars with the self-driving mode engaged cruising down tree-lined roads or even highways with no one in the driver’s seat. Brown, for example, had posted videos lauding the Autopilot system and demonstrating it in action.

“There is a tendency of people to take one ride in one of these vehicles and then conclude that because they have not crashed over the course of 10 minutes that the system must be ready,” said Bryant Walker Smith, a University of South Carolina professor who studies the technology.

Some experts think the ability of people to monitor autonomous systems may be getting worse. With the advent of smartphones, people are accustomed to having their desire for mental stimulation satisfied immediately.

“Go into Starbucks, for example,” said Cummings. “No one can just patiently wait in line, they’re all doing something on their phones. It’s kind of pathetic.”

Some automakers may be rethinking their approach. Two years ago, General Motors announced it would start selling a Cadillac in the fall of 2016 that would almost drive itself on freeways. But in January the company confirmed that the project had been delayed for an unspecified reason.

In briefings, company executives said they were waiting to perfect methods of assuring that the driver pays attention to the road even when the system is on.

The system, called “Super Cruise,” will use cameras and radar to keep the car in the center of a lane and also stay a safe distance behind cars in front of it. The system will bring the car to a complete stop without driver action if traffic halts, and it can keep the car going in stop-and-go traffic. But it’s designed for use only on limited-access divided highways.

Google, meanwhile, is aiming for a car that’s fully self-driving and may not even have a steering wheel or brake pedals.

Written by JOAN LOWY, Associated Press. Associated Press writer Tom Krisher contributed to this report from Detroit.


Copyright 2016 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.



  • Don Bagley July 19, 2016 at 2:17 pm

    It’s always been said that the most dangerous part of a car is the nut behind the wheel.

  • Chris July 19, 2016 at 4:14 pm

    The human brain is fine…when it’s trained. Common sense, however, is the human’s BIGGEST pitfall. Without it, the brain is worthless.

  • DB July 19, 2016 at 5:02 pm

    Autopilots can be both good and bad. The article referred to airline flying, which I can relate to. An engaged autopilot can free up a part of your mind to tend to situational awareness (where are we, what’s going on around us, what is our ‘plan B’). I don’t think we’re quite there yet, but the same will be applied to autos, providing you aren’t using the time to text or eat your lunch.

  • ladybugavenger July 19, 2016 at 8:06 pm

    Can I please have a coffee maker brewing fresh coffee inside the car?

    • .... July 19, 2016 at 10:48 pm

      Yeah a pop up toaster and a microwave would be nice !

    • DB July 20, 2016 at 3:12 pm

      A 12 volt ‘Mr Coffee’ would be just fine. Just promise not to sue if you spill it in your lap. Use your ‘sippy cup’…

  • .... July 19, 2016 at 10:51 pm

    This article was published because of brain dead moments by RealLowlife and dumbob

  • Common Sense July 20, 2016 at 6:28 am

What’s the point of a self-driving car if you have to sit there and babysit it? This may be the perfect car for all those back seat drivers.

    • .... July 20, 2016 at 9:20 am

      Yeah perfect for RealLowlife and dumbob
