In California tests, self-driving cars still need human help

Friday, January 8, 2016

LOS ANGELES (AP) — Futuristic self-driving cars traveling along California roads have needed plenty of old-fashioned human intervention to stay safe.

California's Department of Motor Vehicles on Tuesday released reports filed by the seven companies the agency has permitted to test prototype vehicles on public roads. The documents summarized instances in which a human driver had to take over because of technology problems or other safety concerns.

The reports show wildly different levels of success since on-road testing started in September 2014.

Experts in the technology said Google, whose cars drove the most by far, performed relatively well, though they also cautioned that the testing typically happened during good weather. Other companies reported frequent instances in which the person who is required to be in the front seat — just in case — had to grab the wheel.

Nissan, for example, tested just 1,485 miles on public roads but reported 106 cases in which the driver had to take control. The automaker has said it plans to have "commercially viable autonomous drive vehicles" by 2020. A Nissan spokeswoman did not respond to a request for comment.

Google said its cars needed human help 341 times over 424,000 miles. That would be the equivalent of about 10 times per year, given the 12,000 miles the average U.S. vehicle travels annually.
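That annualized estimate is a simple proportion using the figures above: 341 interventions ÷ 424,000 miles × 12,000 miles a year works out to roughly 9.7, or about 10 interventions per year of typical driving.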

In 11 of the 341 instances, Google said, its cars would have gotten into a crash had the driver not taken over.

The head of the company's self-driving car project said that while the results are encouraging, they also show the technology has yet to reach his goal of not needing someone behind the wheel.

"There's none where it was like, 'Holy cow, we just avoided a big wreck,'" said Chris Urmson, Google's self-driving car project leader.

"We're seeing lots of improvement. But it's not quite ready yet," Urmson said. "That's exactly why we test our vehicles with a steering wheel and pedals."

The California Department of Motor Vehicles, which is writing new regulations for the technology, said it was still reviewing the reports.

Google reported 272 cases in which the cars' software or onboard sensors failed. Though Google did not release detailed scenarios, the problems included trouble perceiving traffic lights, failing to yield to pedestrians and committing traffic violations. There were also cases where intervention was needed because other drivers were reckless, and several dozen instances of an "unwanted maneuver" by Google's car.

Bryant Walker Smith, a professor at the University of South Carolina who closely follows self-driving car developments, said Google's rate of potential collisions was "not terribly high, but certainly not trivial." He said it remains difficult to compare self-driving cars' accident rates with those of human drivers, since even the best data omit the many minor collisions that are never reported to authorities.

While Google's problem rate is "impressively low," a trained safety driver should remain in the front seat, said Raj Rajkumar, an engineering professor at Carnegie Mellon University who specializes in self-driving cars.

According to data in Google's report, a driver typically took control within one second of the car asking for help.

Drivers at other companies often reacted quickly as well, according to their reports, though Volkswagen Group of America reported that in one case more than 12 minutes passed before the person took control of one of its test Audis. Audi of America spokesman Brad Stertz said he was gathering details on the incident but believed it involved a software glitch, possibly a false reading, that did not affect public safety.

John Simpson, a frequent critic of Google who focuses on privacy issues for the nonprofit group Consumer Watchdog, said the company's report "underscores the need for a driver behind the steering wheel capable of taking control of the robot car."

Google has argued to California regulators that once the company concludes the cars are ready for the public to use, they should not need a steering wheel or pedals because human intervention would actually make them less safe.

Google released its report Tuesday before the agency posted reports from other companies in what Google described as an effort to be transparent about its safety record. The company had lobbied against having to report disengagements in the first place, saying the data could be misinterpreted.

The other companies testing self-driving cars on California streets are Tesla Motors, Mercedes-Benz, and parts suppliers Bosch and Delphi.

Google's testing mostly involves driving around the company's Silicon Valley headquarters or the streets of Austin, Texas. The company's rate of human intervention has improved in recent months, according to its data, but Urmson cautioned that the rate might again rise as Google subjects the cars to more challenging environments and weather conditions.

Google said its cars would have been at fault in eight of the 11 potential crashes, according to computer modeling the company performed later. In two other cases, its cars would have hit a traffic cone.

Google cars have been involved in nine collisions since September 2014. In each case, the other car was responsible, according to an analysis by researchers at Virginia Tech.