New California Autonomous Vehicle Rules Make Video Game Out of Testing Robot Cars, Threatening Highway Safety, Consumer Watchdog Says
SANTA MONICA, Calif., Feb. 26, 2018 /PRNewswire-USNewswire/ -- New California autonomous vehicle regulations released today by the Department of Motor Vehicles will turn testing robot cars into a deadly video game that threatens highway safety, Consumer Watchdog warned.
"A remote test operator will be allowed to monitor and attempt to control the robot car from afar," said John M. Simpson, Consumer Watchdog Privacy and Technology Project Director. "It will be just like playing a video game, except lives will be at stake."
Previous regulations required a test driver in the robot car who was capable of taking over when the self-driving technology failed.
The new regulations also provide for the ultimate deployment of robot cars. Consumer Watchdog said that those rules fail to require public reports of all crashes involving autonomous vehicles.
"Crash reports and disengagement reports give the public a window into the state of self-driving technology," said Simpson. "It's essential that information be disclosed when robot cars are ultimately deployed for public use on our highways."
Consumer Watchdog said that mandatory "disengagement reports" filed with the DMV show the self-driving technology is not safe enough to be deployed without a human driver in the car capable of taking control when necessary.
Twenty companies with permits to test robot cars in California were required to file "disengagement reports" covering 2017, listing miles driven in autonomous mode and the number of times the robot technology failed. The reports were released Jan. 31. Nine of those companies, including Waymo (a subsidiary of Google's parent company, Alphabet) and GM Cruise, offered specific data showing the reasons their robot technology failed.
Read the 2017 disengagement reports here: https://www.dmv.ca.gov/portal/dmv/detail/vr/autonomous/disengagement_report_2017
Waymo said that its robot car technology disengaged 63 times, or once every 5,596 miles, because of deficiencies in the technology, not "extraneous conditions" such as weather, road construction, or unexpected objects, as is often presumed. The most common reasons human test drivers had to take control of a robot car were deficiencies in hardware, software, and perception, Waymo's report said.
GM's Cruise division, which claims it will put robot cars on the road for public use in 2019, logged the second-most miles of the companies required to report on their testing. Its cars drove a total of 131,675 miles and had 105 disengagements, or one every 1,254 miles.
GM Cruise's report revealed that its robot cars cannot correctly predict the behavior of human drivers: 44 of the 105 disengagements (about 42%) were cases where GM Cruise's technology failed while trying to respond to other drivers on the road.
All other companies that released specific data detailing the reasons for their disengagements, including Nissan and Drive.ai, a technology startup partnered with Lyft, reported experiences consistent with Waymo's and GM Cruise's. Nissan said it tested five vehicles, logged 5,007 miles, and had 24 disengagements, or one every 209 miles. Drive.ai had 151 disengagements in the 6,572 miles it logged, or one every 44 miles.
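For readers who want to check the arithmetic, the per-mile rates quoted above follow directly from the totals in the DMV filings. The short Python sketch below is our illustration, not part of any company's report; it simply reproduces the cited figures from the reported miles and disengagement counts.

```python
# Minimal sketch: reproduce the miles-per-disengagement rates quoted above
# from each company's reported 2017 totals (our code, not from the filings).

reports = {
    # company: (autonomous miles driven, disengagements reported)
    "GM Cruise": (131_675, 105),  # -> one disengagement every ~1,254 miles
    "Nissan":    (5_007,   24),   # -> one every ~209 miles
    "Drive.ai":  (6_572,   151),  # -> one every ~44 miles
}

for company, (miles, disengagements) in reports.items():
    print(f"{company}: one disengagement every {miles / disengagements:,.0f} miles")

# Waymo's total mileage is not stated above; its 63 disengagements at one
# every 5,596 miles imply roughly 63 * 5,596 = 352,548 autonomous miles.
print(f"Waymo implied miles: {63 * 5_596:,}")

# GM Cruise: 44 of its 105 disengagements involved failures responding to
# other drivers -- about 42% of the total.
print(f"GM Cruise share tied to other drivers: {44 / 105:.0%}")
```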
Visit our website at www.consumerwatchdog.org
SOURCE Consumer Watchdog