Consumer Watchdog Tells DMV Not to Allow Robot Cars on Public Roads Until Feds Enact Safety Standards
Public Interest Group Backs Proposed California Regulations On Advertising and Privacy
SANTA MONICA, Calif., Oct. 19, 2016 /PRNewswire-USNewswire/ -- Consumer Watchdog today called on the California Department of Motor Vehicles to prohibit autonomous vehicles without a human driver capable of taking control until the National Highway Traffic Safety Administration (NHTSA) enacts enforceable standards covering the safety performance of robot cars.
NHTSA has proposed a voluntary safety checklist that contains no enforceable standards. The proposed DMV rules would require manufacturers to submit that federal checklist before testing or deploying robot cars. Consumer Watchdog testified that the checklist is inadequate to protect public safety on the roads, and that DMV must therefore prohibit driverless cars until enforceable Federal Motor Vehicle Safety Standards are in place.
"The proposed DMV rules would let robot cars without a driver on our roads if the manufacturer simply answers Yes, No or Maybe to each point on NHTSA's 15-point safety checklist," said Carmen Balber, executive director of Consumer Watchdog. "Absolutely no safety performance standards are required. We need more than a safety checklist written on toilet paper before we are sure driverless vehicles are safe to operate on public roads in California. That's why we're calling on the DMV to hold until federal regulators enact enforceable safety standards for driverless cars."
The nonpartisan, nonprofit public interest group applauded provisions of the DMV's new proposed autonomous vehicle regulations that would protect consumer privacy and limit advertising in vehicles using automated technology. The regulations were the subject of a public workshop today in Sacramento, at which NHTSA regulators testified.
The proposed DMV regulations offer consumers important privacy protections, said Consumer Watchdog. The rules would require that a manufacturer disclose how any information gathered by the vehicle would be used for purposes other than navigating the car, and require that consumers give consent to additional data collection. Use of the car could not be contingent on consumer consent.
The regulation's advertising provision would prohibit manufacturers of cars that use only some automated technologies from advertising their vehicles in a way that would leave the impression that a car is fully autonomous when it is not. The regulation specifically cites "self-driving," "automated," and "auto-pilot" as cause for concern.
"Tesla used humans as guinea pigs, killing at least two, while hyping its 'autopilot' technology. The proposed regulation would stop such abuses," said John Simpson, Consumer Watchdog Privacy Project Director.
Current California DMV regulations cover the testing of autonomous vehicles in California and require a licensed test driver who can take control when the robot technology fails. Another key requirement of that regulation is that manufacturers report all crashes involving their robot cars. They must also submit an annual "disengagement" report detailing all the times that the autonomous technology failed. Google, for example, reported its technology failed 341 times. There were 272 times that the software turned over control to the driver and 69 times when the driver felt compelled to override the robot system.
Consumer Watchdog said the regulations should be tweaked to require disengagement reports on a quarterly basis and that video and technical data associated with any crash should be made public. Police should investigate all crashes.
Last month NHTSA issued its 112-page "Federal Automated Vehicles Policy." A key element in the guidance is a 15-point safety assessment that manufacturers are asked to complete. The assessment has no standards, but merely asks manufacturers to say how they deal with such issues as where the robot car is supposed to operate, perception and response functionality of the robot technology, privacy, cybersecurity and ethical issues. At this point responding to the 15-point safety assessment is completely voluntary, though NHTSA says it plans a rulemaking to require a response.
"Requiring a response is completely inadequate," said Simpson. "There need to be real enforceable Federal Motor Vehicle Safety Standards that apply to robot cars. If NHTSA is sincere about driving technology, it should start a rulemaking immediately."
Under current NHTSA regulations, so-called Level 3 autonomous vehicles, which have a driver who can take over when the robot technology cannot handle the situation, could be deployed on the nation's highways. Level 4 or Level 5 robot cars with no steering wheel or pedals cannot be legally deployed unless NHTSA grants an exemption, because such vehicles would violate current Federal Motor Vehicle Safety Standards.
Visit Consumer Watchdog's website at: www.consumerwatchdog.org
SOURCE Consumer Watchdog