By David Shepardson
WASHINGTON (Reuters) – U.S. auto safety regulators on Thursday said they were opening a formal regulatory proceeding that could eventually result in the adoption of new safety standards for autonomous vehicles.
The National Highway Traffic Safety Administration (NHTSA) said it was issuing an advance notice of proposed rulemaking to get public input on how to ensure the safety of future self-driving vehicles. Companies like General Motors Co, Alphabet Inc’s Waymo and Tesla Inc are working on vehicles that can drive themselves.
“This rulemaking will help address legitimate public concerns about safety, security and privacy without hampering innovation in the development of automated driving systems,” said U.S. Secretary of Transportation Elaine Chao in a statement.
NHTSA said the proceeding could result in anything from new guidance documents addressing industry best practices and consumer information to formal regulations, including rules requiring reporting and disclosures as well as new legally binding safety standards for automated driving systems. Any final rules are still likely years away.
The agency said it is focused on key primary functions of self-driving systems, including how they use sensors, detect other road users, plan routes, decide how to respond appropriately to road users and execute driving functions.
NHTSA seeks input to develop “a framework that meets the need for motor vehicle safety and assesses the degree of success in manufacturers’ efforts to ensure safety,” it said.
The National Transportation Safety Board (NTSB) has faulted NHTSA for adopting what it called “a nonregulatory approach to automated vehicle safety” and said the agency has failed to develop a method for verifying that manufacturers of “partial automation systems are incorporating system safeguards.”
On Thursday, NHTSA said it “has no desire to issue regulations that would needlessly prevent the deployment of any (automated-driving system)-equipped vehicle” adding “an ill-conceived standard may fail to meet the need for motor vehicle safety and needlessly stifle innovation.”
NTSB has criticized NHTSA in investigations of fatal crashes involving an Uber self-driving test vehicle and another involving a California driver killed while using Tesla’s driver assistance system Autopilot.
There are also no regulations governing the performance of systems like Autopilot that allow drivers to keep their hands off the wheel for extended periods, but NHTSA can demand a recall if it believes any vehicle poses an unreasonable risk to safety.
NHTSA’s special crash unit is investigating a dozen Tesla crashes in which it suspects Autopilot or some other advanced driver assistance system was in use.
(Reporting by David Shepardson; Editing by Chizu Nomiyama and Tom Brown)