By Hyunjoo Jin
SAN FRANCISCO (Reuters) - A Tesla Model S car was in “Full Self-Driving” mode when it hit and killed a 28-year-old motorcyclist in the Seattle area in April, police said, making it at least the second fatal accident involving the technology on which Tesla CEO Elon Musk is pinning his hopes.
The 56-year-old driver was arrested on suspicion of vehicular homicide based on his admission that he was looking at his cell phone while using the driver-assistance feature, the police said in a statement.
Tesla says its “Full Self-Driving (Supervised)” software requires active driver supervision and does not make vehicles autonomous.
Previously, the National Highway Traffic Safety Administration (NHTSA) said there was one fatal accident involving a Tesla vehicle using FSD software between August 2022 and August 2023.
That case remains under investigation, but experts say Tesla’s technology, which depends on cameras and artificial intelligence, has limitations. Rivals such as Alphabet’s Waymo also use expensive sensors such as lidar to detect the driving environment.
“There are so many things that can go wrong” with Tesla’s camera-only system, said Guidehouse Insights analyst Sam Abuelsamid. For instance, he said it can inaccurately measure how far away an object is.
“It is extremely challenging to collect and curate data from all sorts of real-world elements such as motorcycles and bicycles in the broad range of possible weather, lighting, road and traffic conditions,” said Raj Rajkumar, a professor of electrical and computer engineering at Carnegie Mellon University.
This year, Musk shelved Tesla’s all-new affordable cars and increased his bets on self-driving vehicles, saying he would be shocked if Tesla could not achieve full self-driving capability next year.
Speaking in an interview with the Tesla Owners of Silicon Valley club last weekend, he said a future vehicle will be like a “tiny mobile lounge” where drivers will be able to watch movies, play video games, work and even drink and sleep.
Musk has been aiming to achieve self-driving capability for several years, with the technology under growing regulatory and legal scrutiny.
NHTSA began a probe of Autopilot in August 2021 after identifying more than a dozen crashes in which Tesla vehicles had hit stationary emergency vehicles, and reviewed hundreds of crashes involving Autopilot.
In December 2023, Tesla was forced to recall nearly all its vehicles on U.S. roads to add safeguards to the software.
(Reporting by Hyunjoo Jin; Editing by Kevin Liffey and David Gregorio)