Self-driving cars are often the victims in hit-and-run incidents

SAN FRANCISCO — Around 1 a.m. on March 6, a driverless car from the tech startup Cruise was trying to make a left turn when it encountered something its algorithms probably couldn’t predict: an Infiniti Q50 performing “donuts,” a popular and unlawful pastime for some of the city’s night owls, in the middle of the intersection.

The two vehicles collided head-on, according to a report that the company later sent to state authorities. Cruise said its vehicle suffered moderate damage, but that no one was injured. The experimental car had no driver at the time — an increasingly common sight in San Francisco — as part of an ongoing test of late-night robotaxis.

Whether the Infiniti driver suffered any damage or injuries isn’t clear. They didn’t stick around.

It was the latest example of a pattern bedeviling tech companies that are trying to make driverless cars a reality: hit-and-run crashes seemingly caused by human drivers, according to a review by NBC News of collision reports filed with the California Department of Motor Vehicles.

The reports, which were written by employees of the tech companies, describe 36 instances in 2022 in which a person driving a car or truck left the scene of a crash involving their vehicle and an autonomous vehicle. The problem has continued at a similar pace this year, with seven examples as of early March.

Driverless cars operate in only a handful of fair-weather cities. Cruise, a subsidiary of General Motors, operates a nightly driverless taxi service in San Francisco, and Waymo, which shares a parent company with Google, has a similar service in Phoenix. Other companies including Apple, Mercedes-Benz and Amazon subsidiary Zoox are running tests in places such as California, Florida and Texas.

The hit-and-runs pose a problem for driverless technology and its future: Even when self-driving cars are programmed to do everything right, it can be hard to avoid the mistakes of human drivers.

“This is popping up more and more in San Francisco,” said Anderson Franco, a personal injury attorney in the city.

Franco said that people involved in a crash have an obligation to at least stop and exchange information. He said there are a variety of reasons why people might leave the scene instead, including that they don’t have insurance, they’re scared of the consequences or they don’t know how to contact Cruise or its competitor Waymo to exchange information.

“My best guess is that the drivers think they can’t be held liable,” Franco said.

“If you are operating your own vehicle and you crash into an autonomous vehicle, the correct thing to do is take photographs, call the police and have it documented,” he said.

But it’s not always clear from the outside of a Cruise or other autonomous vehicle (AV) what to do if there’s a problem. Cruise said in a statement to NBC News that it was in the process of making its phone number more prominently displayed on the outside of vehicles, so drivers in a crash know who to call. A Cruise vehicle seen Tuesday by NBC News had no such information displayed. Company employees can also connect to a car remotely and communicate with people on-scene through a speaker system.

“Most people want to do the right thing and exchange contact information but given interacting with an AV is novel for many people we want to ensure they have an easily identifiable contact number displayed on the outside of the AV,” a Cruise spokesperson said.

At least three hit-and-run incidents involving self-driving cars in San Francisco resulted in injuries, according to the collision reports. In one example last May, two Cruise workers reported back injuries after a BMW rear-ended their vehicle, which was in autonomous mode and stopped at a red light. In all three cases, the drivers of the other cars left the scene without exchanging information, the reports say.

In another case, a Cruise vehicle in autonomous mode with two company workers inside was rear-ended twice by a Honda driver in Golden Gate Park, according to a collision report. The Cruise car was stopped at a red light at around 3:53 a.m. on a Tuesday in August when the Honda driver bumped it from behind; then, the report continues, the Honda driver reversed several feet, stopped and drove forward again, making contact with the Cruise vehicle a second time. The two collisions damaged the back of the Cruise vehicle and injured the two Cruise workers inside, the report says.

“The driver of the other vehicle left the scene without exchanging information,” the report says.

The human drivers who have hit autonomous vehicles appear to be getting away with little accountability. Autonomous vehicles are usually equipped with a variety of external cameras that could record the license plate numbers of hit-and-run drivers, but it’s not clear how often the companies have gone down that road.

Sgt. Adam Lobsinger, a spokesperson for the San Francisco Police Department, said he didn’t know if the department had pursued criminal charges in any of the hit-and-runs involving autonomous vehicles, but he said the department’s policy is to look into the collisions.

“It is our current policy to document and investigate all collisions involving autonomous vehicles,” he said in a statement.

“Anyone involved in a collision with an autonomous vehicle should call 911 and remain on scene,” he added. The SFPD has its own three-page guide for how law enforcement should deal with a driverless vehicle.

Cruise said in a statement that the hit-and-runs are usually minor. It said it works with San Francisco police “when necessary” and searches its videos for the license plate numbers of other cars “if needed.” Cruise declined to comment on specific cases.

Waymo said it has kept its options open about how to respond to hit-and-runs.

“In instances when Waymo has been the victim of a crime, we may report the event to law enforcement,” the company said in a statement, adding that it may provide information such as photos to police. It declined to comment on specific cases.

As more driverless cars roll out, “it would be expected that the number of interactions with other vehicles will increase, based on statistical probability alone,” Waymo said.

In California, leaving the scene of a collision where there was damage can be prosecuted as a misdemeanor — or, if someone was injured, prosecuted as a felony punishable by up to five years in prison.

The California DMV, which regulates autonomous vehicles, said it was up to local law enforcement to determine if anyone has broken the law. The DMV added that, because of the limited data available, “it’s unclear if the rate of hit-and-run incidents involving AVs is higher or lower than the rate involving conventional vehicles.”

The National Transportation Safety Board, which has investigated other safety issues with autonomous technology, said it has not looked at hit-and-runs.

Roger McCarthy, an engineering consultant in Palo Alto, California, who has studied the collision reports of autonomous vehicles, said that human drivers are “overwhelmingly at fault” when there’s a collision between them and a driverless car such as a Cruise or Waymo. Many of the collisions are rear-end crashes, where the human driver misjudged whether the autonomous vehicle would proceed, he said.

“AVs don’t behave in an intuitive manner,” he said. For example, the vehicles don’t roll through stop signs and they obey speed limits.

But McCarthy said he’s sympathetic to the human drivers who are part of a real-world experiment — one that state regulators OK’d, but that drivers didn’t consent to. Last year, the San Francisco Examiner published a guide on what to do if you get in a collision with a driverless car, but the hit-and-run incidents have gone on.

“I suspect if you stopped the average driver and asked them, they wouldn’t have a clue what to do in that situation,” he said of collisions with a driverless car.
