Self-driving cars take traffic laws, such as stop signs or speed limits, literally and follow them to a T. Humans? Not so much.
So while the public may fear the menace of rogue autonomous vehicles failing to recognize them and causing crashes, the reality is quite different: it's actually human drivers who pose much of the risk, failing to come to a full stop at stop signs or growing impatient with a slow-footed robot car and causing accidents with it, Bloomberg reports.
The accidents typically occur at intersections rather than in free-flowing traffic, and at low speeds with no injuries. In California, the only state that requires reports when autonomous vehicles are involved in accidents, self-driving cars have been rear-ended 13 times since the beginning of 2016, out of 31 total collisions involving autonomous cars. The results have autonomous vehicle companies working on ways to get their vehicles to drive more naturally and intuitively amid human-driven traffic.
"They don't drive like people. They drive like robots," Mike Ramsey, an analyst at Gartner who specializes in advanced automotive technologies, told Bloomberg. "They're odd, and that's why they get hit."
Said Karl Iagnemma, CEO of the self-driving software developer NuTonomy: "You put a car on the road which may be driving by the letter of the law, but compared to the surrounding road users, it's acting very conservatively. This can lead to situations where the autonomous car is a bit of a fish out of water."
He added: "If the cars drive in a way that's really distinct from the way that every other motorist on the road is driving, there will be in the worst case accidents and in the best case frustration. What that's going to lead to is a lower likelihood that the public is going to accept the technology."
Just wait until these robot cars encounter the real menace: teenage drivers.