Tesla tempted drivers with 'insane' mode and now is tracking them to judge safety. Experts say it's ludicrous.

SAN FRANCISCO - Tesla pioneered driving modes called "Insane" and "Ludicrous" that maxed out the cars' acceleration, practically encouraging drag racing. The cars' automated systems have introduced new behaviors, too - some of them impolite, such as not reacting to signaled lane changes, or taking sharp curves at speed.

Now, Tesla wants its drivers to be nice. And it started testing them on it late last month.

Tesla is expanding the beta of its Full Self-Driving software, the most advanced suite of the company's driver-assistance features, this month. To qualify, drivers must first agree to let Tesla monitor their driving - an apparent effort to ensure the software goes only to the safest road users. The company scores them on five categories based on data collected by their cars, including when they are driving in Autopilot; they are penalized, for instance, for braking too hard. The rollout of the upgrades will be staggered in descending order of score, from 100 downward, Tesla CEO Elon Musk said on Twitter.

Musk laid out the terms of the sweepstakes: "FSD Beta (version) 10.2 rolls out Friday midnight to ~1000 owners with perfect 100/100 safety scores," he said. Early Saturday morning, Musk said there were a "few last minute concerns," and the build would need to be delayed; later, he said it would arrive Sunday night.
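
As a rough illustration of that ordering, the sketch below sorts a set of owner scores and releases the beta in waves from the highest scores downward. It is only a sketch of the idea described above: the owner names, scores and batch size are invented, and Tesla has not published how it groups drivers beyond Musk's description.

```python
# Illustrative only: owner IDs, scores and the batch size are invented for this sketch.
from typing import Dict, List

def rollout_waves(scores: Dict[str, int], batch_size: int = 1000) -> List[List[str]]:
    """Group owners into release waves, highest safety score first."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    return [ranked[i:i + batch_size] for i in range(0, len(ranked), batch_size)]

# Tiny example with a batch size of 2 instead of ~1,000.
waves = rollout_waves({"owner_a": 100, "owner_b": 97, "owner_c": 100, "owner_d": 88}, batch_size=2)
print(waves)  # [['owner_a', 'owner_c'], ['owner_b', 'owner_d']]
```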

One of the reasons Tesla wants good drivers to have access to Full Self-Driving is that humans help train its software how to drive. Bad human driving habits - like cutting people off, rolling through stop signs and tailgating - can be baked into the software, potentially replicating those habits across many more communities. It's another way Tesla is redefining the meaning of car ownership, turning the relationship into a two-way street: in exchange for new features, it unleashes thousands of drivers on public roads as beta testers. Those users are changing standard road behaviors and - if all goes to plan for Tesla - ushering in an autonomous future in the process.

Even Musk has previously admitted Full Self-Driving isn't completely ready. He said on a company earnings call this summer that a subscription was a "debatable" proposition for consumers, adding, "We need to make Full Self-Driving work in order for it to be a compelling value proposition."

Tesla did not respond to a detailed request for comment.

It has been a learning curve for drivers, some of whom have taken to Twitter to complain. The criteria on which they're being judged include hard braking; some say they are limiting the use of their brakes as a result. A Consumer Reports evaluation said the score system "could lead to unsafe driving."

Other drivers say the system has cut into their habits of flooring the cars from intersections, tailing others closely and barreling through turns. But some drivers complained the system is too sensitive.

"The lesson of Safety Score Beta is to gun every yellow light," wrote one Twitter user. If the car's regenerative braking doesn't slow you down first - recapturing energy and slowing the car - "you're gonna get dinged for hard braking."

Others have complained of being unfairly penalized for abiding by traffic rules - citing, for example, a seeming lack of "contextual awareness," as one user described it.

"I'm super annoyed," the user wrote to Tesla. "7 straight perfect days and today a light turned yellow just ahead and I had to tap the brakes ever so gently. ... Braking was safer than running the light."

The new rollout of Full Self-Driving has prompted concerns among regulators and safety investigators. National Transportation Safety Board Chair Jennifer Homendy told The Washington Post last month she was concerned Tesla was rolling out new features without addressing the board's prior safety recommendations, which included limiting the use of automated features to the conditions for which they were designed and developing better driver monitoring.

Some criticized Tesla for foisting another experiment on the public.

"They outsourced the testing to total nonprofessionals and fans, of course they're going to figure out how to hack the scoring system and cover up the AI's flaws," said Joshua Brian Irwin, attorney for Bernadette Saint Jean, whose husband was killed on the Long Island Expressway in July when a Tesla believed to be using automated features struck him on the side of the road.

Even if they don't gain access to the beta, Tesla drivers can still use the driver-assistance system called Autopilot, the software that navigates cars from highway on-ramp to off-ramp and can park and summon the vehicles to their owners. Full Self-Driving brings that suite to neighborhoods and city streets, allowing drivers to navigate their cars from Point A to Point B, ideally with no driver interventions.

The cars' Autopilot software acts differently than the typical driver might: it doesn't automatically react when someone has their blinker on to try to merge, for example, and won't let them in until it detects them moving into the lane. By the same token, Teslas in Autopilot mode tend to hug the side of the road in the passing lane, even if someone is trying to get through. Those habits may be commonplace in an increasingly automated driving future.

Tesla Model Y owner Peter Yu, an AI researcher based in Detroit, describes some of Autopilot's behaviors as "unsettling."

"It drives a little different than how a human would," he said. "It's not as defensive."

He described a situation he experiences "daily," where the car takes curves on the highway at full speed, even if the vehicle in the next lane is a semi-truck that might veer into his right of way.

"The car doesn't really take this into account," he said. "Basically it'll just say, 'hey I'm just gonna go.' ... It's just very uneasy for me."

Tesla does not require testing to gain access to its Autopilot software, which contains driver-assistance features that require the operator to pay attention at all times. There have been several fatal crashes while the system was activated, including one that killed Apple engineer Walter Huang in 2018 when his Tesla Model X veered into a concrete barrier. The NTSB cited overreliance on Autopilot and ineffective driver monitoring in the crash.

The National Highway Traffic Safety Administration is investigating Autopilot - the less advanced iteration of Tesla's driver-assistance software - over roughly a dozen crashes involving parked emergency vehicles while Autopilot was engaged.

Ultimately, Tesla wants to use Full Self-Driving to realize its ambition of putting a million robo-taxis on the road - bringing the long-promised autonomous future into reality. But the cars lack the hardware and sensor suites typically used by companies deploying autonomous cars. The systems are regarded merely as a further iteration of driver-assistance, which requires a driver to pay attention at all times.

And Tesla is keenly aware of the potential for incidents, including dangerous and high-profile crashes, that could implicate the software and set back its ambitions.

Now it is scoring drivers on five factors: hard braking, aggressive turning, unsafe following, forward collision warnings and deactivations of Autopilot, which can happen when drivers fail to indicate they are paying attention.
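
To make the mechanics concrete, the sketch below shows one way a composite score could be computed from those five factors. It is not Tesla's published formula: the factor names mirror the categories above, but the weights, the per-mileage normalization and the mapping to a 0-to-100 score are all invented for illustration.

```python
# Hypothetical illustration only: the weights and scoring math below are made up,
# not Tesla's actual Safety Score formula.
from dataclasses import dataclass

@dataclass
class TripStats:
    miles: float
    hard_braking_events: int           # decelerations above some threshold
    aggressive_turns: int              # lateral acceleration above some threshold
    unsafe_following_seconds: float    # time spent following too closely
    forward_collision_warnings: int
    forced_autopilot_disengagements: int

# Made-up per-factor weights; a real system would calibrate these against crash data.
WEIGHTS = {
    "hard_braking": 2.0,
    "aggressive_turning": 1.5,
    "unsafe_following": 1.0,
    "collision_warnings": 3.0,
    "autopilot_disengagements": 5.0,
}

def safety_score(trip: TripStats) -> float:
    """Map the five tracked behaviors to a 0-100 score (illustrative only)."""
    per_100_miles = 100.0 / max(trip.miles, 1.0)  # normalize events by distance driven
    penalty = (
        WEIGHTS["hard_braking"] * trip.hard_braking_events
        + WEIGHTS["aggressive_turning"] * trip.aggressive_turns
        + WEIGHTS["unsafe_following"] * (trip.unsafe_following_seconds / 60.0)
        + WEIGHTS["collision_warnings"] * trip.forward_collision_warnings
        + WEIGHTS["autopilot_disengagements"] * trip.forced_autopilot_disengagements
    ) * per_100_miles
    return max(0.0, 100.0 - penalty)

# Example: one hard-braking event and 30 seconds of close following over 250 miles.
print(safety_score(TripStats(miles=250, hard_braking_events=1, aggressive_turns=0,
                             unsafe_following_seconds=30, forward_collision_warnings=0,
                             forced_autopilot_disengagements=0)))  # prints 99.0
```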

But drivers say they often drive differently in their Teslas than in other cars. Tesla Model S owner Stefan Heck says his car delivers peak acceleration practically the moment he touches the throttle, unlike gas cars that have to spool up their engines and search for the right gear. That enables him to pass other drivers quickly, change lanes and catch a green light.

When he borrows his wife's Lexus hybrid, she says, "'you've gotta get out of your Tesla habits,'" said Heck, founder and CEO of a company called Nauto that provides software and hardware tools to help commercial drivers improve their safety.

Tesla's scoring metric, Heck said, is biased against owners in San Francisco and New York City, who have to make certain maneuvers in heavy traffic, for example, and less regularly encounter open roads.

"If I buy a Tesla in - let's say - Kansas and I buy a Tesla in Manhattan and I just go on the road in normal conditions, normal things I would do, I'm going to end up with a lower score in Manhattan than I would in Kansas," said Heck.

And it fails to seriously take into account two major predictors of crashes, he said: distractedness and speeding.

One former Tesla employee who worked on Autopilot, who has access to Full Self-Driving but spoke on the condition of anonymity because he is bound by a nondisclosure agreement governing the release of information on it, said the safety score has changed how he drives - for better or worse.

"In order to get a 100 you can't floor your gas pedal, you've got to turn your signal on when you change lanes, you can't follow too close," he said. "I have slowly stopped myself braking hard just to see if it will affect the score."

But the software, he said, too readily defines turns as "aggressive," especially in densely populated areas and on roads with hairpin turns, as opposed to wide, empty stretches of road.

Mohammad Musa, founder of Deepen AI, which aims to help companies launch safe driver-assistance and autonomous driving systems, says Tesla has the right idea - even if the score will need to be tweaked over time.

"It's a step in the right direction," said Musa. "The more that we can quantify the behavior and also even encourage the correct behavior from people - it's like gameifying the process. If you know that you're being scored, you're more likely to behave the right way."
