Is a Tesla safer with autopilot? That’s a challenging question to answer, because it’s not just about statistics — it’s about human psychology.
In the first quarter, we registered one accident for every 4.19 million miles driven in which drivers had Autopilot engaged. For those driving without Autopilot but with our active safety features, we registered one accident for every 2.05 million miles driven. For those driving without Autopilot and without our active safety features, we registered one accident for every 978,000 miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 484,000 miles.
As I’ve written before, the challenge with these statistics is that the bases don’t compare: you can only use Autopilot in highway driving, while only 17% of crashes occur on highways. Teslas are also generally newer than other cars on the road, which means you’re comparing Teslas to a bunch of cars without the most modern safety features.
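To see how much the mismatched bases matter, here is a back-of-the-envelope sketch in Python. The figures are the ones quoted above; the dictionary labels are my own, and the "naive ratio" is only what a raw comparison implies before adjusting for highway-only Autopilot miles versus all-road national miles:

```python
# Miles driven per crash, as quoted in the post (higher is better).
miles_per_crash = {
    "autopilot": 4_190_000,           # Tesla, Autopilot engaged
    "active_safety_only": 2_050_000,  # Tesla, no Autopilot, active safety on
    "no_safety_features": 978_000,    # Tesla, neither
    "us_average": 484_000,            # NHTSA national average, all vehicles
}

# Crashes per million miles (lower is better).
rates = {name: 1_000_000 / miles for name, miles in miles_per_crash.items()}

# The naive headline ratio: Autopilot looks roughly 8.7x safer than the
# US average...
naive_ratio = miles_per_crash["autopilot"] / miles_per_crash["us_average"]

# ...but that compares highway-heavy Autopilot miles against miles on all
# road types, so it overstates Autopilot's advantage.
print(f"Naive Autopilot vs. US average: {naive_ratio:.1f}x")
```

The point of the sketch is not the exact number but the shape of the error: the numerator and denominator are drawn from different driving populations, so the ratio is not a like-for-like safety comparison.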
But it’s very clear that Tesla’s Autopilot and other safety features prevent crashes that would otherwise take place. Let’s assume that at some point in the future, we see a legitimate study that uses comparable bases and proves that, say, a Tesla on Autopilot is twice as safe as a car with similar safety features but without automated driving.
Does that mean we should just turn the roads over to cars with features like Autopilot?
The psychology of automated driving
People who drive cars do all sorts of stupid things. They drive with a hamburger in their hand. They read email and respond to text messages. They drive aggressively, swerving in and out of traffic like a race car driver. I just drove from Boston to Philadelphia, a distance of 300 miles, and I saw all of these things — along with the car shown in this post that had somehow crashed on the Delaware River Turnpike Toll Bridge and was burning brightly like a doomed gas station in a disaster movie. (A passenger took the photo; I was busy driving.)
But there are two reasons that Autopilot and features like it won’t go mainstream in the near future. Neither has to do with technology. Both have to do with psychology.
First, human beings, given power by safety technology, behave more recklessly. When we get better football helmets, we smash into each other with more force and don’t worry about our brains so much. When we get seat belts and airbags, we don’t worry so much about car crashes. And when we have Autopilot — even though we are warned to pay attention — our minds wander. We are more prone to do other things when we should be paying attention to the road. Every safety advance is accompanied by an increase in reckless behavior, which reduces the value of the safety advance.
And second, people aren’t rational. Look at the types of crashes Teslas on Autopilot get into. According to the New York Times, Teslas on Autopilot were involved in fatal crashes where they smashed into tractor-trailers crossing their paths or smacked into a concrete barrier. In each of these cases, the Tesla failed to brake for an obstacle that any normal driver would know to avoid.
Each of us believes we are a good driver. We believe we know our limitations: when, for example, we can safely take our eyes off the road to check our phones. And in a crash situation, we believe we will take whatever action is necessary to avoid the crash. We certainly know enough not to crash into a concrete barrier or a truck crossing the road.
This confidence is misguided, but it is human.
That is why, even if Autopilot is safer, people will trust themselves instead: they rate the risk of their own mistakes lower than the risk of Autopilot’s, and Autopilot’s mistakes seem so obviously avoidable.
Tesla’s crash avoidance features are great. Changing lanes is much easier in a Tesla, for example, since it warns you when it detects a car in a space you’re about to occupy. Other cars have similar features that can, for example, help keep you from straying out of your lane and warn you when a pedestrian is crossing your path.
I think people will welcome these features, just as they embraced the small miracle that is the backup camera. But we’re not ready to trust Autopilot, even when it gets good enough that we ought to. That’s just human nature.