We have trouble trusting machines. Ask a Tesla owner.

Is a Tesla safer with autopilot? That’s a challenging question to answer, because it’s not just about statistics — it’s about human psychology.

According to Tesla’s statistics for Q1 of 2021, per mile driven, a Tesla on Autopilot is about half as likely to be in an accident as one without Autopilot engaged. Here’s the relevant data:

In the 1st quarter, we registered one accident for every 4.19 million miles driven in which drivers had Autopilot engaged. For those driving without Autopilot but with our active safety features, we registered one accident for every 2.05 million miles driven. For those driving without Autopilot and without our active safety features, we registered one accident for every 978 thousand miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 484,000 miles.

As I’ve written before, the challenge with these statistics is that the bases don’t compare: you can only use Autopilot in highway driving, while only 17% of crashes occur on highways. Teslas are also generally newer than other cars on the road, which means you’re comparing Teslas to a bunch of cars without the most modern safety features.
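To make the base-rate problem concrete, here’s a minimal Python sketch using the figures quoted above. The 30% highway-miles share is an invented number for illustration only, not real data; the point is just to show how restricting Autopilot to (safer) highway miles inflates a naive per-mile comparison.

```python
# Figures quoted above, converted to accidents per million miles.
autopilot = 1 / 4.19      # Autopilot engaged (highway driving only)
no_autopilot = 1 / 2.05   # Tesla with active safety features, no Autopilot

# Naive comparison: Autopilot looks roughly twice as safe.
print(f"naive ratio: {no_autopilot / autopilot:.2f}x")      # ~2.04x

# But Autopilot miles are almost all highway miles, and highway miles
# are safer per mile to begin with. Suppose, purely hypothetically,
# that highways carry 30% of all miles while producing only 17% of
# crashes (the 17% is from the post; the 30% is an assumption).
highway_crash_share = 0.17
highway_miles_share = 0.30

# Relative risk of a highway mile vs. an average mile: ~0.57.
highway_risk = highway_crash_share / highway_miles_share

# Deflate the non-Autopilot rate to highway conditions before comparing.
fair_baseline = no_autopilot * highway_risk
print(f"adjusted ratio: {fair_baseline / autopilot:.2f}x")  # ~1.16x
```

Under those made-up assumptions, the apparent 2x advantage shrinks to about 1.2x. The real numbers may differ, but the direction of the bias is the point.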

But it’s very clear that Tesla’s Autopilot and other safety features prevent crashes that would otherwise take place. Let’s assume that at some point in the future, a legitimate study using comparable bases proves that, say, a Tesla on Autopilot is twice as safe as a car with similar safety features but without automated driving.

Does that mean we should just turn the roads over to cars with features like Autopilot?

The psychology of automated driving

People who drive cars do all sorts of stupid things. They drive with a hamburger in their hand. They read email and respond to text messages. They drive aggressively, swerving in and out of traffic like race car drivers. I just drove from Boston to Philadelphia, a distance of 300 miles, and I saw all of these things, along with the car shown in this post that had somehow crashed on the Delaware River Turnpike Toll Bridge and was burning brightly like a doomed gas station in a disaster movie. (A passenger took the photo; I was busy driving.)

But there are two reasons that Autopilot and features like it won’t go mainstream in the near future. Neither has to do with technology. Both have to do with psychology.

First, human beings, given power by safety technology, behave more recklessly. When we get better football helmets, we smash into each other with more force and don’t worry about our brains so much. When we get seat belts and airbags, we don’t worry so much about car crashes. And when we have Autopilot, even though we are warned to pay attention, our minds wander. We are more prone to do other things when we should be paying attention to the road. Every safety advance is accompanied by an increase in reckless behavior, which reduces the value of the safety advance.

And second, people aren’t rational. Look at the types of crashes Teslas on Autopilot get into. According to the New York Times, Teslas on Autopilot were involved in fatal crashes where they smashed into tractor-trailers crossing their paths or smacked into a concrete barrier. In each of these cases, the Tesla failed to brake for an obstacle that any normal driver would know to avoid.

Each of us believes we are a good driver. We believe we know our limitations: when, for example, we can safely take our eyes off the road to check our phones. And in a crash situation, we believe we will take whatever action is necessary to avoid the crash. We certainly know enough not to crash into a concrete barrier or a truck crossing the road.

This confidence is misguided, but it is human.

That is why, even if Autopilot is safer, people will trust themselves instead: they rate the risk of their own mistakes as lower than Autopilot’s, and Autopilot’s mistakes seem so obviously avoidable.

The Tesla’s crash avoidance features are great. Changing lanes is much easier in a Tesla, for example, since it warns you when it detects a car in a space you’re about to occupy. Other cars have similar features that can, for example, help keep you from straying out of your lane and warn you when a pedestrian is crossing your path.

I think people will welcome these features, just as they embraced the small miracle that is the backup camera. But we’re not ready to trust Autopilot, even when it gets good enough that we ought to. That’s just human nature.


3 Comments

  1. It really is about the wider question of trust in technology. With so much failure in our everyday devices (I truly wonder how most people get along when there is a glitch, but that is a topic for another time), it is no wonder most people are skeptical about autopiloted vehicles.

    This topic is timely, given the Virgin Galactic voyage yesterday. I noted that they emphasized at a couple of points that they had pilots for the Unity (rocket-driven) vehicle (Eve, the double-fuselage jet carrier, had two pilots as well).

    This is in significant contrast to Musk’s SpaceX approach. For Branson, always a marketer, it very likely comes from marketing insight into the same psychology you talk about.

    Branson is aiming at tourism. Musk is more about payload and cost efficiency. I found this piece, which raises that question:

    “SpaceShipTwo’s core business will be to take passengers into sub-orbital space, so of course having human pilots onboard makes sense as eventually there will be human passengers. Would you be prepared to board a commercial jet now if you knew there were no human pilots onboard? Knowing that the pilot’s own safety is twinned with that of the passengers means we feel safer with experts at the helm.”
    https://theconversation.com/spaceshiptwo-cost-a-life-so-why-do-we-still-use-human-test-pilots-33728

    In line with Musk’s aim, we will probably see a much higher adoption rate for cargo hauling than for passenger vehicles.

    Still, drones will eventually overtake cars as personal vehicles for much of our transport. I don’t trust humans to pilot that airspace, so we will all have to trust autopilot systems.

  2. My bias against automated driving comes not from UNfamiliarity with technology, but from 10 years of working in software development: I’ve never once seen a release without major bugs, and the stakes are too high and the margin of error too small on our busy highways.

  3. I wouldn’t go so far as to say “the Tesla crash avoidance features are great.” Not when their automated emergency braking system disregards stopped objects in front of it, as has been demonstrated repeatedly on the road, with fatal results. Because their radar detects only indistinct “blobs,” it would have a high false-alarm rate, triggering on billboards on a curve, overpasses, etc. So it ignores stationary objects as clutter. Unfortunately, the same applies to stopped fire engines and to tractor trailers pulling out at a 90-degree angle to the Tesla’s direction of motion (giving a longitudinal speed component of zero). There’s a sketch of this filtering logic after these comments.

    I agree that people do stupid things that partially or entirely offset safety benefits some of the time, and that people trust themselves more than machines. But that leaves out an important third component: humans get bored and are not particularly good at repetitive tasks. We’re even worse at staying continuously vigilant when nothing happens for long stretches. Tesla’s so-called self-driving system takes over routine driving but requires complete human driver vigilance and instant response in the cases where it falls short. Humans aren’t built that way, and it’s not a matter of carelessness or recklessness.

    Tesla has the “interesting” business model of letting its owners pay for the privilege of beta-testing safety-of-life features on the open road, making fellow travelers, whether in cars or trucks, on bikes, or on foot, unwitting subjects in these tests.

    P.S. While it is hard to get good comparative data, the most accurately collected data is for fatalities, and at least one analysis, albeit with limited data, shows that Teslas are significantly worse than other luxury cars in terms of driver death rates per year. https://medium.com/@MidwesternHedgi/teslas-driver-fatality-rate-is-more-than-triple-that-of-luxury-cars-and-likely-even-higher-433670ddde17
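Here’s a minimal Python sketch of the clutter-rejection logic described in comment 3. All names and the threshold are invented for illustration; this is not Tesla’s actual code, just the general idea behind filtering stationary radar returns. A radar measures each return’s speed relative to the car; a stationary object closes at exactly the car’s own speed, so its computed ground speed is near zero, and a naive filter drops exactly those returns.

```python
from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float        # distance to the detected "blob", in meters
    rel_speed_mps: float  # object's speed relative to our car along our
                          # direction of travel (negative = closing), m/s

def moving_targets(returns, ego_speed_mps, tol_mps=1.0):
    """Naive clutter filter (illustrative only): keep returns whose
    ground speed is meaningfully nonzero.

    ground speed = ego speed + relative speed; a stationary object
    closes on us at exactly -ego_speed, so its ground speed is ~0.
    """
    kept = []
    for r in returns:
        ground_speed = ego_speed_mps + r.rel_speed_mps
        if abs(ground_speed) > tol_mps:  # moving: keep it
            kept.append(r)
        # else: discarded as clutter -- an overpass or billboard, but
        # also a stopped fire engine, or a trailer crossing at 90
        # degrees, whose longitudinal speed component is also ~0.
    return kept

# At 30 m/s (~67 mph), a stopped truck 80 m ahead closes at -30 m/s
# and is filtered out; a slower car ahead (ground speed 25 m/s) stays.
returns = [RadarReturn(80.0, -30.0),  # stopped truck: dropped!
           RadarReturn(60.0, -5.0)]   # slower car: kept
print(moving_targets(returns, ego_speed_mps=30.0))
```

The failure mode the commenter describes falls out immediately: to this filter, a stopped fire engine and an overpass look identical.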