Yet another fatal Tesla Autopilot crash is under federal investigation
273
Crashes involving Tesla's Autopilot since July 2021.
A Tesla driver using the Autopilot driver-assist program crashed into a motorcycle in Utah this past weekend, killing the motorcyclist instantly. The National Highway Traffic Safety Administration (NHTSA) is already investigating the fatal crash, bringing the total number of federal Autopilot investigations up to 39.
If this headline sounds a bit too familiar, it’s not because you’re experiencing déjà vu — we’ve been here before, and not all that long ago. Just a few weeks ago, the NHTSA opened a special probe into a fatal Tesla crash in California that killed a pedestrian. In that case, too, the driver was reportedly using Autopilot.
As The Verge notes, the NHTSA’s Special Crash Investigations (SCI) program currently lists 48 crashes under investigation, 39 of which involve Teslas. Those crashes have killed 19 people. And still Tesla forges ahead with its plans to expand Autopilot into a fully autonomous driving system.
And that’s not even all of them — The 39 crashes being investigated are only a small portion of those involving Autopilot. According to the NHTSA, 273 crashes involving Teslas running Autopilot occurred between July 20, 2021, and May 21, 2022.
Yes, we know, car crashes happen every day — but Autopilot promises to make roads safer, and thus far it’s proving to do quite the opposite. A separate NHTSA defect investigation covers another 16 crashes in which Teslas struck stationary emergency vehicles; many of those happened at night, when Autopilot may have more trouble interpreting visual cues. That investigation was upgraded to “engineering analysis” status in June, a step that brings more scrutiny and can lead to a recall.
We’re just not ready for this — The NHTSA has yet to make any official judgments on Autopilot, but suffice it to say the sheer number of investigations is worrisome. It’s clear from the system’s many errors — some of them fatal — that Autopilot is not exactly road-ready.
Tesla Technoking Elon Musk doesn’t seem to think so, though. In fact, Musk is going a few steps further, pushing for the fully autonomous version of Autopilot (known as Full Self-Driving, or FSD) to be developed as quickly as possible. While FSD is still very much in beta and open to only a select group of Tesla enthusiasts, it’s already putting drivers’ and pedestrians’ lives in danger.
With dozens of federal crash investigations underway, you might think a car company would fix its existing technology before moving on to much more complex software. But this is Tesla we’re talking about.