Remain seated
Nobody was in the driver's seat of a Tesla that crashed, killing two
The 2019 Model S was traveling at high speed when it crashed into a tree.
Two men were killed in Texas on Saturday night after the Tesla Model S they were riding in swerved off the road and crashed into a tree. Neither was in the driver’s seat as the car went around a curve “at a high rate of speed,” according to authorities. It took firefighters more than four hours to put out a battery fire caused by the impact.
That the two felt confident enough to leave the driver’s seat completely unoccupied is stunningly brash, but it’s not unheard of among Tesla owners. Bold claims by CEO Elon Musk have instilled faith in his most avid followers, who have tested the limits of Tesla’s Autopilot driver assistance feature, including by taking naps and playing video games as their cars barrel down roadways.
An official report indicates that the men’s wives watched them leave in the vehicle, and that the men had been talking about the Autopilot feature. The National Highway Traffic Safety Administration (NHTSA) recently opened dozens of investigations into crashes that may have involved Autopilot. It’s not clear whether the car in this instance had Autopilot engaged.
Driver assistance — Musk has said that, sometime in the near future, Autopilot will allow Tesla’s vehicles to operate without human intervention. The Full Self-Driving (FSD) package released last year lets owners unlock more advanced autonomous capabilities, though its reliability is questionable.
The FSD software includes a system warning that it could “do the worst thing at the worst time,” but Musk’s own confidence has undercut that warning, making it read less like a genuine caution and more like a way for Tesla to shield itself from liability.
“FSD beta build V8.1 normally drives me around with no interventions,” Musk wrote on Twitter last month. “Next version is a big step change beyond that. Tesla is solving a major part of real-world AI. This is not widely known.” Recent videos, however, have shown FSD struggling on real-world city streets, where the software contends with unpredictable lane markings and pedestrian behavior.
Watch the road — Musk admitted in an interview late last year that the biggest problem with Autopilot is drivers becoming too complacent as the software improves. “There is a dangerous transition point,” he said. “Where self-driving is good, but it occasionally has issues, because people maybe get too comfortable, and then they stop paying attention like they should. And then 99.9% of the time, it's good, 1 in 1,000 times it's not.” Tesla has drawn criticism for releasing its self-driving technology to consumers before it’s ready for prime time, effectively using customers as beta testers.
Musk believes FSD could reach Level 5 autonomy (which requires no human attention) within the next year, though he has repeatedly been forced to push back that timeline over the years.