The media is breathlessly reporting that an inebriated Tesla worker died while his car was in “Full Self-Driving” beta mode, but Elon Musk says that just isn’t so.
Hans von Ohain, a Tesla employee, died in Colorado in 2022 while driving his car, which allegedly had FSD engaged. If the assertion is true, his would be the first known fatality caused by “self-driving” software. That’s a big “if.”
Referring to Electrek author Fred Lambert, Tesla Owners Silicon Valley posted on X, “Fred Lambert is spreading fake news. The person in question was three times above the legal limit of alcohol, which means you can’t intervene if necessary. Also, it hasn’t been confirmed whether FSD (Full Self-Driving) was engaged.”
Tesla founder and chief honcho Elon Musk clarified the incident even further on X, “He was not on FSD. The software had unfortunately never been downloaded. I say ‘unfortunately’, because the accident probably would not have happened if FSD had been engaged.”
The Washington Post broke the story that the employee was killed on May 16, 2022, when his Tesla Model 3 slammed into a tree and caught fire in Evergreen, Colorado. Von Ohain was a recruiter at Tesla and a devoted fan of Musk. A passenger in the car, Erik Rossiter, survived the crash.
When Rossiter called 911, he told dispatchers that von Ohain had activated an “auto-drive feature” on the car. He claimed the feature caused the car to veer off the road by itself and crash into the tree. Rossiter would later state in an interview with the Washington Post that he “believed” von Ohain had engaged Tesla’s “Full Self-Driving” feature when the crash occurred.
“Full Self-Driving is Tesla’s most advanced driver-assistance technology, designed to guide the vehicle on roads from quiet suburbs to busy cities with little input from the driver. Over 400,000 Tesla owners have access to the FSD software, which remains in ongoing beta testing,” Breitbart reported.
“If Rossiter’s account proves true, this would likely be the first known fatality involving Full Self-Driving. In late 2021, federal regulators began requiring automakers to report crashes involving driver-assistance systems. Since then, they have logged over 900 crashes in Tesla EVs, including at least 40 serious or fatal injuries. Most crashes involved Tesla’s simpler Autopilot system,” the outlet added.
Tesla Full Self-Driving Beta 12.1.2 Drives from SFO to Tesla San Francisco with Zero Interventions Watch in 4K: https://t.co/dYs1eLRkDc Raw 1x speed footage: https://t.co/txgDqpfICI pic.twitter.com/T2kqV71rLs — Whole Mars Catalog (@WholeMarsBlog) February 11, 2024
Per the police report on the accident, there were no skid marks on the road at the crash site, indicating the driver did not brake before smashing into the tree. The report also states that the car continued to power the wheels even after impact.
“An autopsy showed von Ohain had a blood alcohol level over three times the legal limit. Experts say this level of intoxication would have seriously hampered his ability to maintain control. However, the sophisticated self-driving capabilities von Ohain believed were engaged may have given him undue confidence in the car’s ability to correct itself,” Breitbart elaborated.
Von Ohain’s widow is pointing the finger at Tesla, according to Business Insider, “Nora Bass, Ohain’s widow, told the Post her husband believed in Elon Musk’s vision for the future of autonomous vehicles so much that he was willing to tolerate the ‘jerky’ experience to help improve Tesla’s self-driving technology.”
Tesla hasn’t claimed to have full self driving capabilities, you’re still required to be in control and mindful of your surroundings — Spadez.sol {369} (@RealAceSpadez) February 13, 2024
“But since Tesla has so far been silent about Ohain’s death, she told the outlet she felt she and her husband were ‘just guinea pigs’ ironing out the kinks in the tech with a false promise of safety,” Business Insider added.
There have been a number of recent recalls in which Tesla software needed to be updated. There have also reportedly been complaints that the FSD software causes sudden swerving or braking.
Lawsuits have been filed attempting to hold Tesla responsible if its technology causes crashes or fails to prevent them. The company contends that drivers should be responsible enough to stay in control of their vehicles and be alert to anything that could go wrong.
“Autopilot, Enhanced Autopilot, and Full Self-Driving Capability are intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment. While these features are designed to become more capable over time, the currently enabled features do not make the vehicle autonomous,” Tesla’s disclaimer concerning the beta program advises.