In an edited extract from their new book, The Tesla Files, published in The Guardian, Sönke Iwersen and Michael Verfürden examine concerns surrounding Tesla’s vehicle safety and data practices. The article highlights alarming incidents and raises questions about why Tesla has, in some cases, withheld crash data.
The report opens with the tragic account of Stefan Meier, who died in a Tesla Model S crash in 2018. His car caught fire and the doors reportedly could not be opened by rescuers. His widow, Rita Meier, remains without answers, despite a police investigation. Her husband’s vehicle data, though listed as “resolved” internally by Tesla, was not provided to authorities.
This incident, along with others, prompted the authors to investigate the “Tesla Files” – 23,000 leaked documents and 100 gigabytes of confidential data. These files, from an anonymous whistleblower, reveal over 2,400 customer complaints about unintended acceleration and more than 1,500 braking issues, including 139 instances of emergency braking without cause and 383 phantom braking events. Over 1,000 crashes are documented, alongside 3,000 driver-assistance incidents raising safety concerns.
The article details similar cases, such as that of Anke Schuster, whose husband also died in a Tesla crash and whose vehicle data was not provided to investigators. These accounts contrast with Tesla CEO Elon Musk’s claims that the company immediately releases critical crash data, and with its supposedly superior data handling.
The Guardian piece also raises concerns about Tesla’s retractable door handles, a design feature reportedly linked to at least four fatal accidents in Europe and the US since 2018. In one German case, a court-appointed expert concluded that the failure of rear door handles to extend was a “decisive factor” in two teenage deaths. Despite this, Tesla reportedly shows no intention of altering the design.
Researchers from TU Berlin, who hacked Tesla’s Autopilot hardware, discovered “Elon Mode”, a hidden setting for fully autonomous driving without driver supervision. They also found that while Tesla collects vast amounts of data, it may omit significant portions when responding to official requests, as indicated by a Netherlands Forensic Institute study. A US National Highway Traffic Safety Administration (NHTSA) report in April 2024 also highlighted “gaps in Tesla’s telematic data” and noted that Autopilot often disengages just before impact, which critics suggest could allow Tesla to avoid responsibility.
Despite repeated attempts to contact Tesla with questions about their data practices and specific incidents, the company has not provided responses. The investigation concludes that Tesla’s handling of crash data remains a “black box,” raising questions about safety on roads shared with over 5 million Tesla vehicles.
Source: The Guardian
Epicyclic
The situation this article describes is unacceptable by any standards. If a company wishes to produce self-driving vehicles or vehicles with certain advanced driver aids, then there must be in place a worldwide legal framework that requires the company to: a) use only open-source software; b) disclose all of the source code within the product (complete with comments) in a form that third parties can modify and recompile from scratch; c) fully disclose all relevant data where collisions or other incidents have occurred; and d) provide owners with the facility to extract and analyse data from their own vehicles using free-of-charge open-source tools. Only once such measures are in place can public trust and acceptance of the technology be reasonably expected.