‘The car suddenly accelerated with our baby in it’: the terrifying truth about why Tesla’s cars keep crashing

It was a Monday afternoon in June 2023 when Rita Meier, 45, joined us for a video call. Meier told us about the last time she said goodbye to her husband, Stefan, five years earlier. He had been leaving their home near Lake Constance, Germany, heading for a trade fair in Milan.

Meier recalled how he hesitated between taking his Tesla Model S or her BMW. He had never driven the Tesla that far before. He checked the route for charging stations along the way and eventually decided to try it. Rita had a bad feeling. She stayed home with their three children, the youngest less than a year old.

At 3.18pm on 10 May 2018, Stefan Meier lost control of his Model S on the A2 motorway near the Monte Ceneri tunnel. Travelling at about 100km/h (62mph), he ploughed through several warning markers and traffic signs before crashing into a slanted guardrail. “The collision with the guardrail launches the vehicle into the air, where it flips several times before landing,” investigators would write later.

The car came to rest more than 70 metres away, on the opposite side of the road, leaving a trail of wreckage. According to witnesses, the Model S burst into flames while still airborne. Several passersby tried to open the doors and rescue the driver, but they couldn’t unlock the car. When they heard explosions and saw flames through the windows, they retreated. Even the firefighters, who arrived 20 minutes later, could do nothing but watch the Tesla burn.

At that moment, Rita Meier was unaware of the crash. She tried calling her husband, but he didn’t pick up. When he still hadn’t returned her call hours later – highly unusual for this devoted father – she tried to track his car using Tesla’s app. It no longer worked. By the time police officers rang her doorbell late that night, Meier was already bracing for the worst.

The crash made headlines the next morning as one of the first fatal Tesla accidents in Europe. Tesla released a statement to the press saying the company was “deeply saddened” by the incident, adding, “We are working to gather all the facts in this case and are fully cooperating with local authorities.”

To this day, Meier still doesn’t know why her husband died. She has kept everything the police gave her after their inconclusive investigation. The charred wreck of the Model S sits in a garage Meier rents specifically for that purpose. The scorched phone – which she had forensically analysed at her own expense, to no avail – sits in a drawer at home. Maybe someday all this will be needed again, she says. She hasn’t given up hope of uncovering the truth.


Rita Meier was one of many people who reached out to us after we began reporting on the Tesla Files – a cache of 23,000 leaked documents and 100 gigabytes of confidential data shared by an anonymous whistleblower. The first report we published looked at problems with Tesla’s autopilot system, which allows the cars to temporarily drive on their own, taking over steering, braking and acceleration. Though touted by the company as “Full Self-Driving” (FSD), it is designed to assist, not replace, the driver, who should keep their eyes on the road and be ready to intervene at any time.

Autonomous driving is the core promise around which Elon Musk has built his company. Tesla has never delivered a truly self-driving car, yet the richest person in the world keeps repeating the claim that his cars will soon drive entirely without human help. Is Tesla’s autopilot really as advanced as he says?

The Tesla Files suggest otherwise. They contain more than 2,400 customer complaints about unintended acceleration and more than 1,500 braking issues – 139 involving emergency braking without cause, and 383 phantom braking events triggered by false collision warnings. More than 1,000 crashes are documented. A separate spreadsheet on driver-assistance incidents where customers raised safety concerns lists more than 3,000 entries. The oldest date from 2015, the most recent from March 2022. In that time, Tesla delivered roughly 2.6m vehicles with autopilot software. Most incidents occurred in the US, but there have also been complaints from Europe and Asia. Customers described their cars suddenly accelerating or braking hard. Some escaped with a scare; others ended up in ditches, crashing into walls or colliding with oncoming vehicles. “After dropping my son off in his school parking lot, as I go to make a right-hand exit it lurches forward suddenly,” one complaint read. Another said, “My autopilot failed/malfunctioned this morning (car didn’t brake) and I almost rear-ended somebody at 65mph.” A third reported, “Today, while my wife was driving with our baby in the car, it suddenly accelerated out of nowhere.”

Braking for no reason caused just as much distress. “Our car just stopped on the highway. That was terrifying,” a Tesla driver wrote. Another complained, “Frequent phantom braking on two-lane highways. Makes the autopilot almost unusable.” Some report their car “jumped lanes unexpectedly”, causing them to hit a concrete barrier, or veered into oncoming traffic.

Musk has given the world many reasons to criticise him since he teamed up with Donald Trump. Many people do – largely by boycotting his products. But while it’s one thing to disagree with the political opinions of a business leader, it’s another to be mortally afraid of his products. In the Tesla Files, we found thousands of examples of why such fear may be justified.

‘My husband died in an unexplained accident. And nobody cared.’ Illustration: Carl Godfrey/The Guardian

We set out to match some of these incidents of autopilot errors with customers’ names. Like hundreds of other Tesla customers, Rita Meier entered the vehicle identification number of her husband’s Model S into the response form we published on the website of the German business newspaper Handelsblatt, for which we conducted our investigation. She quickly discovered that the Tesla Files contained data related to the car. In her first email to us, she wrote, “You can probably imagine what it felt like to read that.”

There isn’t much information – just an Excel spreadsheet titled “Incident Review”. A Tesla employee noted that the mileage counter on Stefan Meier’s car stood at 4,765 miles at the time of the crash. The entry was catalogued just one day after the fatal accident. In the comment field was written, “Vehicle involved in an accident.” The cause of the crash remains unknown to this day. In Tesla’s internal system, a company employee had marked the case as “resolved”, but for five years, Rita Meier had been searching for answers. After Stefan’s death, she took over the family business – a timber company with 200 employees based in Tettnang, Baden-Württemberg. As journalists, we are used to tough interviews, but this one was different. We had to strike a careful balance – between empathy and the persistent questioning good reporting demands. “Why are you convinced the Tesla was responsible for your husband’s death?” we asked her. “Isn’t it possible he was distracted – perhaps looking at his phone?”

No one knows for sure. But Meier was well aware that Musk has previously claimed Tesla “releases critical crash data affecting public safety immediately and always will”; that he has bragged many times about how its advanced handling of data sets the company apart from its competitors. In the case of her husband, why was she expected to believe there was no data?

Meier’s account was structured and precise. Only once did the toll become visible – when she described how her husband’s body burned in full view of the firefighters. Her eyes filled with tears and her voice cracked. She apologised, turning away. After she collected herself, she told us she has nothing left to gain – but also nothing to lose. That was why she had reached out to us. We promised to look into the case.


Rita Meier wasn’t the only widow to approach us. Upset customers, current and former employees, analysts and lawyers were sharing links to our reporting. Many of them contacted us. More than once, someone wrote that it was about time somebody stood up to Tesla – and to Elon Musk.

Meier, too, shared our articles and the callout form with others in her network – including people who, like her, had lost loved ones in Tesla crashes. One of them was Anke Schuster. Like Meier, she had lost her husband in a Tesla crash that defies explanation and had spent years chasing answers. And, like Meier, she had found her husband’s Model X listed in the Tesla Files. Once again, the incident was marked as resolved – with no indication of what that actually meant.

“My husband died in an unexplained and inexplicable accident,” Schuster wrote in her first email. Her dealings with police, prosecutors and insurance companies, she said, had been “hell”. No one seemed to understand how a Tesla works. “I lost my husband. His four daughters lost their father. And nobody ever cared.”

Her husband, Oliver, was a tech enthusiast, fascinated by Musk. A hotelier by trade, he owned no fewer than four Teslas. He loved the cars. She hated them – especially the autopilot. The way the software seemed to make decisions on its own never sat right with her. Now, she felt as if her instincts had been confirmed in the worst way.

Oliver Schuster was returning from a business meeting on 13 April 2021 when his black Model X veered off highway B194 between Loitz and Schönbeck in north-east Germany. It was 12.50pm when the car left the road and crashed into a tree. Schuster started to worry when her husband missed a scheduled bank appointment. She tried to track the vehicle but found no way to locate it. Even calling Tesla led nowhere. That evening, the police broke the news: after the crash her husband’s car had burst into flames. He had burned to death – with the fire brigade watching helplessly.

The crashes that killed Meier’s and Schuster’s husbands were almost three years apart but the parallels were chilling. We examined accident reports, eyewitness accounts, crash-site photos and correspondence with Tesla. In both cases, investigators had requested vehicle data from Tesla, and the company hadn’t provided it. In Meier’s case, Tesla staff claimed no data was available. In Schuster’s, they said there was no relevant data.

Over the next two years, we spoke with crash victims, grieving families and experts around the world. What we uncovered was an ominous black box – a system designed not only to collect and control every byte of customer data, but to safeguard Musk’s vision of autonomous driving. Critical information was sealed off from public scrutiny.


Elon Musk is a perfectionist with a tendency towards micromanagement. At Tesla, his whims seem to override every argument – even in matters of life and death. During our reporting, we came across the issue of door handles. On Teslas, they retract into the doors while the cars are being driven. The system depends on battery power. If an airbag deploys, the doors are supposed to unlock automatically and the handles extend – at least, that’s what the Model S manual says.

The idea for the sleek, futuristic design stems from Musk himself. He insisted on retractable handles, despite repeated warnings from engineers. Since 2018, they have been linked to at least four fatal accidents in Europe and the US, in which five people died.

In February 2024, we reported on a particularly tragic case: a fatal crash on a country road near Dobbrikow, in Brandenburg, Germany. Two 18-year-olds were killed when the Tesla they were in slammed into a tree and caught fire. First responders couldn’t open the doors because the handles were retracted. The teenagers burned to death in the back seat.

A court-appointed expert from Dekra, one of Germany’s leading testing authorities, later concluded that, given the retracted handles, the incident “qualifies as a malfunction”. According to the report, “the failure of the rear door handles to extend automatically must be considered a decisive factor” in the deaths. Had the system worked as intended, “it is assumed that rescuers might have been able to extract the two backseat passengers before the fire developed further”. Without what the report calls a “failure of this safety function”, the teenagers might have survived.

Our investigation made waves. The Kraftfahrt-Bundesamt, Germany’s federal motor transport authority, got involved and announced plans to coordinate with other regulatory bodies to revise international safety standards. Germany’s largest automobile club, ADAC, issued a public recommendation that Tesla drivers should carry emergency window hammers. In a statement, ADAC warned that retractable door handles could seriously hinder rescue efforts. Even experienced emergency responders, it said, may struggle to reach trapped passengers. Tesla shows no intention of changing the design.

That’s Musk. He prefers the sleek look of Teslas without handles, so he accepts the risk to his customers. His thinking, it seems, goes something like this: at some point, the engineers will figure out a technical fix. The same logic applies to his grander vision of autonomous driving: because Musk wants to be first, he lets customers test his unfinished Autopilot system on public roads. It’s a principle borrowed from the software world, where releasing apps in beta has long been standard practice. The more users, the more feedback and, over time – sometimes years – something stable emerges. Revenue and market share arrive much earlier. The motto: if you wait, you lose.

Musk has taken that mindset to the road. The world is his lab. Everyone else is part of the experiment.


By the end of 2023, we knew a lot about how Musk’s cars worked – but the way they handle data still felt like a black box. How is that data stored? At what moment does the onboard computer send it to Tesla’s servers? We talked to independent experts at the Technical University Berlin. Three PhD candidates – Christian Werling, Niclas Kühnapfel and Hans Niklas Jacob – made headlines for hacking Tesla’s autopilot hardware. A brief voltage drop on a circuit board turned out to be just enough to trick the system into opening up.

The security researchers uncovered what they called “Elon Mode” – a hidden setting in which the car drives fully autonomously, without requiring the driver to keep his hands on the wheel. They also managed to recover deleted data, including video footage recorded by a Tesla driver. And they traced exactly what data Tesla sends to its servers – and what it doesn’t.

The hackers explained that Tesla stores data in three places. First, on a memory card inside the onboard computer – essentially a running log of the vehicle’s digital brain. Second, on the event data recorder – a black box that captures a few seconds before and after a crash. And third, on Tesla’s servers, assuming the vehicle uploads them.

The researchers told us they had found an internal database embedded in the system – one built around so-called trigger events. If, for example, the airbag deploys or the car hits an obstacle, the system is designed to save a defined set of data to the black box – and transmit it to Tesla’s servers. Unless the vehicles were in a complete network dead zone, in both the Meier and Schuster cases, the cars should have recorded and transmitted that data.
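What such a trigger-event pipeline might look like in principle can be shown in a few lines of code. The sketch below is ours, not Tesla’s: the event names, buffer sizes and upload behaviour are assumptions, based only on the researchers’ description of a rolling pre-crash buffer that is written to the event data recorder and then transmitted when a network connection exists.

```python
# Illustrative sketch only - not Tesla's code. Event names, buffer sizes
# and upload behaviour are assumptions based on the researchers' description.
from collections import deque
from dataclasses import dataclass

@dataclass
class Frame:
    timestamp: float       # seconds
    speed_kmh: float
    accel_pedal_pct: float
    brake_pressed: bool

class TriggerEventLogger:
    def __init__(self, seconds_before=5, seconds_after=5, hz=10):
        # Rolling buffer that always holds the last few seconds of telemetry.
        self.pre = deque(maxlen=seconds_before * hz)
        self.post_frames = seconds_after * hz

    def record(self, frame):
        self.pre.append(frame)  # called continuously while driving

    def on_trigger(self, event_name, live_feed):
        # A trigger event (airbag deployment, obstacle impact) freezes the
        # pre-crash buffer and captures a short post-crash window.
        snapshot = list(self.pre)
        snapshot += [next(live_feed) for _ in range(self.post_frames)]
        self.write_to_edr(event_name, snapshot)
        if self.network_available():           # False only in a dead zone
            self.upload(event_name, snapshot)  # to the manufacturer's servers

    def write_to_edr(self, event_name, snapshot):
        print(f"EDR: saved {len(snapshot)} frames for '{event_name}'")

    def network_available(self):
        return True

    def upload(self, event_name, snapshot):
        print(f"uploaded {len(snapshot)} frames for '{event_name}'")

# Toy usage: a minute of driving, then an airbag deployment.
logger = TriggerEventLogger()
feed = (Frame(i / 10.0, 100.0, 0.0, False) for i in range(1_000))
for _ in range(600):
    logger.record(next(feed))
logger.on_trigger("airbag_deployment", feed)
```

Even in this toy form, the logic makes the researchers’ point: once a trigger fires, saving and transmitting the data is the default path, which is why, barring a complete dead zone, the Meier and Schuster cars should have left a trace on Tesla’s servers.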

‘Is the car driving erratically on its own normal? Yeah, that happens from time to time.’ Illustration: Carl Godfrey/The Guardian

Who in the company actually works with that data? We examined testimony from Tesla employees in court cases related to fatal crashes. They described how their departments operate. We cross-referenced their statements with entries in the Tesla Files. A pattern took shape: one team screens all crashes at a high level, forwarding them to specialists – some focused on autopilot, others on vehicle dynamics or road grip. There is also a group that steps in whenever authorities request crash data.

We compiled a list of employees relevant to our reporting. Some we tried to reach by email or phone. For others, we showed up at their homes. If they weren’t there, we left handwritten notes. No one wanted to talk.

We searched for other crashes. One involved Hans von Ohain, a 33-year-old Tesla employee from Evergreen, Colorado. On 16 May 2022, he crashed into a tree on his way home from a golf outing and the car burst into flames. Von Ohain died at the scene. His passenger survived and told police that von Ohain, who had been drinking, had activated Full Self-Driving. Tesla, however, said it couldn’t confirm whether the system was engaged – because no vehicle data was transmitted for the incident.

Then, in February 2024, Musk himself stepped in. The Tesla CEO claimed von Ohain had never downloaded the latest version of the software – so it couldn’t have caused the crash. Friends of von Ohain, however, told US media he had shown them the system. His passenger that day, who barely escaped with his life, told reporters that hours earlier the car had already driven erratically on its own. “The first time it happened, I was like, ‘Is that normal?’” he recalled asking von Ohain. “And he was like, ‘Yeah, that happens from time to time.’”

His account was bolstered by von Ohain’s widow, who explained to the media how overjoyed her husband had been to be working for Tesla. Reportedly, von Ohain received the Full Self-Driving system as a perk. His widow explained how he would use the system almost every time he got behind the wheel: “It was jerky, but we were like, that comes with the territory of new technology. We knew the technology had to learn, and we were willing to be part of that.”

The Colorado State Patrol investigated but closed the case without blaming Tesla. It reported that no usable data was recovered.


For a company that markets its cars as computers on wheels, Tesla’s claim that it had no data available in all these cases is surprising. Musk has long described Tesla vehicles as part of a collective neural network – machines that continuously learn from one another. Think of the Borg aliens from the Star Trek franchise. Musk envisions his cars, like the Borg, as a collective – operating as a hive mind, each vehicle linked to a unified consciousness.

When a journalist asked him in October 2015 what made Tesla’s driver-assistance system different, he replied, “The whole Tesla fleet operates as a network. When one car learns something, they all learn it. That’s beyond what other car companies are doing.” Every Tesla driver, he explained, becomes a kind of “expert trainer for how the autopilot should work”.

According to Musk, the eight cameras in every Tesla transmit more than 160bn video frames a day to the company’s servers. In its owner’s manual, Tesla states that its cars may collect even more: “analytics, road segment, diagnostic and vehicle usage data”, all sent to headquarters to improve product quality and features such as autopilot. The company claims it learns “from the experience of billions of miles that Tesla vehicles have driven”.

It is a powerful promise: a fleet of millions of cars, constantly feeding raw information into a gargantuan processing centre. Billions – trillions – of data points, all in service of one goal: making cars drive better and keeping drivers safe. At the beginning of this year, Musk got a chance to show the world what he meant.

On 1 January 2025, at 8.39am, a Tesla Cybertruck exploded outside the Trump International Hotel Las Vegas. The man behind the incident – US special forces veteran Matthew Livelsberger – had rented the vehicle, packed it with fireworks, gas canisters and grenades, and parked it in front of the building. Just before the explosion, he shot himself in the head with a .50 calibre Desert Eagle pistol. “This was not a terrorist attack, it was a wakeup call. Americans only pay attention to spectacles and violence,” Livelsberger wrote in a letter later found by authorities. “What better way to get my point across than a stunt with fireworks and explosives.”

The soldier miscalculated. Seven bystanders suffered minor injuries. The Cybertruck was destroyed, but not even the windows of the hotel shattered. Instead, with his final act, Livelsberger revealed something else entirely: just how far the arm of Tesla’s data apparatus can reach. “The whole Tesla senior team is investigating this matter right now,” Musk wrote on X just hours after the blast. “Will post more information as soon as we learn anything. We’ve never seen anything like this.”

Later that day, Musk posted again. Tesla had already analysed all relevant data – and was ready to offer conclusions. “We have now confirmed that the explosion was caused by very large fireworks and/or a bomb carried in the bed of the rented Cybertruck and is unrelated to the vehicle itself,” he wrote. “All vehicle telemetry was positive at the time of the explosion.”

Suddenly, Musk wasn’t just a CEO; he was an investigator. He instructed Tesla technicians to remotely unlock the scorched vehicle. He handed over internal footage captured up to the moment of detonation. The Tesla CEO had turned a suicide attack into a showcase of his superior technology.

Yet there were critics even in the moment of glory. “It shows the kind of sweeping surveillance going on,” warned David Choffnes, executive director of the Cybersecurity and Privacy Institute at Northeastern University in Boston, when contacted by a reporter. “When something bad happens, it’s helpful, but it’s a double-edged sword. Companies that collect this data can abuse it.”

‘In many crashes, investigators weren’t even aware that requesting data from Tesla was an option.’ Illustration: Carl Godfrey/The Guardian

There are other examples of what Tesla’s data collection makes possible. We found the case of David and Sheila Brown, who died in August 2020 when their Model 3 ran a red light at 114mph in Saratoga, California. Investigators managed to reconstruct every detail, thanks to Tesla’s vehicle data. It shows exactly when the Browns opened a door, unfastened a seatbelt, and how hard the driver pressed the accelerator – down to the millisecond, right up to the moment of impact. Over time, we found more cases, more detailed accident reports. The data undoubtedly is there – until it isn’t.

In many crashes when Teslas inexplicably veered off the road or hit stationary objects, investigators didn’t actually request data from the company. When we asked authorities why, there was often silence. Our impression was that many prosecutors and police officers weren’t even aware that asking was an option. In other cases, they acted only when pushed by victims’ families.

In the Meier case, Tesla told authorities, in a letter dated 25 June 2018, that the last complete set of vehicle data was transmitted nearly two weeks before the crash. The only data from the day of the accident was a “limited snapshot of vehicle parameters” – taken “approximately 50 minutes before the incident”. However, this snapshot “does not show anything in relation to the incident”. As for the black box, Tesla warned that the storage modules were likely destroyed, given the condition of the burned-out vehicle. Data transmission after a crash is possible, the company said – but in this case, it didn’t happen. In the end, investigators couldn’t even determine whether driver-assist systems were active at the time of the crash.

The Schuster case played out similarly. Prosecutors in Stralsund, Germany, were baffled. The road where the crash occurred is straight, the asphalt was dry and the weather at the time of the accident was clear. Anke Schuster kept urging the authorities to examine Tesla’s telemetry data.

When prosecutors did formally request the data recorded by Schuster’s car on the day of the crash, it took Tesla more than two weeks to respond – and when it did, the answer was both brief and bold. The company didn’t say there was no data. It said that there was “no relevant data”. The authorities’ response left us stunned. We expected prosecutors to push back – to tell Tesla that deciding what’s relevant is their job, not the company’s. But they didn’t. Instead, they closed the case.

The hackers from TU Berlin pointed us to a study by the Netherlands Forensic Institute, an independent division of the ministry of justice and security. In October 2021, the NFI published findings showing it had successfully accessed the onboard memories of all major Tesla models. The researchers compared their results with accident cases in which police had requested data from Tesla. Their conclusion was that while Tesla formally complied with those requests, it omitted large volumes of data that might have proved useful.

Tesla’s credibility took a further hit in a report released by the US National Highway Traffic Safety Administration in April 2024. The agency concluded that Tesla failed to adequately monitor whether drivers remain alert and ready to intervene while using its driver-assist systems. It reviewed 956 crashes, field data and customer communications, and pointed to “gaps in Tesla’s telematic data” that made it impossible to determine how often autopilot was active during crashes. If a vehicle’s antenna was damaged or it crashed in an area without network coverage, even serious accidents often went unreported. Tesla’s internal statistics include only those crashes in which an airbag or other pyrotechnic system deployed – something that occurs in just 18% of police-reported cases. If only about one crash in five triggers a deployment, the true total is roughly five times what Tesla’s figures capture – meaning the actual accident rate is significantly higher than Tesla discloses to customers and investors.

There’s more. Two years prior, the NHTSA had flagged something strange – something suspicious. In a separate report, it documented 16 cases in which Tesla vehicles crashed into stationary emergency vehicles. In each, autopilot disengaged “less than one second before impact” – far too little time for the driver to react. Critics warn that this behaviour could allow Tesla to argue in court that autopilot was not active at the moment of impact, potentially dodging responsibility.

The YouTuber Mark Rober, a former engineer at Nasa, replicated this behaviour in an experiment on 15 March 2025. He simulated a range of hazardous situations, in which the Model Y performed significantly worse than a competing vehicle. The Tesla repeatedly ran over a crash-test dummy without braking. The video went viral, amassing more than 14m views within a few days.

The real shock came after the experiment. Fred Lambert, who writes for the blog Electrek, pointed out the same autopilot disengagement that the NHTSA had documented. “Autopilot appears to automatically disengage a fraction of a second before the impact as the crash becomes inevitable,” Lambert noted.

And so the doubts about Tesla’s integrity pile up. In the Tesla Files, we found emails and reports from a UK-based engineer who led Tesla’s Safety Incident Investigation programme, overseeing the company’s most sensitive crash cases. His internal memos reveal that Tesla deliberately limited documentation of particular issues to avoid the risk of this information being requested under subpoena. Although he pushed for clearer protocols and better internal processes, US leadership resisted – explicitly driven by fears of legal exposure.

We contacted Tesla multiple times with questions about the company’s data practices. We asked about the Meier and Schuster cases – and what it means when fatal crashes are marked “resolved” in Tesla’s internal system. We asked the company to respond to criticism from the US traffic authority and to the findings of Dutch forensic investigators. We also asked why Tesla doesn’t simply publish crash data, as Musk once promised to do, and whether the company considers it appropriate to withhold information from potential US court orders. Tesla has not responded to any of our questions.

Elon Musk boasts about the vast amount of data his cars generate – data that, he claims, will not only improve Tesla’s entire fleet but also revolutionise road traffic. But, as we have witnessed repeatedly in the most critical of cases, Tesla refuses to share it.

Tesla’s handling of crash data affects even those who never wanted anything to do with the company. Every road user trusts the car in front, behind or beside them not to be a threat. Does that trust still stand when the car is driving itself?

Internally, we called our investigation into Tesla’s crash data Black Box. At first, because it dealt with the physical data units built into the vehicles – so-called black boxes. But the devices Tesla installs hardly deserve the name. Unlike the flight recorders used in aviation, they’re not fireproof – and in many of the cases we examined, they proved useless.

Over time, we came to see that the name held a second meaning. A black box, in common parlance, is something closed to the outside. Something opaque. Unknowable. And while we’ve gained some insight into Tesla as a company, its handling of crash data remains just that: a black box. Only Tesla knows how Elon Musk’s vehicles really work. Yet today, more than 5m of them share our roads.

Some names have been changed.

This is an edited extract from The Tesla Files by Sönke Iwersen and Michael Verfürden, published on 24 July by Penguin Michael Joseph at £22. To support the Guardian, order your copy at guardianbookshop.com. Delivery charges may apply.
