Beta Tested on the Dead
Tesla’s Autopilot lies, the courtroom reckoning, and Elon Musk’s decade-long FSD fraud in the fast lane
Let’s say the quiet part out loud: Tesla didn’t just screw up; the company covered up, lied, obstructed, and then got caught.
This week, in a courtroom that Elon Musk couldn’t buy, bully, or bullshit into submission, a jury unanimously found Tesla liable in a wrongful death case involving Autopilot. And not just liable in the hand-waving, legalese, “well, everyone’s a little at fault” sort of way. No, they found Tesla had 33% of the blood on its hands. A full one-third. The other two-thirds? Sure, the driver admitted to being distracted. But the jury didn’t buy Tesla’s desperate PR fantasy that Autopilot was just a helpful little assistant unfairly blamed by ungrateful humans. They saw the receipts, and oh, what a pile of receipts it was.
Turns out, Tesla had the crash data within three minutes of the fatal impact. Not just a log or a timestamp. The whole thing: video, CAN-bus logs, sensor inputs, Autopilot decision-making data, everything. The car sent it to the “Mothership,” Tesla’s internal server farm, and promptly deleted the local copy. And then? Tesla pretended it didn’t exist.
They told the cops there was no such data. They told the family there was no such data. They wrote it in their legal filings. They lied to everyone, repeatedly, for years.
And when the Florida Highway Patrol homicide investigator asked Tesla for help, Tesla’s lawyer scripted the officer’s evidence request letter to deliberately exclude the data the company already had. In return, Tesla coughed up infotainment logs and a user manual. No Autopilot data. Nothing that could actually explain why a daughter died that night.
Then came the theater. Tesla lured the investigator, onboard computers in hand, to a service center in Coral Gables, only to claim the data was “corrupted.” Their technician testified he never powered up the Autopilot ECU. But forensic engineers later proved it had been powered up and accessed, and that the data was very much alive. Tesla just didn’t want to share it.
They even invented an “auto-delete” feature that doesn’t exist to explain away why the data couldn’t be recovered. Which is like burning down the evidence locker and blaming the janitor.
Finally, in 2025, six years and multiple legal beatdowns later, the plaintiffs’ forensic experts obtained a bit-for-bit clone of the Autopilot ECU. And lo and behold, there it was: the full snapshot, intact with file names, checksums, server paths. Logs showing Tesla received and stored the file just three minutes after the crash. The same file they swore didn’t exist. The same file they used for their own internal crash analysis.
And what did the forbidden data show?
Autopilot was active.
Autosteer was in control.
No warning was issued despite a T-intersection and a stationary vehicle ahead.
The system knew it was in a “restricted” Autosteer zone and let Autopilot keep going anyway.
All while Tesla claimed the system was only intended for highway use and was totally not at fault when someone was killed while it was being used off-highway. Never mind that the car knew it wasn’t supposed to be in Autopilot there. Never mind that the NTSB literally warned Tesla to geofence these features years ago. Tesla decided avoiding bad press was more important than preventing a preventable death.
Let’s be crystal clear: this wasn’t about whether Autopilot caused the crash. This was about whether Tesla made it worse. Whether it allowed misuse it knew would happen. Whether it withheld critical information from police and grieving families. The jury said yes.
So to the Tesla fans foaming at the mouth that this is all “unfair,” sit down and shut up. The driver admitted fault. He settled early. This case was about Tesla’s responsibility. And the jury, you know, the people who actually saw all the evidence, said Tesla’s share of blame was real.
And that’s what should scare the hell out of the company. Because now the precedent is set. Autopilot misuse is no longer just the driver’s problem. It’s Tesla’s too. Every time they fail to geofence. Every time they mislead the public with over-hyped safety claims. Every time they act like “hands on wheel” is a sufficient fail-safe while ignoring their own internal maps.
This is what happens when you build your stock price on science fiction. For over a decade, Elon Musk has lied through his capped teeth about Tesla’s Full Self-Driving capabilities, promising coast-to-coast autonomous demos that never happen, claiming millions of robotaxis would be on the road by now, and selling “FSD” packages to consumers that function more like expensive beta tests than actual autonomy. His latest Robotaxi rollout? Riddled with system faults, phantom braking, map failures, and a user interface so buggy it makes Windows Vista look like the Apollo Guidance Computer. But the hype keeps the stock afloat. And now, people are dying while Musk giggles on livestream from the front seat of a car that doesn’t know how to stop at an intersection.
It’s Dieselgate with a body count, and a $243 million price tag. That’s what 33% of the blame looks like when Tesla gets caught misleading the world about a system that cost a life, and then withholding the evidence.
You want autonomous driving to be taken seriously? Start by telling the truth. Until then, Tesla is just an overhyped software company doing vehicular manslaughter in beta, and now, finally, paying for it.
Full disclosure: I own a Tesla Model 3. Musk may have sullied the brand’s image, but that doesn’t mean it’s not a great car. I ran Full Self-Driving (supervised) for a year. Amazing, but not perfect. It watches you constantly, flashes warnings, and updates often. Someday it could be nearly flawless and save millions of lives. I dropped it a few months ago, not because it’s bad, but because it’s not that useful unless you’re doing long freeway trips.
Tesla’s alleged crash-data suppression isn’t just bad optics — it’s a hole in the very feedback loop that makes dangerous systems safer. Aviation works because every mishap gets the black-box treatment, findings go public, fixes get made. If Tesla really hid or massaged Autopilot crash info, that’s like an airline losing the black box to protect the stock price. You can’t fix what you pretend never happened, and you can’t expect trust if you hide the data.
Now, a smaller but relatable case: a Substack “author” who delivers a polished, thousand-word political column before most people have coffee. Same beats every day, perfect paragraphs, evenly spaced metaphors. Most couldn’t keep that up without a newsroom… or something else. And here’s where it rhymes with Tesla: it’s not the tool that’s the problem; it’s the non-disclosure. Tesla allegedly hid the data; the writer hides the method. Maybe the writer is just that good, or maybe there’s a robot ghostwriter in the back room with perfect pacing and an equally active billing system.
Just my 2¢