Will Lockett


Photo by Vlad Tchompalov on Unsplash

Tesla Is In Trouble

A whistleblower raises further questions about Musk’s handling of Tesla’s self-driving AI.

Lukasz Krupski has had a hell of a year, all thanks to Tesla. Krupski initially loved the company; he even sustained severe burns to his hands when, as a Tesla employee, he single-handedly stopped a fire at a Tesla delivery centre in Oslo, Norway. This heroism didn’t go unnoticed: Elon Musk tweeted him personally, saying, “Congratulations for saving the day!” But the pivotal moment came when Krupski replied to Musk’s tweet, complaining of a lack of safety measures at the centre, such as missing fire extinguishers. Musk welcomed the criticism on Twitter, but Krupski claims his supervisors at Tesla told him he had no future there because of his comments, while another employee threatened to stab him in the back with a screwdriver over his safety concerns.

In 2022, Krupski was fired, officially for negatively influencing other staff; he maintains he was dismissed for raising safety concerns about Tesla’s Autopilot driver-assistance software. Then, in May of this year, he got his revenge by leaking 100 GB of sensitive Tesla data to the German newspaper Handelsblatt, exposing a horrific lack of safety at Tesla, not towards its employees, but towards its loyal customers! The leak corroborates the vast mountain of evidence that Tesla’s self-driving push is deeply, deeply flawed, both technologically and morally.

The data Krupski leaked consisted mainly of customer complaints about Tesla’s self-driving software. These complaints revolved around unusual actions the software took, such as randomly braking in response to non-existent obstacles, known as “phantom braking.” When I test-drove a Tesla a few weeks ago, I actually experienced such an event for myself. Indeed, Krupski says Tesla employees spoke to him about this very behaviour in customer vehicles.


Krupski tried to raise his concerns about this obviously dangerous flaw internally, but he was ignored. Feeling compelled to act, he shared what he had found with data protection authorities, which is how Handelsblatt came to publish the files in May.

In a recent interview with the BBC, Krupski said, “I don’t think the hardware is ready and the software is ready.” You may well agree with him once you see the evidence laid out below. He went on to say that the problem “affects all of us because we are essentially experiments in public roads. So even if you don’t have a Tesla, your children still walk in the footpath.” This is a poignant concern, raised by many who have noticed that Tesla uses its paying customers and the unwilling public as beta testers for its potentially fatal software.

These customer complaints might not seem like a huge deal at face value. But when you set them in context alongside the vast amount of other damning evidence against Tesla’s self-driving systems, and against how Musk has handled them, things look utterly dystopian. I mean, we have government bodies investigating accidents reportedly caused by Tesla’s self-driving systems that have killed drivers and even disembowelled a child. There is even solid evidence that Musk went against his engineering team’s advice and actively made Tesla’s self-driving systems worse to save costs!

Let’s start with the Washington Post report.

You see, back in 2021, Tesla made a controversial decision: it stripped its cars of their radar and ultrasonic sensors, leaving them with only a camera array to sense the world around them (an approach Musk calls vision-only). Before that, these sensors were a vital part of Tesla’s self-driving systems, as they gave clarity and redundancy to the AI’s ability to understand the world around it. After all, if the cameras were obscured by dirt or couldn’t render a usable image due to poor lighting, the radar and ultrasound could enable the AI to continue to drive the car safely. Moreover, radar and ultrasound data are far easier for the AI to interpret, enabling more accurate and safer driving.

This has kneecapped Tesla’s standing in the autonomous vehicle race. Mercedes, for example, already offers a level 3 autonomous driving system that lets you (legally) periodically take your attention off the road and your hands off the wheel. Tesla is still stuck at level 2, meaning you must pay attention to the road at all times, and legally, it can’t offer a level 3 system yet.

A few months ago, the Washington Post published a report that lifted the lid on what was happening behind the scenes during this switch to vision-only self-driving, and what it found was far from ideal. According to the report, Musk overruled a significant number of Tesla engineers who warned him that a vision-only system would be problematic and possibly unsafe, with a high risk of increasing the rate of accidents. His own team knew their systems weren’t up to the task, but Musk believed he knew better than the industry experts who had helped propel Tesla to the forefront of autonomous technology, and he ploughed on with this egocentric, counterproductive plan. He even disabled the sensors in older models, so that pretty much the entire Tesla fleet went vision-only.

The Post interviewed nearly a dozen former employees, test drivers, safety officials, and other experts, who all reported an increase in crashes, near-misses, and other embarrassing mistakes by Tesla vehicles deprived of their critical sensors. The report also found that Musk rushed the release of FSD (Full Self-Driving) before it was ready and that, according to former Tesla employees, even today the software isn’t safe for public road use. In fact, a former test operator went on record saying that the company is “nowhere close” to having a finished product.

We will come to why Musk made this controversial and deeply morally corrupt decision in a minute. For now, let’s look at the ramifications of this.

The US Department of Justice (DoJ) is currently investigating Tesla over a series of accidents, some fatal, that occurred while their autonomous software was engaged. Many of these accidents happened after Tesla went vision-only. But the sensor change isn’t the only factor: in the DoJ’s eyes, Tesla’s marketing and communication departments sold the software as a fully autonomous system, which is far from the truth. As a result, some consumers used it as such, with tragic results.

These accidents were horrific, killing several people and severely injuring others, even children. Many of the survivors report anomalous driving from the AI, similar to the phantom braking described by Krupski. Some even report that the self-driving AI straight-up veered off the road and drove them into a bridge at full speed.

This would be horrific on its own, but numerous pieces of marketing and communication from Musk and Tesla painted their self-driving capabilities as fully autonomous and safe to use as such. In the DoJ’s eyes, this is dangerously false advertising that has potentially led to the deaths of several people who thought they could rely entirely on the system to drive itself. There is even an argument that it amounts to corporate negligence or even manslaughter.

Tesla itself has even recognised that this investigation could land them in some very hot water. In a quarterly filing to the Securities and Exchange Commission, they confirmed that their automated driving software is the subject of a U.S. Justice Department probe. This is the first time Tesla or Musk has even acknowledged this investigation, which has been going on for over a year now. In the filing, Tesla stated, “The company has received requests for information, including subpoenas, from the DoJ. These have included requests for documents related to Tesla’s Autopilot and FSD features.” Now, Tesla did not disclose what data it sent to the DoJ, but did state that “Should the government decide to pursue an enforcement action, there exists the possibility of a material adverse impact on our business.”

That last part should tell us what is at stake here for Tesla and why Musk has handled it so poorly.

You see, Musk has gone on record saying that their self-driving AI is why Tesla is valued so highly. He needs that sky-high valuation not only to keep Tesla afloat and growing, but also to fund his many other side projects, such as SpaceX and Twitter, by using his Tesla stock as collateral on billion-dollar loans. The other major driver of Tesla’s utterly massive valuation is its profit margin, which peaked at around 30% in Q4 of 2021 (right after the sensors were removed), well over twice the auto-industry standard.

Now, I can’t be sure of the exact motivations behind Musk and Tesla’s actions here, but there is a reasonably solid trail of circumstantial evidence. In 2021, many of Tesla’s rivals started to not only match but exceed Tesla on price-to-performance in the EV race (the Hyundai Ioniq 5, for example), and a price war was looming on the horizon. But Musk couldn’t risk sacrificing his profit margins too much, as that would tank the valuation. So he made Tesla’s self-driving systems vision-only, as dramatically reducing the number of sensors would boost margins by thousands per vehicle, even though he knew it would significantly worsen Tesla’s self-driving ability. This allowed Tesla to drop their prices and remain super competitive while protecting their profit margin and the company’s valuation.

However, Musk couldn’t let on that the system’s ability and safety had been compromised. So he and Tesla doubled down on their “fully autonomous” rhetoric, hoping this blatant propaganda would keep the perceived market value of their self-driving systems inflated and, in turn, the company valuation inflated as well. This led to people misusing the now even more dangerous system, leading to a series of deadly accidents that sparked the DoJ investigation, which Musk is trying to keep hush-hush.

I mean, the fact that Musk and Tesla have only acknowledged the DoJ investigation in the small print of a filing to the Securities and Exchange Commission shows they want to utterly bury this scandal while also playing it safe. After all, if the DoJ charges Musk and Tesla (which is still very much a possibility), the perceived value of their self-driving system tanks, and so does the company valuation. Investors would be furious that Tesla didn’t inform them of the probe and could sue. But thanks to this filing, Tesla and Musk are safe from such legal attacks should the worst happen.

I must say, it is highly hypocritical that Musk and Tesla would quietly take legal cover (the Securities and Exchange Commission filing) to keep Tesla safe from well-founded lawsuits, while happily compromising the safety of not only their customers but also the general public, all seemingly to keep the company’s valuation inflated. We have to accept that the hyper-capitalist society we have built has enabled Musk’s egotistical and Machiavellian nature to create this morally corrupt situation. I love the fact that Tesla was able to kick-start the EV revolution and challenge the auto industry’s planet-wrecking status quo, and I wholeheartedly think we should accept and embrace that side of capitalism and free markets. But we should also have laws and systems in place to protect ourselves from its dangerously insidious side too.

Thanks for reading! Content like this doesn’t happen without your support. If you want to support my work, or read articles early, follow me and my project Planet Earth & Beyond, or follow me on Bluesky or X.

(Originally published on PlanetEarthAndBeyond.co)

Sources: The Independent, Will Lockett, Will Lockett, Futurism, BBC, Planet Earth & Beyond, Carscoops, IBD
