Scrapping Tesla Self-Driving Crash Reporting Rules Gives Elon Musk the Licence to Kill

Surveillance footage of a Tesla crash on the Bay Bridge

By Ben Emos & Tony Bruce | Tuesday, December 17, 2024 | 7 min read

On Thanksgiving Day in 2022, many families across North America were sitting down to dinner, laughing over turkey and pie. Meanwhile, something unexpected was happening to their Teslas out in the driveway. Without much warning, Tesla’s CEO made a bold announcement: Teslas equipped with the “Full Self-Driving” option could now drive themselves.

In a celebratory post, Elon Musk shared, “Tesla Full Self-Driving Beta is now available to anyone in North America who requests it from their car screen, assuming you have bought this option. Congratulations to the Tesla autopilot team on achieving a major milestone.”

It felt like something straight out of a sci-fi movie—the future, finally here. Cars that no longer needed humans behind the wheel. With a simple over-the-air software update, Tesla owners found their vehicles transformed as they enjoyed their holiday meals. Suddenly, their cars could navigate roads autonomously. For Tesla enthusiasts, it was a thrilling moment, a glimpse of a long-dreamed technological breakthrough.

But beneath the excitement lay a deeper, unsettling reality: Elon Musk now held unprecedented control over the lives of these drivers. With the ability to remotely enable or disable the self-driving software, Musk essentially became the gatekeeper of their safety. Can one man be trusted with such a monumental responsibility?

Recent events suggest caution. Musk’s decision to shut off his Starlink satellite network during a critical moment in Ukraine’s war, reportedly after striking a private deal with Vladimir Putin, left lives hanging in the balance. If such a controversial and morally fraught decision could be made in the context of a warzone, what’s to stop similar decisions from affecting Tesla drivers—decisions about their lives and safety, made by someone they’ve never met?

For those who see Elon Musk as a visionary, these concerns might feel overblown. But when the power to decide whether people live or die rests with one individual—especially someone known for erratic decisions and an obsession with boosting Tesla’s stock using every trick in the book—it forces us to confront a critical question: Can we really trust him? Would you entrust your life to someone with such a track record?


Just hours after the rollout, that dream turned into a nightmare. On the Bay Bridge in San Francisco, chaos erupted. A multi-car crash in the Yerba Buena Island Tunnel left 16 people injured, including eight children. Surveillance footage showed what happened: a Tesla, reportedly in self-driving mode, abruptly slammed on its brakes for no apparent reason. The sudden stop triggered a chain-reaction pileup. One driver described the moment with disbelief: “I thought I was a goner at first. I mean, you see something like that coming towards you at full speed. I thought, ‘Well, this is it.’”

Among the injured was a two-year-old boy. Families who had set out for Thanksgiving celebrations were instead caught in a terrifying, preventable disaster.

At the time, Musk was celebrating the rollout of Tesla’s Full Self-Driving Beta as a technological milestone. Yet, troubling reports were already surfacing. Many Tesla owners had begun sharing frightening experiences of their cars slamming on the brakes unexpectedly—a glitch some called “phantom braking.”

The incident on the Bay Bridge was just the start of a bigger question: Are self-driving cars really safer than humans behind the wheel? With over 280 million vehicles on U.S. roads, the stakes are high. Car accidents remain a devastatingly common problem, but automating the act of driving comes with its own complications.

Tesla driver blames self-driving mode for 8-vehicle crash on Bay Bridge

To understand these risks, the National Highway Traffic Safety Administration (NHTSA) stepped in, issuing a standing general order in 2021 that requires automakers to report crashes involving automated driving or driver-assistance systems. This transparency allowed investigators to identify troubling trends, particularly with Tesla.

One tragic case involved a Tesla that crashed into a parked fire truck, killing the driver and injuring four firefighters who were on the scene. In another harrowing incident, a high school sophomore in North Carolina was hit by a Tesla operating in autopilot mode while stepping off a school bus.

Earlier this year, NHTSA released findings linking Tesla’s driver-assistance systems to hundreds of crashes and dozens of deaths, and the headlines were grim: “Tesla’s Autopilot and Full Self-Driving linked to hundreds of crashes and dozens of deaths.”

In October, federal officials escalated their scrutiny, launching a formal investigation into Tesla’s self-driving technology. The probe affects more than two million Tesla vehicles on the road. It felt like progress for those concerned about safety, but the conversation has since taken a political turn.

According to Reuters, Elon Musk—who has grown increasingly close to Donald Trump, the president-elect—now stands to benefit from a major policy shift. Trump’s transition team is reportedly planning to eliminate the very rule that requires automakers to disclose crashes involving self-driving systems. The reason given? “Excessive data collection.”

If the rule disappears, Tesla’s troubling crash data might disappear with it. Reuters reports that Tesla accounts for 40 of the 45 fatal crashes reported to NHTSA under the program. Without these disclosures, the public might never know the true risks.

For Tesla, eliminating the reporting requirement could shield the company from damaging headlines. But for victims and their families—like the 10th grader in North Carolina, the firefighters in Contra Costa County, or the families caught in the Bay Bridge pileup—it leaves critical questions unanswered and problems unsolved.

Reuters also revealed that Musk poured more than $250 million into Trump’s campaign, a financial contribution that raises concerns about whose interests are being prioritized. When billionaires with powerful platforms can influence government policy, where does that leave public safety?

For now, the crash reporting rule still stands. But its future—much like the promise of self-driving technology—remains uncertain. What started as an exciting Thanksgiving surprise two years ago has morphed into a pressing debate about accountability, safety, and the hidden costs of innovation.

The stakes go far beyond the roads. Musk’s decision to shut off Starlink coverage during a critical moment in Ukraine’s war sent shockwaves around the world; that single choice, reportedly shaped by private negotiations, put lives at risk. It’s hard not to wonder: could something similar happen with Tesla’s self-driving cars, which are increasingly tied to Musk’s Starlink Gen 2 satellites?

This issue goes far beyond technology—it’s about trust and accountability. When one person holds immense control over systems that millions of people depend on, the risks are too significant to ignore.

For critics of powerful figures like Vladimir Putin, the concern leads to a chilling piece of advice: think twice before driving a Tesla.

Why? Because Tesla vehicles are increasingly connected to Elon Musk’s Starlink satellites. In a world where technology can be weaponized, could an authoritarian leader exploit those connections to make life-and-death decisions?

Consider the mystery surrounding the alleged remote disabling of a Cybertruck reportedly tied to controversial Chechen leader Ramzan Kadyrov. If there’s any truth to this story, it raises a profound question: Who gave the order? Was it Vladimir Putin, or was it Elon Musk?

This isn’t just speculation—it’s a stark reminder of the dangers that arise when vast technological power is concentrated in the hands of one individual. Lives, autonomy, and safety are all at stake.

Copyright 2024 FN, NewsRoom.

