Tesla Autopilot Shuts Off Seconds Before Impact – What Does It Mean for Safety?
A new test shows Tesla's Autopilot disengaging just before a crash, raising concerns about its reliability. NHTSA has investigated similar incidents.
Tesla is once again in the spotlight, not for groundbreaking innovation but for a controversial aspect of its Autopilot system. A video released by engineer and YouTuber Mark Rober shows a Tesla Model Y's Autopilot disengaging a fraction of a second before the car crashes into an obstacle, without any attempt to brake. The footage has sparked heated debate and, in the process, shed light on a long-standing issue: a system that may recognize danger but fails to react accordingly.
The controversy began with a comparative test of advanced driver-assistance systems (ADAS). A vehicle equipped with lidar reliably detected obstacles in fog and heavy rain, while Tesla's camera-only Autopilot failed to stop under the same conditions. The real uproar, however, came from the final test: the Tesla was driven toward a fake wall painted to look like the road ahead, and just as it approached impact, Autopilot suddenly disengaged.
Tesla enthusiasts quickly jumped to the company's defense, claiming that Autopilot was never engaged during the test. That argument collapsed almost instantly: the footage clearly shows Autopilot active until it switches off moments before the crash. In an ironic twist, the very people trying to shield Tesla from criticism ended up highlighting a problem NHTSA had already identified: Autopilot's tendency to disengage less than a second before an unavoidable impact.
This raises serious concerns about Tesla's transparency in reporting Autopilot performance. A previous NHTSA investigation found that in 16 crashes involving Autopilot, the system aborted vehicle control, on average, less than one second before impact. That prompts the question: is Autopilot truly designed to maximize safety, or does this behavior help Tesla sidestep regulations and limit legal liability?
Further concern comes from NHTSA’s recent findings suggesting that Tesla might be misleading customers by exaggerating the capabilities of its autonomous driving features. The agency has already recommended software updates to improve Autopilot’s safety, but whether Tesla will make meaningful changes remains uncertain.
Meanwhile, Tesla owners continue to share mixed experiences. Some praise the system's convenience, while others report phantom braking and sudden, unexplained slowdowns. Yet one question unites them all: how safe is a system that turns off at the worst possible moment?
If evidence emerges that Tesla is intentionally programming Autopilot to disengage before crashes, it could lead to further investigations, lawsuits, and regulatory crackdowns. The coming months may determine whether Tesla’s self-driving technology evolves into a truly reliable system—or remains a controversial feature with a troubled reputation.
Source: YouTube
Mar 17, 2025, 16:14