Tesla Under the Microscope Again Over Autopilot Safety Reporting

The U.S. National Highway Traffic Safety Administration (NHTSA) has launched a fresh compliance review into Tesla, focusing on the company’s late reporting of crashes involving vehicles using Autopilot and Full Self-Driving (FSD) features.

The investigation, logged under action number AQ25002, centers on whether Tesla violated federal rules that require automakers to report certain crash data within strict timeframes: one day for crashes involving a fatality or injury, and five days for other qualifying incidents.

According to NHTSA, Tesla submitted dozens of reports months after the actual crashes occurred, with some delays stretching well beyond the required window. In several cases, the agency noted, reports arrived in large batches or trickled in over time, raising concerns about consistency and transparency.

Late Reports, Big Questions

Under the Standing General Order 2021-01, vehicle manufacturers must quickly share crash details when advanced driver-assistance systems (ADAS) like Autopilot or FSD are in use at the time of an incident. The goal is to help regulators spot safety trends early and act fast if needed.
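To make the reporting rule concrete, here is a minimal Python sketch of the deadline logic described above, assuming the one-day and five-day windows cited in this article. The field names and severity categories are illustrative only and do not reflect NHTSA's actual submission schema or process.

```python
from datetime import date, timedelta

# Reporting windows as described in the article (Standing General Order 2021-01):
# 1 calendar day for crashes involving a fatality or injury, 5 days for others.
# Category names here are illustrative, not NHTSA's actual schema.
REPORTING_WINDOWS = {
    "fatality_or_injury": timedelta(days=1),
    "other": timedelta(days=5),
}

def report_deadline(crash_date: date, severity: str) -> date:
    """Return the latest date a report could be filed and still be on time."""
    return crash_date + REPORTING_WINDOWS[severity]

def is_late(crash_date: date, reported_date: date, severity: str) -> bool:
    """True if the report was filed after the applicable window closed."""
    return reported_date > report_deadline(crash_date, severity)

# Example: a crash from early 2024 that was not reported until mid-2025
# would be flagged as late under either window.
print(is_late(date(2024, 2, 10), date(2025, 6, 1), "other"))  # True
```

Under this kind of check, the months-long gaps NHTSA describes would fall far outside either window, which is why the batched, delayed submissions drew the agency's attention.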

But NHTSA’s findings suggest Tesla failed to meet those deadlines repeatedly. Some crashes from early 2024 weren’t reported until mid-2025 — long after the agency expected them.

Tesla responded by saying the delays were due to an internal data collection issue, which it claims has since been resolved. The company insists it’s now reporting incidents in a timely manner.

In its announcement of the review, NHTSA stated:

“NHTSA is opening this Audit Query — a standard process for reviewing compliance — to evaluate the cause of the potential delays, the scope of any missed reports, and the corrective actions Tesla has put in place.”

The audit will also check whether any incident reports are still missing and whether the data submitted is accurate and complete.

📦 Scope of the Investigation

This review covers 37 Tesla models and model-year variants, including the Model 3, Model S, Model X, and Model Y, spanning model years 2016 to 2026. That wide range reflects both the long-standing use of Autopilot and the ongoing expansion of FSD across the fleet.

While no safety defects have been confirmed yet, the probe adds to growing regulatory scrutiny of how Tesla manages real-world performance data from its driver-assist systems.

📊 Tesla Tops the List — By a Huge Margin

Data shared by NHTSA and cited by Electrek shows that Tesla accounts for the vast majority of reported crashes involving Level 2 ADAS.

As of the latest report:

  • Tesla: Over 2,300 crash reports
  • General Motors (Super Cruise): 55
  • Subaru (EyeSight): 53

This doesn’t necessarily mean Tesla’s systems are less safe — the company has far more vehicles on the road using ADAS regularly. But the sheer volume of incidents raises valid questions about usage patterns, driver behavior, and oversight.

🧩 A History of Tension with Regulators

This isn’t the first time Tesla has found itself at odds with NHTSA.

  • In November 2024, the agency opened a probe into whether names like “Autopilot” and “Full Self-Driving” mislead drivers about what the systems can actually do — a concern that’s been raised for years.
  • More recently, Tesla sparked controversy by requesting confidentiality for its responses to NHTSA questions about its Robotaxi pilot program in Austin, which launched with safety drivers present.

Critics argue that such requests limit public transparency, especially as these systems operate on public roads.

🔍 What Comes Next?

An Audit Query like this isn’t a formal finding of wrongdoing — it’s a compliance check. But if NHTSA discovers systemic failures or intentional delays, it could lead to fines, mandatory process changes, or even recalls.

For now, the focus is on understanding:

  • What caused the reporting delays?
  • How many incidents were affected?
  • Has Tesla truly fixed the problem?

As driver-assist technology becomes more common, regulators are making it clear: fast, accurate reporting isn’t optional — it’s essential for public safety.

💡 Final Thoughts: Transparency in the Age of AI Driving

Tesla continues to lead in real-world deployment of ADAS tech, but leadership comes with responsibility. With millions of miles driven on Autopilot and FSD every day, timely and honest reporting isn’t just a legal obligation — it’s a cornerstone of trust.

This latest audit may not result in immediate penalties, but it sends a strong message: innovation must go hand-in-hand with accountability.
