Post by: Monika
The United States government has opened a new safety investigation into nearly 2.9 million Tesla vehicles equipped with the company’s Full Self-Driving (FSD) software. The investigation is being carried out by the National Highway Traffic Safety Administration (NHTSA) after receiving reports that Tesla cars using FSD mode may be breaking traffic laws such as running red lights, making unsafe lane changes, or entering intersections illegally.
The agency said that it has already identified several incidents where cars operating under the FSD system might have violated standard road rules and caused accidents. This investigation is another major test for Tesla, which has faced years of scrutiny over the safety of its driver-assist technology.
What Triggered the Investigation
According to NHTSA, the decision to open a formal probe came after multiple reports from drivers and several crash investigations suggested that Tesla’s Full Self-Driving mode may not always obey traffic laws.
The agency said it has reviewed six crashes involving Tesla vehicles that were being driven in FSD mode when they entered intersections on red lights. Out of these six accidents, four resulted in injuries. None of the crashes were fatal, but the agency described the pattern as “a serious concern regarding safety and rule compliance.”
In addition to the crash reports, NHTSA has also received 18 consumer complaints and has seen at least one detailed media report claiming that Tesla’s FSD system sometimes makes risky or incorrect decisions when approaching intersections. These include failing to come to a full stop, making sudden turns without signaling, or accelerating unexpectedly.
What the Full Self-Driving System Does
Tesla’s Full Self-Driving (FSD) software is marketed as an advanced driver-assistance system designed to help drivers manage everyday tasks such as changing lanes, navigating city streets, and stopping at traffic lights.
However, despite the name “Full Self-Driving,” the system does not make Tesla cars fully autonomous. Drivers are required to stay alert and keep their hands on the steering wheel at all times. Tesla itself has said that the system is only meant to “assist” the driver and that users must be ready to take control immediately if something goes wrong.
Critics have argued for years that the system’s name can be misleading because it gives drivers the impression that the car can drive itself completely — which it cannot.
Tesla’s Track Record with Safety Investigations
This new investigation adds to a growing list of safety probes that Tesla has faced in recent years.
In December 2024, NHTSA opened an investigation into over 2.4 million Tesla vehicles after reports that they were involved in crashes when the driver-assistance features were used in poor visibility conditions, such as fog, glare, or dust. That probe looked at how Tesla’s system handled difficult driving environments.
Earlier in 2023, Tesla was required to recall more than 2 million vehicles to improve how drivers were alerted to keep their hands on the wheel when using the Autopilot system, another driver-assist function.
Concerns About Traffic Law Violations
The main issue in this current investigation is whether Tesla’s FSD software causes cars to disobey traffic rules — for example, by crossing intersections on red lights or failing to yield to oncoming vehicles.
NHTSA said in its statement that such violations suggest “the system may not perform consistently with traffic safety laws.” Investigators are studying whether the software’s programming, decision-making, or sensor performance might be responsible for these risky moves.
Traffic experts say that even small mistakes by self-driving systems can lead to major accidents, because vehicles travel at speed and the software may not react in time to sudden obstacles.
Possible Outcomes of the Investigation
The current stage of the investigation is called a “preliminary evaluation.” This is the first step in NHTSA’s process to determine whether a safety defect exists.
If the agency finds sufficient evidence that Tesla’s FSD software poses a risk to public safety, the next step would be an “engineering analysis.” That could eventually lead to a recall of the affected vehicles — in this case, nearly 2.9 million cars.
A recall would require Tesla to fix the issue, either through a software update or by disabling problematic features of the FSD system.
Tesla’s Response
So far, Tesla has not issued a public statement about the investigation. The company, which dissolved its media relations department several years ago, typically communicates through posts by CEO Elon Musk or official company updates on social media.
Elon Musk has previously defended the FSD system, saying that it improves road safety by reducing human error. He has also argued that accidents involving Teslas often receive more media attention than those caused by human drivers, creating a false impression that the technology is unsafe.
However, safety experts point out that any system that can break traffic laws or make unsafe decisions still poses a serious threat, especially when used by millions of drivers on public roads.
What Experts Say
Many road safety specialists have welcomed the investigation, saying that the public deserves clear information about how Tesla’s FSD performs in real traffic.
Jason Levine, the director of the Center for Auto Safety, said, “Tesla’s use of the term ‘Full Self-Driving’ has long been misleading. We need solid data on whether the system truly improves safety or introduces new risks.”
Others have argued that technology can only help reduce accidents if it is transparent, tested, and properly regulated. Without these safeguards, they warn, the promise of self-driving cars could turn into a public danger.
Impact on Tesla and Its Customers
This investigation could have a significant impact on Tesla’s image, its customers, and the company’s stock value.
For Tesla owners, it may create uncertainty about whether their vehicles are safe. If a recall happens, it could mean temporary restrictions or mandatory updates to their car's software.
For Tesla as a company, the investigation could also affect public trust in its technology. Tesla is known for innovation, but its self-driving systems have often been criticized for being tested on public roads before they are fully mature.
If NHTSA determines that FSD consistently violates traffic laws, it could lead to lawsuits or stricter rules for all carmakers developing similar driver-assist systems.
A Broader Debate About Automation
This case also highlights a larger debate about how fast self-driving technology should develop. While many people believe autonomous driving will eventually make roads safer by reducing human errors, others worry that moving too quickly — without clear safety standards — could endanger lives.
In the U.S., regulators are still deciding how to manage and test systems like Tesla’s FSD, Waymo’s autonomous cars, and other self-driving projects. Each of these technologies promises convenience and innovation, but they also raise serious ethical and safety questions.
What Happens Next
NHTSA will continue to collect data from Tesla owners, police reports, and crash databases to understand the extent of the problem. Investigators will likely test the FSD system under a range of road conditions to see whether the software repeats the same mistakes.
Once the evaluation is complete, the agency will decide whether further action is necessary. This could include new rules for all companies using advanced driver-assist technology.
Until then, Tesla owners are advised to remain cautious when using Full Self-Driving mode, keeping their hands on the wheel and their eyes on the road at all times, even when the car appears to be driving itself.
The investigation into Tesla’s Full Self-Driving software is one of the most important safety reviews in recent years. It will help determine not only the future of Tesla but also the future of self-driving technology around the world.
While Tesla continues to promise that its vehicles are safer and smarter than ever, the recent reports suggest there is still a long way to go before cars can truly drive themselves without human help.