U.S. Opens Probe into Nearly 2.9 Million Tesla Vehicles

Post by: Monika

The United States government has opened a new safety investigation into nearly 2.9 million Tesla vehicles equipped with the company’s Full Self-Driving (FSD) software. The National Highway Traffic Safety Administration (NHTSA) opened the probe after receiving reports that Tesla cars using FSD mode may be breaking traffic laws, such as running red lights, making unsafe lane changes, or entering intersections illegally.

The agency said that it has already identified several incidents where cars operating under the FSD system might have violated standard road rules and caused accidents. This investigation is another major test for Tesla, which has faced years of scrutiny over the safety of its driver-assist technology.

What Triggered the Investigation

According to NHTSA, the decision to open a formal probe came after multiple reports from drivers and several crash investigations suggested that Tesla’s Full Self-Driving mode may not always obey traffic laws.

The agency said it has reviewed six crashes involving Tesla vehicles that were being driven in FSD mode when they entered intersections on red lights. Out of these six accidents, four resulted in injuries. None of the crashes were fatal, but the agency described the pattern as “a serious concern regarding safety and rule compliance.”

In addition to the crash reports, NHTSA has received 18 consumer complaints and at least one detailed media report claiming that Tesla’s FSD system sometimes makes risky or incorrect decisions when approaching intersections, such as failing to come to a full stop, making sudden turns without signaling, or accelerating unexpectedly.

What the Full Self-Driving System Does

Tesla’s Full Self-Driving (FSD) software is marketed as an advanced driver-assistance system designed to help drivers manage everyday tasks such as changing lanes, navigating city streets, and stopping at traffic lights.

However, despite the name “Full Self-Driving,” the system does not make Tesla cars fully autonomous. Drivers are required to stay alert and keep their hands on the steering wheel at all times. Tesla itself has said that the system is only meant to “assist” the driver and that users must be ready to take control immediately if something goes wrong.

Critics have argued for years that the system’s name can be misleading because it gives drivers the impression that the car can drive itself completely — which it cannot.

Tesla’s Track Record with Safety Investigations

This new investigation adds to a growing list of safety probes that Tesla has faced in recent years.

In December 2024, NHTSA opened an investigation into over 2.4 million Tesla vehicles after reports that they were involved in crashes when the driver-assistance features were used in poor visibility conditions, such as fog, glare, or dust. That probe looked at how Tesla’s system handled difficult driving environments.

Earlier, in 2023, Tesla was required to recall more than 2 million vehicles to improve how drivers were alerted to keep their hands on the wheel when using Autopilot, another driver-assist function.

Concerns About Traffic Law Violations

The main issue in this current investigation is whether Tesla’s FSD software causes cars to disobey traffic rules — for example, by crossing intersections on red lights or failing to yield to oncoming vehicles.

NHTSA said in its statement that such violations suggest “the system may not perform consistently with traffic safety laws.” Investigators are studying whether the software’s programming, decision-making, or sensor performance might be responsible for these risky moves.

Traffic experts say that even small mistakes by self-driving systems can lead to major accidents because these cars move quickly and may not react in time when faced with sudden obstacles.

Possible Outcomes of the Investigation

The current stage of the investigation is called a “preliminary evaluation.” This is the first step in NHTSA’s process to determine whether a safety defect exists.

If the agency finds sufficient evidence that Tesla’s FSD software poses a risk to public safety, the next step would be an “engineering analysis.” That could eventually lead to a recall of the affected vehicles — in this case, nearly 2.9 million cars.

A recall would require Tesla to fix the issue, either through a software update or by disabling problematic features of the FSD system.

Tesla’s Response

As of now, Tesla has not issued any public statement about the investigation. The company, which dissolved its media relations department several years ago, usually communicates through posts by CEO Elon Musk or official company updates on social media.

Elon Musk has previously defended the FSD system, saying that it improves road safety by reducing human error. He has also argued that accidents involving Teslas often receive more media attention than those caused by human drivers, creating a false impression that the technology is unsafe.

However, safety experts point out that any system that can break traffic laws or make unsafe decisions still poses a serious threat, especially when used by millions of drivers on public roads.

What Experts Say

Many road safety specialists have welcomed the investigation, saying that the public deserves clear information about how Tesla’s FSD performs in real traffic.

Jason Levine, the director of the Center for Auto Safety, said, “Tesla’s use of the term ‘Full Self-Driving’ has long been misleading. We need solid data on whether the system truly improves safety or introduces new risks.”

Others have argued that technology can only help reduce accidents if it is transparent, tested, and properly regulated. Without these safeguards, they warn, the promise of self-driving cars could turn into a public danger.

Impact on Tesla and Its Customers

This investigation could have a significant impact on Tesla’s image, its customers, and the company’s stock value.

For Tesla owners, it may create uncertainty and worry about whether their vehicles are safe. If a recall happens, it could mean temporary restrictions or mandatory updates to their car’s software.

For Tesla as a company, the investigation could also erode public trust in its technology. Tesla is known for innovation, but its driver-assistance systems have often been criticized for being tested on public roads before they are fully refined.

If NHTSA determines that FSD consistently violates traffic laws, it could lead to lawsuits or stricter rules for all carmakers developing similar driver-assist systems.

A Broader Debate About Automation

This case also highlights a larger debate about how fast self-driving technology should develop. While many people believe autonomous driving will eventually make roads safer by reducing human errors, others worry that moving too quickly — without clear safety standards — could endanger lives.

In the U.S., regulators are still deciding how to manage and test systems like Tesla’s FSD, Waymo’s autonomous cars, and other self-driving projects. Each of these technologies promises convenience and innovation, but they also raise serious ethical and safety questions.

What Happens Next

The NHTSA will continue to collect data from Tesla owners, police reports, and accident databases to understand the extent of the problem. Investigators will likely test the FSD system under various road conditions to see if the software makes the same mistakes.

Once the evaluation is complete, the agency will decide whether further action is necessary. This could include new rules for all companies using advanced driver-assist technology.

Until then, Tesla owners are advised to remain cautious when using Full Self-Driving mode. They must keep their hands on the wheel and eyes on the road at all times, even if the car seems to be driving itself.

The investigation into Tesla’s Full Self-Driving software is one of the most important safety reviews in recent years. It will help determine not only the future of Tesla but also the future of self-driving technology around the world.

While Tesla continues to promise that its vehicles are safer and smarter than ever, the recent reports suggest there is still a long way to go before cars can truly drive themselves without human help.

Oct. 9, 2025 3:30 p.m.