
US is investigating Tesla’s “Full Self-Driving” system

DETROIT (AP) — The U.S. government’s road safety agency is investigating Tesla’s “Full Self-Driving” system after receiving reports of crashes in low-visibility conditions, including one in which a pedestrian was killed.

The National Highway Traffic Safety Administration said in documents that it opened the investigation Thursday after the company reported four crashes in which Teslas encountered reduced visibility, including sun glare, fog and airborne dust.

In addition to the pedestrian’s death, another crash involved a reported injury, the agency said.

Investigators will examine the ability of “Full Self-Driving” to “detect and respond appropriately to reduced roadway visibility conditions, and if so, the contributing circumstances for these crashes.”

The investigation covers about 2.4 million Teslas from the 2016 through 2024 model years.

A message seeking comment was left Friday with Tesla, which has repeatedly noted that the system cannot drive itself and that human drivers must be ready to intervene at all times.

Last week Tesla hosted an event at a Hollywood studio to unveil a fully autonomous robotaxi without a steering wheel or pedals. CEO Elon Musk, who has promised autonomous vehicles before, said the company plans to let autonomous Model Y and Model 3 vehicles operate without human drivers next year. Robotaxis without steering wheels would be available in California and Texas starting in 2026, he said.

The investigation’s impact on Tesla’s autonomous driving ambitions is unclear. NHTSA would have to approve any robotaxi without pedals or a steering wheel, and that is unlikely to happen while the investigation is ongoing. But if the company tries to deploy autonomous vehicles in its existing models, that would likely fall under state regulations. There are no federal regulations specifically focused on autonomous vehicles, although they must meet broader safety rules.

NHTSA also said it would look into whether any other similar crashes involving “Full Self-Driving” in low-visibility conditions have occurred, and it will ask the company for information on any updates that may have affected the system’s performance in those conditions.

“This review will specifically evaluate the timing, purpose and capabilities of such updates, as well as Tesla’s assessment of their safety impact,” the documents say.

Tesla reported the four crashes to NHTSA under an order from the agency that covers all automakers. According to an agency database, the pedestrian was killed in November 2023 in Rimrock, Arizona, after being struck by a 2021 Tesla Model Y. Rimrock is about 100 miles (161 kilometers) north of Phoenix.

The Arizona Department of Public Safety said in a statement that the crash happened on Interstate 17 just after 5 p.m. on Nov. 27. Two vehicles collided on the freeway, blocking the left lane. A Toyota 4Runner stopped, and two people got out to help with traffic control. A red Tesla Model Y then hit the 4Runner and one of the people who had gotten out of it. A 71-year-old woman from Mesa, Arizona, was pronounced dead at the scene.

No charges were filed against the Tesla driver because the sun was in the driver’s eyes at the time of the collision, said Raul Garcia, public information officer for the department. Sun glare also was a contributing factor in the first collision, he added.

Tesla has twice recalled “Full Self-Driving” under pressure from NHTSA, which in July sought information from law enforcement and the company after a Tesla using the system hit and killed a motorcyclist near Seattle.

The recalls were issued because the system was programmed to run stop signs at slow speeds and because it violated other traffic laws. Both problems were to be fixed with online software updates.

Critics say Tesla’s system, which uses only cameras to spot hazards, lacks the sensors needed for fully autonomous driving. Nearly all other companies working on autonomous vehicles use radar and laser sensors in addition to cameras to see better in darkness and poor visibility.

Musk has said that humans drive with eyesight alone, so cars should be able to drive with cameras alone. He has dismissed lidar (light detection and ranging), which uses lasers to detect objects, as unnecessary.

The “Full Self-Driving” recalls came after a three-year investigation into Tesla’s less sophisticated Autopilot system, which had crashed into emergency vehicles and other vehicles parked on highways, many with flashing warning lights.

That investigation was closed last April after the agency pressured Tesla to recall its vehicles to bolster a weak system that was supposed to ensure drivers were paying attention. A few weeks after the recall, NHTSA began investigating whether the recall fix actually worked.

NHTSA opened its Autopilot investigation in 2021 after receiving 11 reports that Teslas using Autopilot had struck parked emergency vehicles. In documents explaining why the investigation was closed, the agency said it ultimately found 467 crashes involving Autopilot, resulting in 54 injuries and 14 deaths. Autopilot is a sophisticated version of cruise control, while “Full Self-Driving,” according to Musk, is capable of driving without human intervention.

The investigation opened Thursday breaks new ground for NHTSA, which had previously treated Tesla’s systems as driver-assistance features rather than self-driving systems. With the new probe, the agency is focusing on the capabilities of “Full Self-Driving” itself rather than simply on making sure drivers are paying attention.

Michael Brooks, executive director of the nonprofit Center for Auto Safety, said the earlier Autopilot investigation did not examine why the Teslas failed to see and stop for emergency vehicles.

“Previously they were putting the responsibility on the driver rather than the car,” he said. “Here they’re saying these systems are not capable of appropriately detecting safety hazards, whether the drivers are paying attention or not.”
