Tesla faced numerous questions about its Autopilot technology after a Florida driver was killed in 2016 when the system of sensors and cameras failed to see and brake for a truck that was crossing a road.

Now the company is facing more scrutiny than it has in the last five years over Autopilot, which Tesla and its chief executive, Elon Musk, have long maintained makes its cars safer than other vehicles. Federal officials are investigating a series of accidents involving Teslas in which Autopilot was in use or might have been in use.

The National Highway Traffic Safety Administration confirmed last week that it is investigating 23 such accidents. In one crash this month, a Tesla Model Y crashed into a police car that had stopped on a freeway near Lansing, Michigan. The driver, who was not seriously injured, had been using Autopilot, the police said.

In February in Detroit, under circumstances similar to the 2016 Florida accident, a Tesla drove under a truck that was crossing the road, tearing the roof off the car. The driver and a passenger were seriously injured. Officials have not said whether the driver had Autopilot turned on.

NHTSA is also investigating a February 27 crash near Houston in which a Tesla hit a police vehicle stopped on a freeway. It is not clear whether the driver was using Autopilot. The car did not appear to slow before the impact, the police said.

Autopilot is a computerized system that uses radar and cameras to detect lane markings, other vehicles and objects on the road. It can steer, brake and accelerate automatically with little input from the driver. Tesla has said it should be used only on divided highways, but videos on social media show drivers using Autopilot on various kinds of roads.

“We need to see the results of the investigations first, but these incidents are the latest examples showing that these advanced cruise-control features Tesla has are not very good at detecting and then stopping for a vehicle that is stopped in a highway circumstance,” said Jason Levine, executive director of the Center for Auto Safety, a group founded in the 1970s by Consumers Union and Ralph Nader.

This renewed scrutiny arrives at a critical time for Tesla. After reaching a record high this year, its share price has fallen about 20 percent amid signs that the company’s electric cars are losing market share to traditional automakers. Ford Motor’s Mustang Mach-E and the Volkswagen ID.4 recently arrived in showrooms and are considered serious challengers to the Model Y.

The outcome of the current investigations is important not only for Tesla but also for other technology and auto companies working on autonomous cars. While Mr. Musk has frequently suggested that the widespread use of such vehicles is near, Ford, General Motors and Waymo, a division of Google’s parent company, Alphabet, have said that moment could be years or even decades away.

Bryant Walker Smith, a professor at the University of South Carolina who has advised the federal government on automated driving, said it was important to develop advanced technologies to reduce traffic deaths, which now number about 40,000 a year. But he said he had concerns about Autopilot, and about how its name and Tesla’s marketing imply that drivers can safely turn their attention away from the road.

“There’s an incredible disconnect between what the company and its founder are saying and leading people to believe, and what their system is actually capable of,” he said.

Tesla, which has disbanded its public relations department and generally does not respond to inquiries from reporters, did not return phone calls or emails seeking comment. And Mr. Musk did not respond to questions sent to him on Twitter.

The company has not publicly addressed the recent crashes. While it can determine whether Autopilot was on at the time of an accident because its cars constantly send data to the company, it has not said whether the system was in use.

The company has argued that its cars are very safe, claiming that its own data shows Teslas have fewer accidents per mile driven, and fewer still when Autopilot is in use. It also says drivers using Autopilot must pay close attention to the road and always be ready to retake control of their cars.

A federal investigation into the fatal 2016 Florida accident found that Autopilot had failed to detect a white semi trailer against a bright sky, and that the driver was able to use the system even though he was not on a freeway. Autopilot kept the car moving at 74 miles per hour even as the driver, Joshua Brown, ignored several warnings to keep his hands on the steering wheel.

A second fatal accident occurred in Florida in 2019 under similar circumstances: a Tesla crashed into a tractor-trailer while Autopilot was engaged. Investigators found that the driver did not have his hands on the steering wheel before the impact.

While NHTSA has not forced Tesla to recall Autopilot, the National Transportation Safety Board concluded that the system “played an important role” in the 2016 Florida accident. It also said the technology lacked safeguards to prevent drivers from taking their hands off the steering wheel or looking away from the road. The safety board reached similar conclusions when it investigated a 2018 accident in California.

By comparison, a similar General Motors system, Super Cruise, monitors a driver’s eyes and switches off if the person looks away from the road for more than a few seconds. That system can be used only on major highways.

In a February 1 letter, Robert Sumwalt, chairman of the National Transportation Safety Board, criticized NHTSA for not doing more to evaluate Autopilot and urged Tesla to add safeguards that would prevent drivers from misusing the system.

The new administration in Washington could take a firmer line on safety. The Trump administration declined to impose many regulations on autonomous vehicles and sought to loosen other rules the auto industry disliked, including fuel-economy standards. By contrast, President Biden has appointed an acting NHTSA administrator, Steven Cliff, who came from the California Air Resources Board, which frequently clashed with the Trump administration over regulations.

Concerns about Autopilot could dissuade some car buyers from paying Tesla $10,000 for its more advanced option, Full Self-Driving. Many customers have paid for it in the expectation of being able to use it in the future. Tesla made the option available to about 2,000 cars in a “beta,” or test, version starting late last year, and Mr. Musk recently said the company would soon make it available to more cars. Full Self-Driving is intended to enable Teslas to drive themselves in cities and on local roads, where driving conditions are made more complex by oncoming traffic, intersections, traffic lights, pedestrians and cyclists.

Despite their names, Autopilot and Full Self-Driving have major limitations. Their software and sensors cannot control cars in many situations, which is why drivers have to keep their eyes on the road and their hands on or near the steering wheel.

In a November letter to the California Department of Motor Vehicles, a Tesla lawyer acknowledged that Full Self-Driving struggled to respond to a wide range of driving situations and should not be considered a fully autonomous driving system.

The system is unable to recognize or respond to certain “circumstances and events,” wrote Eric C. Williams, Tesla’s associate general counsel. “These include static objects and road debris, emergency vehicles, construction zones, large uncontrolled intersections with multiple incoming ways, occlusions, adverse weather, complicated or adversarial vehicles in the driving paths, and unmapped roads.”

Mr. Levine of the Center for Auto Safety has complained to federal regulators that the names Autopilot and Full Self-Driving are misleading at best and could be encouraging some drivers to be reckless.

“Autopilot suggests the car can drive itself and, more importantly, stop itself,” he said. “And they’ve doubled down with Full Self-Driving, which again leads consumers to believe the vehicle is capable of doing things it is not capable of doing.”