Tesla Autopilot recall to be probed by US regulator
The US auto regulator is investigating whether Tesla's biggest ever recall successfully addressed safety concerns relating to its driver assistance system.
In December, Tesla issued a software update to two million of its vehicles in the US to fix problems with its Autopilot feature.
The National Highway Traffic Safety Administration (NHTSA) says it will now probe the "adequacy" of that fix.
Tesla has been approached for comment.
The NHTSA has just concluded a nearly three-year-long investigation into crashes involving cars fitted with Autopilot.
The agency said there had been at least 13 Tesla crashes involving one or more deaths, and many more involving serious injuries, in which "foreseeable driver misuse of the system played an apparent role."
A document outlining the NHTSA's new probe says the agency identified concerns with Tesla's recall remedy after initial tests of remedied vehicles and analysis of crashes which took place following its implementation.
It also says that "Tesla has stated that a portion of the remedy both requires the owner to opt in and allows a driver to readily reverse it."
The new investigation comes shortly after Tesla recalled thousands of its new Cybertrucks over an accelerator crash risk.
2024 has also seen sales and profits fall at the car-maker, and an ongoing row about boss Elon Musk's enormous pay package.
This came shortly after the firm announced layoffs affecting 10% of its global workforce, having delivered fewer vehicles than investors expected.
Mr Musk told investors on the company's latest earnings call that new electric vehicle model launches would be brought forward, but also that Tesla should be viewed as more than just a car company.
He added its humanoid robot Optimus "will be more valuable than everything else combined".
What is Autopilot?
Autopilot is designed to assist drivers with steering, acceleration and braking and, despite what its name might suggest, still requires driver input and attention.
Launched in 2015, the software forms part of the firm's wider vision for an autonomous driving future where human input is no longer needed at the wheel.
Elon Musk, Tesla's largest shareholder, has previously suggested it can drive more safely than humans in some situations.
It requires drivers to have their hands on the wheel and to be "fully attentive".
But following its Autopilot investigation into Tesla crashes, the NHTSA said it found "the prominence and scope of the feature's controls may not be sufficient to prevent driver misuse".
In early April, Tesla agreed to settle a lawsuit over a crash in 2018 which killed Apple engineer Walter Huang after his Model X, operating on Autopilot, collided with a highway barrier.
The BBC has previously heard from former Tesla employees who had raised concerns over the safety of its vehicles and software.
Whistleblower Lukasz Krupski told the BBC in December he did not believe the technology powering the firm's vehicles was safe.