Tesla Autopilot: US opens official investigation into self-driving tech
The US federal agency in charge of road safety is opening an official investigation into Tesla's "self-driving" Autopilot system.
The National Highway Traffic Safety Administration (NHTSA) said it was acting following 11 Tesla crashes since 2018 involving emergency vehicles.
In some cases, the Tesla vehicles "crashed directly into the vehicles of first responders", it said.
The investigation will cover roughly 765,000 Tesla cars made since 2014.
That includes those in the Model Y, Model X, Model S and Model 3, the NHTSA said - the entire current range.
'Control at all times'
The agency was primarily concerned with an apparent inability of Tesla vehicles to cope with vehicles stopped in the road - specifically emergency vehicles attending an incident.
Among the list of cases was one where a Tesla "ploughed into the rear" of a parked fire engine attending an accident, and another in which a parked police car was struck.
The NHTSA said it was opening its preliminary investigation into "the technologies and methods used to monitor, assist, and enforce the driver's engagement", while using Autopilot.
It said that in the 11 crashes that prompted its investigation, either Autopilot or a system called Traffic Aware Cruise Control had been active "just prior" to the collisions.
The assistive technology allows the car to automatically steer, accelerate and brake.
But the name has come under fire as misleading: the system does not drive the car autonomously, and drivers are required to maintain control and attention at all times.
Tesla has marketed the feature as "Autopilot" and promised "full self-driving", which is now available to some users in a beta version.
Users have frequently abused the system in the past, with examples ranging from using their phones while the car drives unattended to climbing out of the driver's seat and leaving no-one at the wheel.
In a statement, an NHTSA spokesperson said: "No commercially available motor vehicles today are capable of driving themselves. Every available vehicle requires a human driver to be in control at all times."
The investigation's supporting documents do, however, note the challenging circumstances involved in many of the collisions.
"Most incidents took place after dark and the crash scenes encountered included scene control measures such as first responder vehicle lights, flares, an illuminated arrow board, and road cones," it reads.
It comes days ahead of an event to showcase the car company's software.
Chief executive Elon Musk had previously announced 19 August as "Tesla AI Day", which he said would showcase the progress of the firm's artificial intelligence systems - with a view to recruiting AI experts to the firm.
Tesla disbanded its public relations team in October 2020 and could not be reached for comment.