Tesla hit parked police car 'while using Autopilot'

Image caption: A number of Tesla vehicles have been involved in crashes (image source: Laguna Beach Police Department).

A Tesla car has crashed into a parked police car in California.

The driver suffered minor injuries and told police she was using the car's driver-assisting Autopilot mode.

The crash has similarities to other incidents, including a fatal crash in Florida where the driver's "over-reliance on vehicle automation" was determined to be a probable cause.

Tesla has said customers are reminded they must "maintain control of the vehicle at all times".

In a statement, it added: "When using Autopilot, drivers are continuously reminded of their responsibility to keep their hands on the wheel."

It has yet to be confirmed whether the Autopilot mode was in fact engaged.


The California crash appears to be the latest example of semi-autonomous vehicles struggling to detect stationary objects. A Tesla operating in Autopilot mode hit a stationary fire engine in Utah in May.

According to a police report obtained by the Associated Press, the Tesla accelerated before it hit the vehicle.


It has also emerged that a Tesla Model 3 driver has blamed Autopilot for a crash in Greece last Friday, in which the car suddenly veered right "without warning".

The motorist, You You Xue, voiced his concerns about Autopilot on Facebook.

"The vigilance required to use the software, such as keeping both hands on the wheel and constantly monitoring the system for malfunctions or abnormal behaviour, arguably requires significantly more attention than just driving the vehicle normally," he wrote.

One influential tech industry-watcher has raised concern about Tesla's software, noting that Google's car division has claimed that an all-or-nothing approach is safer.

"There is a serious argument that the incremental, 'level 2/3' approach to autonomous cars followed by Tesla, where the human isn't driving but might have to grab the wheel at any time, is actively dangerous and a technical dead end," tweeted, external, a partner at the venture capital firm Andreessen Horowitz.

"Waymo decided not to do this at all."

It is not the first time the Autopilot feature has been linked to dangerous behaviour.

In England, a driver was banned from driving after putting his Tesla in Autopilot on the M1 and sitting in the passenger seat.

Media caption: Bhavesh Patel was filmed by a passenger in another car.

'Deceptive' naming

The news comes after two US rights groups urged the Federal Trade Commission to investigate Tesla over its marketing of the assisted driving software.

The Center for Auto Safety and Consumer Watchdog said it was "reasonable" for Tesla owners to believe that their cars could drive themselves on Autopilot.

It called the Autopilot name "deceptive and misleading".

Media questions

The chief executive of Tesla, Elon Musk, has previously complained about media attention on Tesla crashes. He tweeted: "It's super messed up that a Tesla crash resulting in a broken ankle is front page news and the ~40,000 people who died in US auto accidents alone in past year get almost no coverage."

His comments received support from prominent academic and psychologist Steven Pinker, who has in the past voiced concerns about Tesla's Autopilot.
