- Tesla’s Autopilot is sometimes being enabled on unsuitable roads, The Washington Post reported.
- Out of 40 fatal or serious crashes since 2016, eight occurred on unsuitable roads, the report said.
- Tesla is facing growing legal and regulatory pressure over its assisted-driving technology.
Elon Musk’s Autopilot headache isn’t going away.
Tesla’s infamous assisted-driving software is sometimes being enabled on roads for which it isn’t designed, The Washington Post reported.
Citing NHTSA data and information from lawsuits, The Post reported that since 2016, there have been 40 fatal or serious accidents involving Autopilot software.
At least eight happened on roads where Autopilot was not designed to be used, the report said.
In one such crash in 2019, a Tesla driving on Autopilot ignored multiple warning signs and drove through a T intersection, killing one pedestrian and badly injuring another, The Post reported.
The report added that police bodycam footage showed the Tesla driver telling officers he had been “driving on cruise.”
The crash occurred on a road where Autopilot should not have been enabled in the first place, the publication reported, citing dashcam footage that it had examined.
Other similar incidents included a 2016 crash in Florida, in which a Tesla drove under a semi-truck and killed the driver, and a crash in March of this year, in which a Tesla struck a teenager who had just stepped off a school bus.
Tesla says on its website that Autopilot is capable of cruise control, steering within clearly marked lanes, and automatic lane changes. It adds that the feature is designed to be used on “controlled-access” highways with central dividers and clear lane markings.
The company has previously told the National Transportation Safety Board that “the driver determines the acceptable operating environment.”
The new details add to growing pressure on Tesla over Autopilot, with Elon Musk’s automaker facing numerous legal threats and regulatory investigations over the technology.
Last month, a judge overseeing one of those cases ruled that there was “reasonable evidence” that Musk and other Tesla executives knew that Autopilot was defective but allowed it to be sold anyway.
Tesla did, however, win a case earlier this year over allegations that Autopilot caused the death of a Tesla driver in 2019.
Tesla did not immediately respond to a request for comment from Business Insider, made outside normal working hours.