April 27, 2024

Tesla Autopilot investigation closed after feds find 13 fatal crashes related to misuse


The National Highway Traffic Safety Administration closed a long-standing investigation into Tesla’s Autopilot driver assistance system after reviewing hundreds of crashes involving its misuse, including 13 that were fatal and “many more involving serious injuries.”

At the same time, NHTSA is opening a new investigation to evaluate whether the Autopilot recall fix that Tesla implemented in December is effective enough.

NHTSA’s Office of Defects Investigation said in documents released Friday that it completed “an extensive body of work” which turned up evidence that “Tesla’s weak driver engagement system was not appropriate for Autopilot’s permissive operating capabilities.”

“This mismatch resulted in a critical safety gap between drivers’ expectations of [Autopilot’s] operating capabilities and the system’s true capabilities,” the agency wrote. “This gap led to foreseeable misuse and avoidable crashes.”

The closing of the initial probe, which began in 2021, marks an end of one of the most visible efforts by the government to scrutinize Tesla’s Autopilot software. Tesla is still feeling the pressure of multiple other inquiries, though.

The Department of Justice is also investigating the company’s claims about the technology, and the California Department of Motor Vehicles has accused Tesla of falsely advertising the capabilities of Autopilot and the more-advanced Full Self-Driving beta software. The company is also facing multiple lawsuits regarding Autopilot. Tesla, meanwhile, is now going “balls to the wall for autonomy,” according to CEO Elon Musk.

NHTSA said its investigation reviewed 956 reported crashes through August 30, 2023. In roughly half (489) of those, the agency said there “was insufficient data to make an assessment,” the other vehicle was at fault, Autopilot was found not to be in use, or the crash was otherwise unrelated to the probe.

NHTSA said the remaining 467 crashes fell into three buckets. In 211 of them, “the frontal plane of the Tesla struck another vehicle or obstacle with adequate time for an attentive driver to respond to avoid or mitigate the crash.” It said 145 crashes involved “roadway departures in low traction conditions such as wet roadways.” And it said 111 of the crashes involved “roadway departures where Autosteer was inadvertently disengaged by the driver’s inputs.”
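For readers keeping track of the figures, the three buckets reconcile with the totals NHTSA reported; a minimal tally, using only the numbers quoted in this article, looks like this:

```python
# Reconcile the crash counts NHTSA reported (figures as quoted above).
total_reported = 956   # crashes reviewed through August 30, 2023
excluded = 489         # insufficient data, other vehicle at fault, Autopilot off, or unrelated
examined = total_reported - excluded  # 467 crashes sorted into the three buckets

buckets = {
    "frontal-plane strikes with time for an attentive driver to react": 211,
    "roadway departures in low-traction conditions": 145,
    "roadway departures after Autosteer was inadvertently disengaged": 111,
}

# The bucket counts sum to the 467 crashes NHTSA examined in depth.
assert sum(buckets.values()) == examined == 467
print(f"{examined} of {total_reported} reviewed crashes fell into the three buckets")
```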

These crashes “are often severe because neither the system nor the driver reacts appropriately, resulting in high-speed differential and high energy crash outcomes,” the agency wrote.

Tesla tells drivers they need to pay attention to the road and keep their hands on the wheel while using Autopilot, which it measures via a torque sensor and, in its newer cars, the in-cabin camera. But NHTSA, and other safety groups, have said that these warnings and checks do not go far enough. In December, NHTSA said these measures were “insufficient to prevent misuse.”

Tesla agreed to issue a recall via a software update that would theoretically increase driver monitoring. But that update did not really appear to change Autopilot much — a sentiment NHTSA seems to agree with.

Parts of that recall fix require the “owner to opt in,” and Tesla allows a driver to “readily reverse” some of the safeguards, according to NHTSA.

NHTSA spent nearly three years working on the investigation into Autopilot, and met or interacted with Tesla numerous times throughout the process. It performed many direct examinations of the crashes, and relied on the company to provide data about them as well.

But the agency criticized Tesla’s data in one of the supporting documents.

“Gaps in Tesla’s telematic data create uncertainty regarding the actual rate at which vehicles operating with Autopilot engaged are involved in crashes. Tesla is not aware of every crash involving Autopilot even for severe crashes because of gaps in telematic reporting,” NHTSA wrote. According to the agency, Tesla “largely receives data from crashes only with pyrotechnic deployment,” meaning when air bags, seat belt pre-tensioners or the pedestrian impact mitigation feature of the car’s hood are triggered.

NHTSA claims that this threshold means Tesla is only collecting data on around 18% of crashes that are reported to police. As a result, NHTSA wrote, the probe uncovered crashes in which Autopilot was engaged that Tesla was not notified of via telematics.


