


Title: NHTSA concludes Tesla Autopilot investigation after linking the system to 14 deaths
Post by: HCK on April 30, 2024, 04:05:09 pm

The National Highway Traffic Safety Administration (NHTSA) has concluded an investigation (https://static.nhtsa.gov/odi/inv/2022/INCR-EA22002-14496.pdf) into Tesla’s Autopilot driver assistance system after reviewing hundreds of crashes, including 13 fatal incidents that led to 14 deaths. The organization has ruled that these accidents were due to driver misuse of the system.
However, the NHTSA also found that “Tesla’s weak driver engagement system was not appropriate for Autopilot’s permissive operating capabilities.” In other words, the software didn’t prioritize driver attentiveness. Drivers using Autopilot or the company’s Full Self-Driving technology “were not sufficiently engaged,” because Tesla “did not adequately ensure that drivers maintained their attention on the driving task.”
The organization investigated nearly 1,000 crashes from January 2018 until August 2023, accounting for 29 total deaths. The NHTSA found that there was “insufficient data to make an assessment” for around half (489) of these crashes. In some incidents, the other party was at fault or the Tesla drivers weren’t using the Autopilot system.
The most serious were 211 crashes in which “the frontal plane of the Tesla struck a vehicle or obstacle in its path,” and these were often linked to Autopilot or FSD. These incidents led to 14 deaths and 49 serious injuries. The agency found that drivers had enough time to react, but didn’t, in 78 of these incidents. These drivers failed to brake or steer to avoid the hazard, despite having at least five seconds to make a move.
That’s where complaints against the software come into play. The NHTSA says that drivers would simply become too complacent, assuming that the system would handle any hazards. When it came time to react, it was too late. “Crashes with no or late evasive action attempted by the driver were found across all Tesla hardware versions and crash circumstances,” the organization wrote. The imbalance between driver expectation and the operating capabilities of Autopilot resulted in a “critical safety gap” that led to “foreseeable misuse and avoidable crashes.”
The NHTSA also took issue with the branding of Autopilot, calling it misleading and suggesting that it leads drivers to assume the software has total control. By contrast, rival companies tend to use branding with words like “driver assist.” Autopilot indicates, well, an autonomous pilot. California’s attorney general and the state’s Department of Motor Vehicles are also investigating Tesla (https://www.engadget.com/california-dmv-accuses-tesla-false-advertising-130350292.html) for misleading branding and marketing.
Tesla, on the other hand, says that it warns customers that they need to pay attention while using Autopilot and FSD, according to The Verge (https://www.theverge.com/2024/4/26/24141361/tesla-autopilot-fsd-nhtsa-investigation-report-crash-death). The company says the software features regular indicators that remind drivers to keep their hands on the wheel and eyes on the road. The NHTSA and other safety groups have said that these warnings do not go far enough and were “insufficient to prevent misuse.” Despite these statements, CEO Elon Musk recently promised that the company will continue to go (https://sg.finance.yahoo.com/news/tesla-earnings-week-spotlights-price-180809568.html) “balls to the wall for autonomy.”
The findings may represent only a small fraction of the actual number of crashes and incidents related to Autopilot and FSD. The NHTSA noted that “gaps in Tesla’s telematic data create uncertainty regarding the actual rate at which vehicles operating with Autopilot engaged are involved in crashes.” That’s because Tesla only receives data from certain types of crashes, with the NHTSA claiming the company collects data on around 18 percent of crashes reported to police.
With all of this in mind, the organization has opened up another probe (https://static.nhtsa.gov/odi/inv/2024/INOA-RQ24009-12046.pdf) into Tesla. This one looks into a recent OTA software fix issued in December after two million vehicles were recalled (https://www.engadget.com/tesla-recalls-2-million-cars-in-order-to-fix-autopilot-safety-controls-123308343.html). The NHTSA will evaluate whether the Autopilot recall fix that Tesla implemented is effective enough.

Source: NHTSA concludes Tesla Autopilot investigation after linking the system to 14 deaths (https://www.engadget.com/nhtsa-concludes-tesla-autopilot-investigation-after-linking-the-system-to-14-deaths-161941746.html?src=rss)