According to a newly released federal investigation, in March 2023 a student in North Carolina was stepping off a school bus when he was struck by a Tesla Model Y traveling at “highway speeds.” The driver was using Autopilot, the driver-assist technology that Elon Musk insists will eventually lead to fully autonomous vehicles.
The 17-year-old suffered life-threatening injuries and was airlifted to a hospital. His crash was one of hundreds of similar collisions the investigation examined, and across them it found a pattern of driver inattention, combined with shortcomings in Tesla’s technology, that resulted in hundreds of injuries and dozens of deaths.
The National Highway Traffic Safety Administration (NHTSA) found that drivers using Autopilot or Full Self-Driving, the system’s more advanced tier, “were not sufficiently engaged in the driving task,” and that Tesla’s technology “did not adequately ensure that drivers maintained their attention on the driving task.”
In total, the NHTSA investigated 956 crashes that occurred between January 2018 and August 2023. In those collisions, some of which involved other vehicles striking the Tesla, 29 people died. In 211 of the crashes, “the frontal plane of the Tesla struck a vehicle or obstacle in its path.” These were often the most severe, killing 14 people and injuring 49.
The NHTSA launched its investigation after a string of incidents in which Tesla drivers crashed into stationary emergency vehicles parked on the roadside. Most of these incidents took place after dark, with the software ignoring scene control measures such as flares, warning lights, cones, and illuminated arrow boards.
In its report, the agency found that Autopilot, and in some cases FSD, was not designed to keep the driver engaged in the task of driving. Tesla says it warns customers that they need to pay attention while using Autopilot and FSD, keeping their hands on the wheel and eyes on the road. But the NHTSA says that in many cases, drivers became overly complacent and lost focus. And when the time came to react, it was often too late to avoid a crash.
In 59 of the crashes it examined, the NHTSA determined that Tesla drivers had “five or more seconds” to react before colliding with another object. In 19 of those, the hazard was visible for 10 or more seconds before the collision. After reviewing crash reports and data supplied by Tesla, the NHTSA found that in most of the crashes it examined, drivers failed to brake or steer in time to avoid the hazard. Crashes with no or late evasive action by the driver were found across all Tesla hardware versions and crash circumstances, the agency said.
The NHTSA also compared Tesla’s Level 2 (L2) automation features with those offered by other automakers. Unlike competing systems, Autopilot disengages rather than allowing drivers to adjust their steering, which the agency says “discourages” them from staying involved in the task of driving.
By comparing Tesla’s design choices to those of its L2 competitors, the agency concluded that the company was an industry outlier in its approach to the technology, pairing a weak driver engagement system with Autopilot’s permissive operating capabilities.
The NHTSA also argues that the brand name “Autopilot” is itself misleading, conjuring the idea that drivers are not in control. While other companies use some variation of “assist,” “sense,” or “team,” Tesla’s naming lures drivers into thinking the system is more capable than it is. The attorney general of California and the state’s Department of Motor Vehicles are both investigating Tesla over misleading branding and marketing. Separately, the NHTSA acknowledged that its investigation may be incomplete because of “gaps” in Tesla’s telemetry data, which could mean there are many more crashes involving Autopilot and FSD than the agency was able to identify.
In response to the investigation, Tesla issued a voluntary recall late last year and pushed an over-the-air software update that added more warnings to Autopilot. After several safety experts said the update was insufficient and still allowed misuse, the NHTSA announced today that it is opening a new inquiry into the recall.
The findings cut against Musk’s assertions that Tesla is an artificial intelligence company on the verge of releasing a fully autonomous vehicle for personal use. The company plans to unveil a robotaxi later this year, which is meant to usher in a new chapter in Tesla’s history. During this week’s first-quarter earnings call, Musk insisted that his vehicles are safer than human-driven cars.
“I think it’s difficult to ignore if you have, at scale, a statistically significant amount of data that shows conclusively that the autonomous car has, let’s say, half the accident rate of a human-driven car,” Musk said, adding, “Because stopping autonomy at that point means killing people.”