The National Highway Traffic Safety Administration (NHTSA) has begun investigating the safety of Tesla Inc.'s driver assistance software, known as Autopilot. The investigation was prompted by 11 crashes since 2018 involving Teslas that had the Autopilot feature engaged at the time of the accident.
NHTSA has been studying the Autopilot systems of more than 700,000 vehicles spanning model years 2014 through 2021. The study will attempt to determine whether Autopilot is functioning safely and effectively, while also examining other variables that may have contributed to malfunctions, such as "first responder vehicle lights, flares, an illuminated arrow board, and road cones."
The NHTSA has had growing concerns over the safety of advanced driver assistance software and now requires companies to report crashes in which such software was in use. Tesla has stated that drivers are safer with Autopilot activated, as it assists with steering and with maintaining a safe distance from other vehicles on the road.
Some have attributed the accidents to the name Autopilot itself, believing it gives a false sense of what the software is designed to do. The argument is that some drivers treat Autopilot as a fully self-driving system and assume the car no longer requires human intervention. Tesla has rejected such criticism, noting that using the feature this way goes against the company's instructions.