A federal safety agency recently told car manufacturers to begin reporting and tracking crashes involving cars and trucks that use “autopilot” features. The move is a sign of growing concern about the safety of semi-autonomous driving.

“Autopilot requires the driver to remain in the driver’s seat, pay attention to the road and traffic conditions, and be prepared to intervene if a crash is imminent,” Alain L. Kornhauser, the director of the transportation program at Princeton University, told Lifewire in an email interview. “It is not a ‘crash prevention’ device. It isn’t even an ‘Automated Emergency Braking’ device or system.”

Tesla Crashes Under Scrutiny

According to the National Highway Traffic Safety Administration (NHTSA), the new federal rules require automakers to report serious crashes within one day of learning about them. The agency defines serious crashes as those in which a person is killed or taken to a hospital, a vehicle has to be towed away, or airbags are deployed.

“NHTSA’s core mission is safety. By mandating crash reporting, the agency will have access to critical data that will help quickly identify safety issues that could emerge in these automated systems,” Steven Cliff, the head of the NHTSA, said in a news release. “In fact, gathering data will help instill public confidence that the federal government is closely overseeing the safety of automated vehicles.”

The NHTSA recently said it was investigating 30 Tesla crashes involving 10 deaths since 2016 in which advanced driver assistance systems were suspected of being in use. But Kornhauser said that the Tesla Autopilot feature is “very” safe. “As with any product, if it is not used properly, it may become unsafe,” he added. “A ‘55 Chevy is unsafe if driven at speeds way over the speed limit or if you drive on the wrong side of the road.”

The term “autopilot” that Tesla uses in its marketing may confuse drivers into thinking they can take a hands-off approach, Bryant Walker Smith, a law professor at the University of South Carolina specializing in auto safety, said in an email interview with Lifewire. “Like any driver assistance system, Tesla’s version works unless and until it doesn’t,” he added. “This is why driver vigilance is so important—and why many of us are particularly concerned about Tesla’s approach.”

High Tech Safety Improvements

Kornhauser said that automakers could do things to make driving tech even safer than it already is. Improvements include boosting Automated Emergency Braking Systems to reduce head-on collisions, installing speed limiters that don’t allow excessive speeds, and adding devices that prevent users from driving when their blood-alcohol level is above the legal limit.

Using artificial intelligence is one way to make cars safer, Ian Ferguson, a vice president at Lynx Software Technologies, which provides safety and security solutions for automotive and other high-risk environments, said in an email interview. “When we start driving, we lack experience,” Ferguson said. “We make mistakes. With AI, a new car on the road is infused with hundreds of thousands of hours of experience, gathered from the data from millions of vehicles.”

AI also can help people get more comfortable with autonomous vehicles, Ferguson said. A survey Lynx conducted in May found that many users are still nervous about autopilot: 80% of consumers trust human pilots over autonomous ones right now, with 65% citing the lack of testing as a roadblock to adopting self-driving technology.

But the biggest problem drivers face may be themselves. “Distracted driving and other forms of irresponsible driving remain an enormous problem on our roads,” Smith said. “In the latest vehicles with advanced driver assistance systems, in the oldest vehicles without any of these features, and in all the vehicles in between.”