If science fiction and the entertainment world had their way, we would be riding in flying cars and self-driving vehicles by now. We're not, for a number of reasons, one of them being that features such as Autopilot in Tesla vehicles are widely seen as unsafe. A Tesla executive recently defended the feature to two senators pushing to regulate the industry.
Letter from the Senators
There have been several crashes involving the Autopilot feature, including an accident last year in Spring, Texas, that killed two men. It was initially believed that no one was in the driver's seat, even though Tesla urges owners to stay behind the wheel and keep their eyes on the road.
However, data released by the National Transportation Safety Board last fall showed that a person was in the driver's seat, buckled in, at the time of the crash. Data from moments earlier had placed that person in the back seat. The board was unable to determine why he moved to the front seat or whether Autopilot was running.
A National Highway Traffic Safety Administration crash team investigated the crash as well, along with local law enforcement.
Senators Ed Markey and Richard Blumenthal sent a letter, before this data was released, to NHTSA Acting Administrator Steven Cliff. The letter asked the agency to determine what happened in the Texas crash to "better inform" legislation concerning Tesla's Autopilot feature and similar systems.
“We strongly urge you to conduct a complete investigation into Saturday’s fatal vehicle crash and develop recommendations for improving automated driving and driver assistance systems,” wrote Markey and Blumenthal. “We look forward to working with you and the NTSB to implement policy changes that stop these preventable deaths from occurring and save lives.”
It was reported that just before the crash, the wives of the two deceased men – Everette Talbot, 69, and William Varner, 59 – were discussing the Tesla's Autopilot feature. Initially, investigators were "100 percent certain" that neither of the victims was in the driver's seat when the vehicle crashed.
Around the time of the crash, NHTSA acknowledged that it had investigated 27 Tesla crashes, 23 of which were still open. The agency has a reputation for being soft on the Autopilot feature. The NTSB, however, has blamed Tesla in the past, particularly for the death of an Apple employee who did not appear to have his eyes on the road at the time of his crash in California.
After the crash, Tesla CEO Elon Musk tweeted that “data logs recovered so far” indicated that the Autopilot feature was not engaged at the time of the Texas crash. He also noted that the car owner had not purchased the “Full Self-Driving” option.
Tesla Response to Senators
Tesla's senior director of public policy, Rohan Patel, responded to the letter from Senators Markey and Blumenthal – one year later – writing: "Tesla's Autopilot and FSD Capability features enhance the ability of our customes [sic] to drive safer than the average driver in the U.S."
He went on to describe Autopilot and Full Self-Driving as Level 2 systems “which require the constant monitoring and attention of the driver,” and said the features are “capable of performing some but not all of the Dynamic Driving Tasks (DDT) that can be performed by human drivers.”
Yet that is the crux of the issue: the feature is called "Full Self-Driving," yet it is described as something that requires the driver's constant monitoring and attention.
Patel backed up his argument with the following statistics: "For example, in the fourth quarter of 2021, Tesla recorded one crash for every 4.31 million miles driven in which our drivers were using Autopilot technology, compared to the NHTSA most recent data, which shows an automobile crash occurs every 484,000 miles."
Tesla regularly releases data along the lines of Patel's figures. Experts don't put much stock in those numbers, since Autopilot is mostly used on highways, making them a poor comparison to national data that covers a far wider variety of driving conditions.
Additionally, safety experts and regulators have long urged Tesla to add more robust driver monitoring to its vehicles. Musk, meanwhile, has ignored his own engineers' suggestions to do so, calling such monitoring "ineffective." He has, however, admitted the crashes are due to complacency.
“This is just more evasion and deflection from Tesla,” said the senators in a joint statement after Patel addressed them. “Despite its troubling safety track record and deadly crashes, the company seemingly wants to carry on with business as usual. It’s long past time Tesla got the message: follow the law and prioritize safety.”
Despite the words of Musk and Patel, the company has acknowledged problems in the past. Just last fall, Tesla pulled an update to its Full Self-Driving feature over unspecified issues, only days after its release and just a few days before the NTSB data showed a driver in the seat at the time of the Texas crash.