Tesla’s no-good, rotten couple of weeks sees more fingers pointed at Autopilot

Two more Model X crashes raise fresh questions about the safety of Tesla's driver-assist system.

It has been a rough couple of weeks for Tesla. Until now, the electric vehicle maker has been a media darling, wowing us with EVs that are credible alternatives to the traditional combustion-powered car or SUV, and finally attractive even to some drivers for whom not being able to set off on a cross-country road trip at a moment's notice is a deal-breaker.

It all started at the end of June, when Tesla revealed in a blog post that the National Highway Traffic Safety Administration had begun an investigation into the company's Autopilot system following a fatal crash in Florida in May. Since then, the Detroit Free Press has reported on two further Tesla crashes, both of which have called the safety of Autopilot into question: a Model X SUV that rolled over on the Pennsylvania Turnpike on July 1, followed by another Model X crash on July 10. Unlike the May crash, neither of these later incidents involved fatalities.

Tesla has said that it has no data indicating that Autopilot was a factor in the July 1 crash, telling the Detroit Free Press in a statement that "We received an automated alert from this vehicle on July 1 indicating air bag deployment, but logs containing detailed information on the state of the vehicle controls at the time of the collision were never received. This is consistent with damage of the severity reported in the press, which can cause the antenna to fail."

Yet the most recent incident, first made public on the Tesla Motors Club forums, directly implicates the car's self-driving mode. According to the thread (posted by the owner's friend), the Model X was in Autopilot mode at 55 to 60mph (89 to 97km/h) at night when the car veered off the road and hit a series of wooden fence posts, tearing off the front passenger-side wheel in the process. (The Detroit Free Press also reports that the driver told the Montana Highway Patrol that he had activated Autopilot.) Tesla is believed to still be investigating this crash.

To cap it all, we discovered yesterday that the Securities and Exchange Commission is also investigating Tesla to determine whether the company ought to have notified shareholders of the NHTSA investigation.

As we (and others) have noted, much of Tesla's Autopilot is actually made from off-the-shelf components, and plenty of other car companies will sell you a car capable of level-2 self-driving. So why the sudden spate of reports in the news about crashed Teslas and not wrecked Volvo XC90s, Audi Q7s, and so on?

For one thing, user perception of the different systems may well be a factor. Calling the system Autopilot (even while describing it as a beta) implies a different level of functionality than something long-winded and anodyne like Adaptive Cruise Control with Lane Keep Assist. Indeed, engineers working on piloted-driving technology for other OEMs have told Ars that the name "Autopilot" is misleading for a level-2 self-driving system. Whatever the realities of aviation autopilot systems, most laypeople presume the name means a fully autonomous, hands-off system that requires little attention from the human operator.

The implementation is subtly different, too. In every other level-2 self-driving car we have tested, 15 seconds is about all the hands-free time you'll get (with the exception of various traffic jam assists, which disengage once you're traveling faster than 37mph (60km/h)). Not so Tesla. In a statement to Ars, the company told us that the permissible hands-free time interval is variable. "When we see lateral acceleration, tricky road conditions, or high curvature in a road, Teslas will increase those hands on checks. If there’s no response from the driver, we begin a rapid escalation of these alerts, which begin in the instrument cluster in front of the driver, and then audible chimes which mute any audio in the car. If the driver still doesn’t respond, we apply the flashers and safely bring vehicle to a stop," it said.
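As Tesla describes it, this amounts to a tiered escalation policy: shorten the hands-on check interval when conditions look risky, then step up the warnings until the driver responds or the car stops itself. Here is a minimal sketch of what such logic might look like; the thresholds, stage timings, and names are our own illustrative assumptions, not Tesla's actual implementation.

```python
from enum import Enum


class Alert(Enum):
    NONE = 0      # driver deemed attentive
    VISUAL = 1    # warning in the instrument cluster
    AUDIBLE = 2   # chimes, with cabin audio muted
    STOP = 3      # hazard flashers on, car brought safely to a stop


# Illustrative values only -- not Tesla's real thresholds.
BASE_CHECK_INTERVAL_S = 240.0   # owners report roughly four hands-off minutes
RISKY_CHECK_INTERVAL_S = 15.0   # tightened interval in tricky conditions


def check_interval(lateral_accel_g: float, curvature: float,
                   tricky_road: bool) -> float:
    """Shorten the hands-on check interval when conditions look risky."""
    if tricky_road or lateral_accel_g > 0.3 or curvature > 0.01:
        return RISKY_CHECK_INTERVAL_S
    return BASE_CHECK_INTERVAL_S


def escalate(seconds_since_hands_on: float, interval: float) -> Alert:
    """Step up the warnings once a hands-on check has been missed."""
    overdue = seconds_since_hands_on - interval
    if overdue <= 0:
        return Alert.NONE
    if overdue < 10:       # first, a silent visual warning
        return Alert.VISUAL
    if overdue < 20:       # then chimes that mute the stereo
        return Alert.AUDIBLE
    return Alert.STOP      # finally, flashers and a controlled stop
```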

Indeed, anecdotes from owners suggest that at least four minutes of hands-off driving is possible, even before you try to fool the system with a rubber band or a knee wedged against the wheel. And the more time that Tesla's fleet of EVs spends in Autopilot, the more data Tesla has to draw on for future systems, even if the accumulated mileage still has a long way to go before there is sufficient statistical power to demonstrate that Autopilot is safer than human drivers.
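Some rough numbers show why. Tesla has cited a US average of about one fatality per 94 million vehicle miles; a back-of-the-envelope power calculation (our own illustration, with an assumed safety target and conventional test parameters) suggests that demonstrating even a halving of that fatality rate would take on the order of two billion Autopilot miles:

```python
from math import sqrt

# Baseline: ~1 US traffic fatality per 94 million vehicle miles (the
# comparison figure Tesla itself has cited). Suppose Autopilot were twice
# as safe -- how many miles would it take to demonstrate that?
r0 = 1 / 94e6        # baseline fatality rate per mile
r1 = r0 / 2          # hypothesized Autopilot rate (our assumption)

z_alpha = 1.645      # one-sided 5% significance level
z_beta = 0.84        # 80% statistical power

# Normal approximation for a one-sample Poisson rate test.
miles = ((z_alpha * sqrt(r0) + z_beta * sqrt(r1)) / (r0 - r1)) ** 2
print(f"~{miles / 1e9:.1f} billion miles needed")   # roughly 1.9 billion
```

Tesla's June blog post cited just over 130 million miles driven on Autopilot, so the gap between anecdote and statistical proof remains substantial.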

But despite Tesla's horrid fortnight, we don't believe there's a fundamental problem with the technology. Over at The Drive, Alex Roy (who has logged more miles in Autopilot than most of us) has dissected the fatal Florida crash, pointing the finger at the driver. And as we keep pointing out, no matter how good a level-2 self-driving car is, the human driver behind the wheel is the person licensed to operate it, and it remains that person's job to be alert and on the lookout. These assists are great at reducing driver fatigue and providing a safety net, but only when used properly.

Listing image by Jonathan Gitlin
