If Tesla Motors has a single greatest asset, it’s not a factory or a battery chemistry but the immense public trust that its CEO Elon Musk inspires. Faith in Musk’s abilities and good intentions underlies Tesla’s passionate fan base, perceived technology leadership and high-flying valuation, and excuses its multitude of shortcomings in quality and customer service. Nothing exemplifies the power of this faith like Tesla’s ability to convince the public to trust its Autopilot system to navigate them through a landscape that kills more than 30,000 Americans each year. So as Autopilot-related crashes pile up and Tesla belatedly reveals that one of its customers died while using the system, it’s not surprising that faith in Musk and Tesla is taking a hit.
In my latest post at The Daily Beast, I teamed up with Nick Lum to investigate why so many Tesla owners appear to believe that Autopilot is more capable than it actually is, and our findings are deeply troubling. From the very first announcement, Musk and Tesla have misrepresented Autopilot’s capabilities in hopes of maintaining Tesla’s image as Silicon Valley’s most high-tech auto play in the face of Google’s far more serious autonomous drive program. Now, even after the first fatal crash, they are trying to sustain those misperceptions by touting junk statistics that purport to show a safety record superior to that of the average human driver. As Nick and I discovered, the deeply disingenuous nature of Tesla’s representations erodes Tesla’s and Musk’s credibility on a fundamental level: either they do not understand the auto safety data or they are intentionally misleading the public. Either way, they refuse to acknowledge that incompetence or deception has put the public at risk, and they continue to stand by safety claims that don’t hold up to even the slightest critical analysis.
As it turns out, there’s almost no end to the ways in which Tesla and Musk’s claims about Autopilot safety fall apart under scrutiny. In addition to the analysis presented in The Daily Beast, here are a few more ways to think critically about Tesla’s Autopilot claims.
Two weeks after the May 7 crash that killed Joshua Brown, and just days after its latest stock offering, Tesla’s director of Autopilot programs, Sterling Anderson, claimed that Tesla customers had driven “~100 million miles” with Autopilot activated since the controversial “Autosteer” function was rolled out in firmware update 7.0. Anderson made no mention of the fact that one of Tesla’s customers had died while Autopilot was activated, even though Tesla had already notified NHTSA of the fatal Brown crash. This lack of disclosure is troubling even though it took place after Tesla’s stock sale and thus doesn’t necessarily bear on the SEC’s ongoing investigation. In light of what we now know, Anderson’s presentation raises a number of deeply concerning questions about Tesla’s Autopilot safety claims.
After Brown’s death became public, more than a month after Anderson’s presentation, Tesla claimed in a blog post that its customers had driven 130 million miles with Autopilot activated, and argued that this record compares favorably to the American average of one fatality every 94 million miles. But if Brown died two weeks before Anderson claimed Tesla owners had driven 100 million miles, the reality is that the first Autopilot death occurred somewhere much closer to that 94 million mile human driver average. The mileage at which the fatal crash actually occurred (not the mileage at which Tesla finally disclosed that it had occurred) is clearly the more accurate measure of Autopilot’s safety record.
Only Tesla has the statistics to show exactly how many miles Autopilot had driven when Brown died, but using the numbers that Anderson gave, it’s possible to calculate the range of possibilities. At the high end, it seems clear that Autopilot had not driven more than 100 million miles, since Anderson claimed that number two weeks after the crash. At the low end, it seems possible that Brown’s death occurred well below the American average of 94 million miles per fatality. Anderson claimed Tesla was collecting 2.6 million miles of Autopilot data per day, which indicates that Brown’s death 17 days earlier may have taken place at roughly 56 million miles driven by Autopilot (100 million minus 17 days at 2.6 million miles per day). Of course, this low-end estimate assumes that the rate of data collection was constant, which was likely not the case, and Anderson’s ambiguous wording raises questions about the kind of data collection he was referring to in the 2.6 million mile per day claim. Still, the limited data that Tesla has presented indicates that Brown’s death took place somewhere well below 94 million miles driven, calling into question Tesla’s claim that Autopilot is safer than the average human driver.
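To make that back-of-the-envelope arithmetic concrete, here is a minimal sketch in Python. It uses only the figures cited above (Anderson’s ~100 million mile claim, his 2.6 million miles per day collection rate, the 17 days between the crash and his presentation, and the roughly 94 million miles per fatality US average); the constant daily collection rate is my simplifying assumption, not Tesla’s methodology.

```python
# Back-of-the-envelope estimate of Autopilot miles at the time of the
# fatal Brown crash, using only the publicly cited figures. The constant
# daily collection rate is a simplifying assumption, not Tesla data.

MILES_AT_PRESENTATION = 100e6  # Anderson's "~100 million miles" claim
MILES_PER_DAY = 2.6e6          # Anderson's claimed daily collection rate
DAYS_SINCE_CRASH = 17          # May 7 crash to Anderson's presentation
US_MILES_PER_FATALITY = 94e6   # US average: one death per ~94M miles

# Linear back-extrapolation from the presentation date to the crash date.
miles_at_crash = MILES_AT_PRESENTATION - MILES_PER_DAY * DAYS_SINCE_CRASH
print(f"Estimated Autopilot miles at crash: {miles_at_crash / 1e6:.1f}M")
# -> 55.8M, well below the ~94M mile human average

print(f"Fraction of the human average: {miles_at_crash / US_MILES_PER_FATALITY:.0%}")
# -> about 59%: on this estimate, the first Autopilot fatality arrived at
#    roughly three-fifths of the mileage the US driver average predicts
```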
Anderson also revealed in the same speech that the entire category of “customer miles driven with Autopilot activated” may not represent what the public thinks it does. Here’s what he said:
“In the last 7 months, since the October release of Autosteer, 100 million miles have been driven by our customers with Autopilot, which includes Traffic Aware Cruise Control or Autosteer.” [emphasis added]
This one word, “or,” is a critical revelation that casts light on the heart of Tesla’s deceptive communication about Autopilot. Anderson measures “Autopilot activated” miles starting with the release of Autosteer, the moment when Autopilot burst into the public consciousness in a flood of news items and videos of Tesla drivers waving their hands while the car steered itself. For the purposes of public discourse, Autosteer is Autopilot. But as Anderson clarifies, Autopilot is actually two systems: the controversial Autosteer system sits on top of a Traffic Aware Cruise Control (TACC) system that was deployed to Tesla vehicles long before Autosteer. And when Tesla claims that its customers have driven 130 million miles “with Autopilot activated,” it is referring to mileage in which either TACC or Autosteer was activated.
This is hugely important because TACC is not a particularly controversial system, and is definitely not what the public thinks of when it thinks of “Autopilot.” TACC systems, which augment cruise control by maintaining safe following distances and matching speeds based on surrounding traffic, have been around since Mitsubishi first offered a LIDAR-based version in the Japanese market in 1992. They have been a mainstay of the luxury car market for decades and are now pushing down into such non-premium vehicles as the Subaru Impreza. Nobody thinks of TACC as an “autonomous” system; it fits instead into the well-established category of Advanced Driver Assistance Systems (ADAS).
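For readers unfamiliar with the category, the sketch below illustrates the generic control idea behind a TACC-style system: hold the driver’s set speed unless a lead vehicle is too close, then slow to maintain a following gap. This is a toy illustration of the ADAS concept, not Tesla’s or any automaker’s actual implementation; the function name and constants are invented for the example. Note that nothing in it touches the steering, which is precisely what separates TACC from Autosteer.

```python
from typing import Optional

def tacc_target_speed(set_speed_mps: float,
                      lead_distance_m: Optional[float],
                      lead_speed_mps: Optional[float],
                      time_gap_s: float = 2.0,
                      min_gap_m: float = 5.0) -> float:
    """Speed a generic TACC-style controller would command (toy model)."""
    if lead_distance_m is None or lead_speed_mps is None:
        return set_speed_mps          # no lead car: plain cruise control
    # Keep a fixed time gap, so the desired distance grows with speed.
    desired_gap_m = min_gap_m + time_gap_s * lead_speed_mps
    if lead_distance_m >= desired_gap_m:
        return set_speed_mps          # gap is comfortable: hold set speed
    # Too close: drop below the lead car's speed in proportion to the
    # shortfall so the controller opens the gap back up smoothly.
    shortfall = (desired_gap_m - lead_distance_m) / desired_gap_m
    return max(0.0, lead_speed_mps * (1.0 - 0.5 * shortfall))

# Example: cruise set to 30 m/s, lead car 25 m ahead travelling 20 m/s.
print(tacc_target_speed(30.0, 25.0, 20.0))  # -> ~15.6 m/s, backing off
```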
If Tesla is counting miles driven with just TACC activated as “Autopilot” miles, lumped in with miles driven with both TACC and the more controversial Autosteer function activated, it is once again exploiting the public’s poor understanding of what Autopilot is to create a misperception of its safety record. Tesla should clarify what percentage of its various Autopilot mileage claims were driven with just TACC activated and what percentage were driven with “full Autopilot” (TACC and Autosteer) activated. It seems entirely likely that a significant percentage of these miles were driven with just TACC, which would further erode Autosteer’s safety record below the human-driven average. In order to actually address the controversy that threatens to engulf it, Tesla should clarify exactly how safe Autosteer is by stating how many miles customers had driven with Autosteer activated at the time that Brown died. This is the answer the public is looking for, and it’s the only way to begin clarifying how safe the controversial aspect of Autopilot really is compared to human drivers.
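To see how much that “or” matters, consider the arithmetic. The sketch below is a hypothetical illustration, since Tesla has not disclosed the TACC/Autosteer split; the share values are invented, and the starting mileage is the low-end estimate from the earlier sketch.

```python
# Hypothetical illustration of how the undisclosed TACC-only share would
# shrink the miles attributable to Autosteer, and with them Autosteer's
# implied safety record. The tacc_share values are invented examples.

MILES_AT_CRASH_EST = 55.8e6    # low-end estimate from the earlier sketch
US_MILES_PER_FATALITY = 94e6   # US average: one death per ~94M miles

for tacc_share in (0.0, 0.25, 0.50, 0.75):
    autosteer_miles = MILES_AT_CRASH_EST * (1.0 - tacc_share)
    ratio = autosteer_miles / US_MILES_PER_FATALITY
    print(f"TACC-only share {tacc_share:.0%}: first fatality at "
          f"{autosteer_miles / 1e6:.0f}M Autosteer miles "
          f"({ratio:.0%} of the human average)")

# Even with no TACC-only miles the record trails the human average;
# every TACC-only mile pushes Autosteer's record further below it.
```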
If this answer ever comes, it’s highly unlikely to show that “full Autopilot” is as safe as the US human driver average, let alone as safe as Tesla has claimed. Given Tesla’s record of defensiveness when the media calls out its misrepresentations, and how committed it seems to its misleading claims about Autopilot, it’s difficult to imagine Tesla making this important, but damaging, clarification. But if Musk and Tesla truly believe in the benefits of autonomous technology, they have to recognize that its acceptance will be a long-term effort that cannot simply be pushed through with cool videos and bad statistics. Creating public trust in Autopilot, and in autonomous drive technology more broadly, requires accepting the public’s legitimate concerns and consistently communicating accurate information in response to them. By creating and perpetuating misleading perceptions of Autopilot, Musk and Tesla are hurting both the technology they claim to be championing and their all-important credibility. It’s in everyone’s interest (Musk’s, Tesla’s and the public’s) that the gap between perceptions of Autopilot and its reality be closed.