Does Autosteer Actually Deserve Credit For a 40% Reduction In Tesla Crashes?

Tesla’s high-flying image, which had been moving from strength to strength since early 2013, hit its biggest speed bump last year when its Autopilot advanced driver assist system (ADAS) came under scrutiny in the wake of Joshua Brown’s death. Suddenly Tesla’s pioneering Autopilot system went from being one of the company’s key strengths to being a serious liability that raised troubling questions about the company’s safety culture. Tesla CEO Elon Musk tried to swat away these concerns with what proved to be a set of highly misleading statistics about Autopilot safety, but the issue was not laid to rest until NHTSA closed its investigation with a report that seemed to exonerate Autopilot as a safety risk. With a single sentence, NHTSA shut down the most dangerous PR problem in Tesla’s history:

The data show that the Tesla vehicles’ crash rate dropped by almost 40 percent after Autosteer installation.

Because NHTSA is the federal authority on automotive safety, with unparalleled resources to assess and investigate safety risks, this single sentence effectively shut down public concerns about Autopilot’s safety. In a terse statement on its company blog, Tesla noted

we appreciate the thoroughness of NHTSA’s report and its conclusion

But how thorough was NHTSA’s investigation, and how accurate was its conclusion? As it turns out, the questions around Autopilot’s safety may not be as settled as Tesla and NHTSA would have you believe.

[Continue Reading]

CA DMV Report Sheds New Light On Misleading Tesla Autonomous Drive Video

On October 20th of last year Tesla Motors published an official blog post announcing an important development:

“as of today, all Tesla vehicles produced in our factory – including Model 3 – will have the hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver.”

Tesla backed up this bold claim with a slick video, set to The Rolling Stones’ “Paint It Black,” which depicted one of the company’s Model X SUVs driving itself from a home in the Bay Area to the company’s headquarters near the Stanford University campus, apparently with no driver input. In a tweet linking to the video, Tesla’s CEO Elon Musk described this demonstration in no uncertain terms:

“Tesla drives itself (no human input at all) thru urban streets to highway to streets, then finds a parking spot”

After months of negative news about Tesla’s Autopilot in the wake of a deadly crash that the system had failed to prevent, the video prompted a return to the fawning, uncritical media coverage that characterized the initial launch of Autopilot. And by advertising a new sensor suite that made all existing Teslas obsolete, the company was able to bolster demand for its cars even as it discontinued the discounts that had driven sales in the third quarter. Like so many of Tesla’s publicity stunts, the video was a masterpiece of viral marketing that drove the company’s image to new heights… but like so many of Tesla’s publicity stunts it also turns out to have been extremely misleading.

[Continue Reading]

How Tesla Tries To Keep The Media On Autopilot


Today I appeared on Bloomberg television to discuss Tesla’s latest earnings, as I have after the electric car maker’s last few quarterly reports, but this time things were somewhat different. Minutes before we went live, the show’s host Emily Chang told me that she would be asking me about a correction that Tesla had requested to my most recent Bloomberg View post about the new Autopilot 2.0 hardware suite announcement. My initial draft had said that “several” people had died in Teslas with Autopilot enabled, and at the request of a Tesla representative my editors and I agreed to clarify that only two deaths were tied to the controversial driver assist system. I am always happy to make factual corrections to my writing, but because I had limited time to explain the complex circumstances around this particular issue I thought I would write a post laying out the particulars of this case.

[Continue Reading]

PSA: Google Translate sucks at Tesla accident analysis


Last Wednesday, a Tesla Model S rear-ended a bus on the German Autobahn. Usually, no one would take note, but because a Tesla was involved, the crash created, and still creates, headlines. Naturally, most of the original news reports are in German. The urge to use Google Translate is great. Pro tip: Don’t do it. Staying away from armchair accident analysis via Google Translate will help you avoid stupid mistakes such as that of one outlet which, after running a German police report through the online translator, came to the following misguided conclusion:

“But, basically, it appears that a bus simply sideswiped a car, while the latter happened to be operating in a semi-autonomous capacity.”

Not true at all. As a public service, the Dailykanban provides a true translation of the police report. See for yourself whether the bus sideswiped the Model S. [Continue Reading]

Elon Take The Wheel

Photo courtesy of Forbes

If Tesla Motors has a single greatest asset, it’s not a factory or battery chemistry but the immense public trust that its CEO Elon Musk inspires. Faith in Musk’s abilities and good intentions underlies Tesla’s passionate fan base, perceived technology leadership and high-flying valuation, and excuses its multitude of shortcomings in quality and customer service. Nothing exemplifies the power of this faith like Tesla’s ability to convince the public to trust its Autopilot system to navigate them through a landscape that kills more than 30,000 Americans each year. So as Autopilot-related crashes begin to pile up and Tesla belatedly reveals that one of its customers died while using the system, it’s not surprising that faith in Musk and Tesla is taking a hit.

In my latest post at The Daily Beast, I teamed up with Nick Lum to investigate why so many Tesla owners appear to believe that Autopilot is more capable than it actually is, and our findings are deeply troubling. From the very first announcement, Musk and Tesla have misrepresented Autopilot’s capabilities in hopes of maintaining Tesla’s image as Silicon Valley’s most high-tech auto play in the face of Google’s far more serious autonomous drive program. Now, even after the first fatal crash, they are trying to maintain misperceptions of Autopilot’s capabilities by touting junk statistics that purport to demonstrate an Autopilot safety record superior to that of the average human driver. As Nick and I discovered, the deeply disingenuous nature of Tesla’s representations erodes Tesla and Musk’s credibility on a fundamental level: either they do not understand the auto safety data or they are intentionally misleading the public. Either way, they refuse to acknowledge that their incompetence or deception has put the public at risk, and they continue to stand by safety claims that don’t hold up to even the slightest critical analysis.

As it turns out, there’s almost no end to the ways in which Tesla and Musk’s claims about Autopilot safety fall apart under scrutiny. In addition to the analysis presented in The Daily Beast, here are a few more ways in which to think critically about Tesla’s Autopilot claims.

[Continue Reading]

Tesla’s game-changing, lane-changing Autopilot? “Currently, there is no radar on the market that can achieve that,” Toyota engineer says

1978 sensor technology, now in the Model D

Two months ago, Tesla launched its Model D. Not quite a new car, not even a refresh, but the web went wild. The Model D sports an extra electric motor and a sensor package. From any other automaker, such non-news would not even elicit a yawn from the press, save for a few snarky blogs that would torture the maker for not delivering the software for the hardware. Come to think of it, no other automaker would dare to deliver hardware sans software, for fear of getting their derrieres handed to them. Tesla is unlike any automaker. As a Silicon Valley company, Tesla has marketing rights to vaporware. The Model D was feted like the second coming of the Model T, and it was pronounced as disruptive to the industry as the mass-produced Ford, if not more so.

Musk’s acolytes expected a self-driving car from Tesla, and they were given what they wanted to hear: The Autopilot. The official feature list of The Autopilot is surprisingly feature-less. Currently, it offers exactly nothing, except for some “exciting long-term possibilities. Imagine having …” Then follows a list of imaginary stuff that would have made Ford/Microsoft’s derided Sync system look worse. And even those listed exciting long-term possibilities are completely devoid of anything even remotely resembling an autopilot.

The media came to Musk’s assistance and made the missing Autopilot up themselves. The members of the gadget press quickly reduced The Autopilot to a “smart lane-change system which will automatically move across a lane when you hit the blinker,” as Slashgear wrote. Not quite an Autopilot, but hey, good enough.

Trouble is: You won’t find the game-changing lane-changer in Tesla’s Model D description, not even under “long-term possibilities.” Smart move on Tesla’s part, because it’s not going to happen: the hardware package renders the presumptive Autopilot blind.

[Continue Reading]