Waymo Retires Iconic “Firefly” Vehicles

Google’s self-driving car company Waymo is retiring its iconic “Firefly” self-driving vehicles from its testing fleets after three years in service. The Fireflies, widely known as “the koala cars,” are being replaced by Waymo’s expanding fleet of Chrysler Pacifica plug-in hybrid autonomous minivans. The transition comes as Waymo moves toward commercial availability, including an “early rider program” in Phoenix, Arizona.

[Continue Reading]

Does Autosteer Actually Deserve Credit For a 40% Reduction In Tesla Crashes?

Tesla’s high-flying image, which had been moving from strength to strength since early 2013, hit its biggest speed bump last year when its semi-autonomous Autopilot Advanced Driver Assistance System (ADAS) came under scrutiny in the wake of Joshua Brown’s death. Suddenly Tesla’s pioneering Autopilot system went from being one of the company’s key strengths to being a serious liability that raised troubling questions about the company’s safety culture. Tesla CEO Elon Musk tried to swat away these concerns with what proved to be a set of highly misleading statistics about Autopilot safety, but the issue was not laid to rest until NHTSA closed its investigation with a report that seemed to exonerate Autopilot as a safety risk. With a single sentence, NHTSA shut down the most dangerous PR problem in Tesla’s history:

The data show that the Tesla vehicles crash rate dropped by almost 40 percent after Autosteer installation.

Because NHTSA is the federal authority on automotive safety, with unparalleled resources to assess and investigate safety risks, this single sentence effectively laid to rest public concerns about Autopilot’s safety. In a terse statement on its company blog, Tesla noted:

we appreciate the thoroughness of NHTSA’s report and its conclusion

But how thorough was NHTSA’s investigation, and how accurate was its conclusion? As it turns out, the questions around Autopilot’s safety may not be as settled as Tesla and NHTSA would have you believe.

[Continue Reading]

CA DMV Report Sheds New Light On Misleading Tesla Autonomous Drive Video

On October 20th of last year Tesla Motors published an official blog post announcing an important development:

“as of today, all Tesla vehicles produced in our factory – including Model 3 – will have the hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver.”

Tesla backed up this bold claim with a slick video, set to The Rolling Stones’ “Paint It Black,” which depicted one of the company’s Model X SUVs driving itself from a home in the Bay Area to the company’s headquarters near the Stanford University campus, apparently with no driver input. In a tweet linking to the video, Tesla’s CEO Elon Musk described this demonstration in no uncertain terms:

“Tesla drives itself (no human input at all) thru urban streets to highway to streets, then finds a parking spot”

After months of negative news about Tesla’s Autopilot in the wake of a deadly crash that the system had failed to prevent, the video prompted a return to the fawning, uncritical media coverage that characterized the initial launch of Autopilot. And by advertising a new sensor suite that made all existing Teslas obsolete, the company was able to bolster demand for its cars even as it discontinued the discounts that had driven sales in the third quarter. Like so many of Tesla’s publicity stunts, the video was a masterpiece of viral marketing that drove the company’s image to new heights… but, like so many of Tesla’s publicity stunts, it also turns out to have been extremely misleading.

[Continue Reading]

Elon Take The Wheel

Elon Musk (photo courtesy Forbes)

If Tesla Motors has a single greatest asset, it’s not a factory or battery chemistry but the immense public trust that its CEO Elon Musk inspires. Faith in Musk’s abilities and good intentions underlies Tesla’s passionate fan base, perceived technology leadership, and high-flying valuation, and excuses its multitude of shortcomings in quality and customer service. Nothing exemplifies the power of this faith like Tesla’s ability to convince the public to trust its Autopilot system to navigate them through a landscape that kills more than 30,000 Americans each year. So as the number of Autopilot-related crashes begins to pile up and Tesla belatedly reveals that one of its customers died while using the system, it’s not surprising that faith in Musk and Tesla is taking a hit.

In my latest post at The Daily Beast, I teamed up with Nick Lum to investigate why so many Tesla owners appear to believe that Autopilot is more capable than it actually is, and our findings are deeply troubling. From the very first announcement, Musk and Tesla have misrepresented Autopilot’s capabilities in hopes of maintaining Tesla’s image as Silicon Valley’s most high-tech auto play in the face of Google’s far more serious autonomous drive program. Now, even after the first fatal crash, they are trying to maintain these misperceptions by touting junk statistics that purport to show an Autopilot safety record superior to that of the average human driver. As Nick and I discovered, the deeply disingenuous nature of these representations erodes Tesla and Musk’s credibility on a fundamental level: either they do not understand the auto safety data or they are intentionally misleading the public. Either way, they refuse to acknowledge that incompetence or deception has put the public at risk, and they continue to stand by safety claims that don’t hold up to even the slightest critical analysis.

As it turns out, there’s almost no end to the ways in which Tesla and Musk’s claims about Autopilot safety fall apart under scrutiny. In addition to the analysis presented in The Daily Beast, here are a few more ways to think critically about Tesla’s Autopilot claims.

[Continue Reading]

Toyota’s new autonomous drive czar worried about “those crazy cars driven by human beings.”

Pratt in Tokyo

Who pays, or who possibly goes to jail, when something goes wrong? That, says a preeminent authority in the autonomous drive field, is the biggest unsolved problem in the quest for the autonomous car. According to the expert, the infallible, 100 percent safe autonomous car is a fantasy. Autonomous drive will prevent a large number of accidents caused by inattentive humans, but we will have to come to grips with the fact that eventually the robots driving our future cars will kill people – and then what?

[Continue Reading]

Don’t Feed The “Ban Driving” Trolls

They see me trolling...

Cars and the people who love them have taken a bit of a trolling in the last week, as autonomous car firms and their boosters become increasingly convinced that a sea change is in the offing. The trolling began with a Buzzfeed article that told car fans to “go f*ck a tailpipe” if they think their love of driving outweighs the moral obligation to reduce the 1.2 million lives lost each year in cars, and things took off from there. The latest salvo, from Fusion, argues that driving should be made illegal within 15 years. Though self-driving cars are unquestionably the most consequential challenge cars have faced in their more than a hundred years as a cornerstone of modern society, the conversation around this massive opportunity needs to become a lot more pragmatic and constructive if we’re going to make the most of it.

[Continue Reading]