Gordon Johnson: NHTSA Autopilot Investigation Covers 765,000 Vehicles, Sentiment Could Turn “Decidedly Downward”



Earlier this morning, we noted that the NHTSA was opening what appears to be an extremely broad formal investigation into Tesla’s Autopilot, one that will reportedly span all of Tesla’s major models from 2014 to 2021.

Shortly after the announcement, GLJ Research’s Gordon Johnson released a note claiming that the investigation could result in sentiment turning “decidedly downward” if sell-side peers begin to understand that Tesla’s proclaimed autonomy prospects are nowhere near what many think.

Johnson’s note opens by describing the situation and noting the investigation covers 765,000 vehicles: “The U.S. government has opened a formal investigation into TSLA’s Autopilot partially automated driving system, saying it has trouble spotting parked emergency vehicles. The investigation covers 765,000 vehicles, almost everything that TSLA has sold in the U.S. since the start of the 2014 model year.” 

He continues: “The agency says it has identified 11 crashes since 2018 in which TSLA cars on Autopilot or Traffic Aware Cruise Control have hit vehicles with flashing lights, flares, an illuminated arrow board or cones warning of hazards. The investigation covers TSLA’s entire current model lineup, the Models Y, X, S and 3 from the 2014 through 2021 model years. Autopilot has frequently been misused by TSLA drivers, who have been caught driving drunk or even riding in the back seat while a car rolled down a California highway.”

Johnson also finds the investigation noteworthy because, for the first time, it focuses on people outside of Tesla’s vehicles. He writes:

Importantly, in the NHTSA’s statement we note they are zeroing in on a particular danger that TSLA creates for people OUTSIDE the vehicle – i.e., those who never agreed to be Autopilot “guinea pigs”. Thus, to simply say “TSLA drivers accept Autopilot’s risks”, as has been used in the past, does not appear to be a defense here.

Furthermore, given the NHTSA is investigating how TSLA “monitors, assists, and enforces” driver engagement, we see this investigation as a potential big problem for TSLA. More specifically, it is our opinion that TSLA’s driver assist software does not monitor driver awareness nearly as well as other available systems (creating “predictable abuse” according to one attorney we spoke to this morning).

Yet, in our opinion, the lack of driver awareness is what most customers like so much about TSLA’s Autopilot vs. other systems; thus, if the lack of engagement in TSLA’s lane assist is taken away, disallowing TSLA drivers to “get into the back seat while the car is driving”, it would become just another lane assist feature in real life (our opinion), likely reducing its draw from consumers.

The announcement comes days ahead of Tesla’s “AI Day” event and, more importantly, “shortly after noted TSLA ‘full-self-drive’ critic Jennifer Homendy was appointed to head the NTSB,” he says.

Johnson spoke to two lawyers about the investigation this morning and relayed their thoughts:

“…in our discussions with two attorneys this morning with knowledge of the matter, we see problems here for TSLA. More specifically, we believe TSLA’s biggest weakness is its weak measures (which have been alleged to be intentional) to make sure the driver is paying attention (which is EXACTLY what the NHTSA is investigating) – this is the opinion of both lawyers we spoke to this morning. In the words of one lawyer: ‘Whether the system is weak at detecting emergency vehicles compared to other FSD systems or just insufficient overall, it shouldn’t be used at all if the system is not taking appropriate steps to make sure the driver is paying attention.’”

He concludes by making the case that this investigation could be the first of many dominoes to fall that could reveal Tesla’s autonomy prospects to be far worse than many have believed: “In short, when considering: (a) around 4/22/19 TSLA said there would be 1mn robotaxis on the road in 2020, ahead of a significant capital raise – as of the writing of this report, there is not 1 robotaxi on the road, (b) TSLA’s claim, in 2016, that all of its cars, henceforth, would have the hardware necessary for FSD – since making that statement, TSLA updated its hardware to HW3, meaning this statement was false, yet (c) TSLA has taken in hundreds of millions of dollars in ‘FSD’ revenue, or monies for vaporware that does not exist, we see potential problems for TSLA on the horizon given these issues.”

Johnson says it seems that Elon Musk’s “comments made 1/27/21 centering on the idea that TSLA will be at Level 5 autonomy by the end of 2021, vs. the official statements provided to California’s DMV 12/4/20 that the final version of what TSLA has released will be SAE Level 2, could be seen as misleading.”

He concludes: “…in short, this could open TSLA up to liability for those who purchased the company’s FSD option by relying, solely, on E. Musk’s claims (our opinion). Furthermore, in light of this information, should our sell-side peers (finally) read TSLA’s communication with California’s DMV and conclude the jump from Level 2 to Level 5 is not possible in 2021 (or potentially 2022), sentiment on TSLA could turn decidedly downward as many begin to question the veracity of E. Musk’s FSD/autonomy claims.”

Tyler Durden
Mon, 08/16/2021 – 12:51
