The initial reaction to the NHTSA letter sent to Tesla (NASDAQ: TSLA) on July 8 was that it was focused on a general investigation into Tesla's self-described Autopilot system. Because a huge number of people are drinking the Tesla Kool-Aid and believe its statistics, the NHTSA letter seems to be regarded as a meaningless bureaucratic exercise that ultimately will produce no negative conclusions affecting Tesla.
I have a different interpretation of the letter, however, based on the third bullet point on its first page, titled "Subject System." Within that bullet point, language further defines the investigation's focus on:
"All systems designed to provide Automatic Emergency Braking (AEB) for forward crash mitigation or avoidance functionality."
In my opinion, such a focus creates a much more significant set of issues for Tesla.
Additional issues concerning crash mitigation or avoidance capabilities
As I have previously written in both other articles and in comments on other authors' articles, describing a feature in beta test as being an "autopilot" system is highly misleading, irresponsible, and dangerous.
Tesla has now exacerbated its irresponsibility with its recent statements that warnings were buried in the fine print somewhere: that the system was in beta test, that drivers had to remain alert, that both hands still had to be on the wheel, etc. Even with such disclaimers, I believe that the typical Tesla driver who has purchased the Autopilot feature has a very different impression of its supposed capabilities.
There have also already been a lot of anecdotal stories and videos about hands-free driving, drivers nodding off in rush hour traffic, daredevil stunts by Tesla drivers, etc. (including examples by the now-deceased Joshua Brown), but a mention of another Tesla collision in an excellent article written by Paulo Santos today gives, I believe, a very representative impression of how the typical Tesla driver perceives Autopilot:
On April 26, Simpson was driving north from Los Angeles on I-5, cruising in autopilot mode. "All of a sudden the car ahead of me came to a halt. There was a decent amount of space so I figured that the car was going to brake as it is supposed to and didn't brake immediately. When it became apparent that the car was not slowing down at all, I slammed on the brakes but was probably still going 40 when I collided with the other car," she told Ars.
Although I certainly don't know the driver involved, and know nothing more about her than what I see in her account of the collision, my initial impression was that her view of Autopilot's capabilities reflected a sort of Everyman's (or Everywoman's) idea that a system so heavily promoted and publicized as a significant and prominent safety system for the vehicle would indeed keep its driver safe.
Ironically, given that the overall Tesla narrative is also built on the supposed Master of the Universe, Elon Musk, who can land rockets on barges and out-innovate the stodgy automobile industry, I would contend that Tesla actually has a higher burden of responsibility to ensure that any new feature on its vehicles has been thoroughly engineered, analyzed, and tested before being released for volume production (not beta tested by vehicle owners). Just as the myth of Elon Musk keeps the stock levitated, Tesla vehicle owners are probably overly comfortable and complacent that their vehicles will perform as advertised and promoted, given that the supposedly brilliant Mr. Musk is leading the company.
But, in the never-ending gibberish and dissembling spin that Tesla puts on every event, we again see that Tesla's initial conclusion was:
"the vehicle logs show that its adaptive cruise control system is not to blame. Data points to Simpson hitting the brake pedal and deactivating autopilot and traffic aware cruise control, returning the car to manual control instantly."
Paulo has already pointed out other inconsistencies in that statement, but I also have my own take on the supposed series of events.
When the driver finally did depress the brake, which I acknowledge does disable any subsequent automatic functions, Tesla seems to imply by its statement that she must then have taken her foot off the brake, resulting in the car not stopping!
So either the Automatic Emergency Braking system did not activate in time to avoid the collision, which then forced the driver to suddenly attempt to stop the car herself, or the driver attempted to engage the brake well before the collision but then let her foot off the brake and plowed into the car in front of her. The second scenario makes no sense at all, so I have to conclude that there was some other failure in Tesla's overall systems for detecting a vehicle in front of a Tesla running Autopilot and for measuring the distance between the Tesla and the vehicle ahead.
Aside from the collision described above, in which the driver was at least apparently unharmed, everyone also already knows about the horrific crash on May 7, where another Tesla Model S couldn't even see a full-length tractor trailer in its path. Although others might still say that two collisions/crashes may not be statistically significant (or "material"), my perspective is that both crashes were proven failures of the Automatic Emergency Braking systems on the two vehicles.
AEB systems are complex, however, and rely on much more than just the brakes (which is another reason why such systems should not be deployed in beta test by random vehicle owners). The primary functionality comes from a vehicle's detection systems, which can be as simple as image sensors alone (definitely not sufficient) or as robust and redundant as a combination of radar, LIDAR, and image sensors. The problem with current Tesla vehicles where Autopilot is installed is that only image sensors and radar are present, and LIDAR sensors cannot be retrofitted into the vehicles.
In a Fortune article today, there was also a quote from an employee at Robert Bosch (the very large and technologically advanced global auto components supplier) that Bosch had been working on self-driving components and systems for more than 15 years and that "Bosch sees the necessity for a sensor set-up that includes radar, video, and LIDAR" as the three sensors "complement each other very efficiently."
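To make the redundancy argument concrete, here is a minimal sketch, entirely my own illustration and not Tesla's or Bosch's actual logic, of why a third independent sensor modality matters. The function name, the voting rule, and the detection values are all assumptions for illustration: with only two modalities, a disagreement is an unresolvable tie, while a third modality can break it.

```python
# Hypothetical illustration of sensor-fusion redundancy (not any
# vendor's real algorithm): brake only when a strict majority of
# installed sensor modalities report an obstacle in the forward path.

def aeb_should_brake(detections):
    """detections maps modality name -> True (obstacle) / False (clear).
    Returns True when a strict majority of modalities see an obstacle."""
    votes = sum(detections.values())
    return votes * 2 > len(detections)

# Two-sensor case: suppose the camera misreads a white trailer against
# a bright sky as "clear" while radar reports a return. The 1-1 tie is
# not a strict majority, so AEB never fires.
print(aeb_should_brake({"camera": False, "radar": True}))  # False

# Adding a third modality (LIDAR's direct range measurement) breaks
# the tie, and the 2-of-3 majority triggers braking.
print(aeb_should_brake({"camera": False, "radar": True, "lidar": True}))  # True
```

Under this (assumed) strict-majority design choice, a two-sensor vehicle can never out-vote a single failed modality, which is one simple way to read the Bosch comment that the three sensors "complement each other very efficiently."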
Only Two out of Three is apparently bad…
Tesla isn't willing to accept such industry perspectives and would prefer to rush to market a revenue-generating feature that is an incomplete system in the eyes of other automotive companies and prominent component suppliers. Not only is Autopilot an incomplete system that cannot be retrofitted with an additional set of LIDAR sensors, but one death and another collision have now shown that there are instances where forward vehicle detection does not work at all.
If there are instances where forward vehicle detection does not work on Tesla vehicles, then each vehicle's AEB system will also fail to keep its driver safe. Unfortunately, Ms. Simpson, described above, seemed to have the impression that her vehicle's systems would keep her safe, although the current Tesla system is missing a critical capability in the view of other automotive systems experts.
Since the NHTSA's letter directly identifies that it will be investigating the performance of Tesla's AEB systems, any reasonable investigation would probably conclude that an AEB system without LIDAR will have instances where it will not work safely. There is also probably faulty logic in the processing routines within the current code that attempt to rely only on image sensors and radar to make up for the missing LIDAR component, so that would be another issue requiring a lengthy amount of research, analysis, and testing. As highlighted in my article "And it didn't stop after contact," such an analysis and testing process would probably take at least six months.
I have no expertise or visibility into the process or length of an NHTSA ODI analysis, but as I've described above, if the focus is on the performance of Tesla's AEB systems, which use only two of the three sensor systems regarded as necessary for comprehensive sensor function and redundancy, then current Tesla vehicles running the current Autopilot do not have full AEB capability. As such, a significant part of the supposed Autopilot capabilities would need to be disabled.
If the NHTSA does determine that Autopilot needs to be disabled until Tesla vehicles include all three sensors (optical, radar, and LIDAR) that other industry experts conclude are required, then a highly publicized feature and supposed technological advantage on previously sold Tesla vehicles would no longer exist. Aside from the small detail that $60 million in revenue has already been collected by selling such a "beta test" feature to, as another SA author put it, "lab rats" for Tesla's development purposes, the premature introduction of Autopilot would be another data point showing that the Tesla narrative of innovation and success in all of its activities is definitely not valid.
Since the stock price stays levitated based on such a misleading narrative, each future data point that disproves such a narrative is an unappreciated risk for Tesla investors.
Disclosure: I am/we are short TSLA.
I wrote this article myself, and it expresses my own opinions. I am not receiving compensation for it (other than from Seeking Alpha). I have no business relationship with any company whose stock is mentioned in this article.