Tesla Autopilot Program Flickers Behind A Shield Of Plausible Deniability

About: Tesla Motors (TSLA)

Summary

Tesla appears to be on a dangerous path of hiding material information for the sake of plausible deniability.

While “beyond reasonable doubt” is a standard to meet in criminal cases, in civil cases, the burden of proof is far less rigorous.

Courts across the US, and perhaps some internationally, will soon find that a “preponderance of evidence” indicates Autopilot is to blame.

A few weeks back there was much furore about an Autopilot fatality involving Tesla Motors' (NASDAQ:TSLA) Model S car. Mounting evidence at that time suggested that Tesla's Autopilot was broken and unlikely to ever get past the beta stage as the system appeared fundamentally limited in what it could do.

However, overwhelming data showing that the feature does not work was not enough for Tesla to change course. As is apt for this "green" technology company, Tesla came up with statistics that purportedly prove Autopilot improves safety.

The troubling part is that anyone with even a modicum of statistical background would have known that Tesla's data was not statistically significant and that the argument it presented was garbage.

Tesla Chairman Elon Musk, however, was on a mission to save the world through Autopilot and couldn't care less about a death or two along the way. "What if a few die now, as long as we can save a lot more later" was the profound logic that pervaded Mr. Musk's pitch for continued Autopilot deployment. Armed with bogus statistics, and as a dim-witted genius is wont to do, Mr. Musk went on a PR offensive contending how safe a Tesla was with Autopilot.

Why do we dig up this well-trodden past?

Information uncovered in the recent past makes it appear that Mr. Musk was not just presenting bad statistics, but that his bad statistics were built around the legal concept of "plausible deniability." We believe it is imperative that Tesla investors understand the background as well as the legal concept.

Beginning with square one, let's revisit the Tesla blog that announced the Florida fatality. Note this specific language:

"This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles."

"The first known fatality….where Autopilot was involved" is an interesting choice of words. Was this written by an attorney?

Note that there were only about six fatalities involving Tesla vehicles when this blog entry was written. (Some would argue that six fatalities is far too high for a car in this class, considering that many models in this segment ship an order of magnitude more cars and see far fewer fatalities; however, that is a discussion for a later date.)

At the time, we asked: "Would Tesla have presented this data if the number was less than 94 million miles? What would happen if there is a second or third fatality soon? Would Tesla publish the same statistic then too?"
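A quick back-of-the-envelope check shows just how little weight that 130-million-mile figure carries. The sketch below, in Python, treats fatalities as a Poisson process and uses only the figures from Tesla's own blog post; the modelling choice, the exact confidence-interval method, and the variable names are our own simplifying assumptions, not anything Tesla has published.

```python
# A minimal back-of-the-envelope check, assuming fatalities follow a Poisson
# process (a simplification that also ignores differences in driver
# demographics, road types, vehicle age, etc.).
from scipy import stats

autopilot_miles = 130e6      # miles driven with Autopilot engaged (Tesla's figure)
autopilot_deaths = 1         # the single fatality Tesla acknowledged
us_rate = 1 / 94e6           # US average: one fatality per 94 million miles

# Expected fatalities over the same mileage if Autopilot were no better
# or worse than the US average.
expected = autopilot_miles * us_rate   # ~1.38

# One-sided p-value: probability of seeing 1 or fewer fatalities purely by
# chance if the true rate equalled the US average.
p_value = stats.poisson.cdf(autopilot_deaths, expected)
print(f"expected fatalities at US rate: {expected:.2f}")
print(f"p-value for 'Autopilot is safer': {p_value:.2f}")   # ~0.60

# Exact 95% confidence interval on the Autopilot fatality rate, given a
# single observed event (chi-square / gamma method).
lo = stats.chi2.ppf(0.025, 2 * autopilot_deaths) / 2
hi = stats.chi2.ppf(0.975, 2 * (autopilot_deaths + 1)) / 2
print(f"95% CI: one fatality every {autopilot_miles/hi/1e6:.0f}M "
      f"to {autopilot_miles/lo/1e6:.0f}M miles")
# Roughly one fatality per ~23M to ~5,100M miles -- an interval so wide
# that the 94M-mile US benchmark sits comfortably inside it.
```

With a single observed event, the implied fatality rate is consistent with anything from roughly one death per 23 million miles to one per five billion miles, so the claim that Autopilot beats the 94-million-mile US average carries no statistical weight.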

It turns out there is a reason for the specific language chosen by Tesla. As we found out about a week back, there was a Model S driver fatality in China in January of this year. The car's buyer reported to Tesla and to the Chinese media that this was an Autopilot failure.

This was well before the Florida fatality. Why did Tesla not disclose this?

It appears that Tesla is wrapping its tentacles around a legal concept called "plausible deniability." Tesla's position on this fatality, as can be seen from the linked article, is as follows:

"We take any incident with our vehicles very seriously and immediately reached out to our customer when we learned of the crash. Because of the damage caused by the collision, the car was physically incapable of transmitting log data to our servers and we therefore have no way of knowing whether or not Autopilot was engaged at the time of the crash. We have tried repeatedly to work with our customer to investigate the cause of the crash, but he has not provided us with any additional information that would allow us to do so."

In other words, Tesla is blaming the customer for not providing access to collect the necessary data.

Unfortunately for Tesla, Reuters reports that the customer's attorney denies Tesla's claim. According to Reuters:

"The lawyer said Gao disputed the claim.

"The car is still there, and the data can still be extracted. A consumer can't read the data, but Tesla could read the data," he said."

The fullness of time will reveal whether there is any merit to Tesla's claims, but let's set that aside for now and look at the accident itself.

According to the video released by the deceased's parent (the second video in the linked article), the driver was in a good mood, singing/humming, but clearly not paying attention to driving the car. The car itself was steady, well centered, and oblivious to the road-sweeping vehicle on the side of the road. Anyone who has followed Tesla in the recent past knows that Autopilot has a known inability to handle this very particular situation, where an object or vehicle is stopped and partially blocking the lane. There have been several accidents, including one in China recently, tied to the same technology limitation.

In other words, preponderance of evidence indicates that this was a car running on Autopilot. Even if there is a small possibility that it is not, the customer is claiming that it is.

Doesn't it behoove Tesla to have investigated this fatality more thoroughly and disclosed it much earlier? Note that acknowledging this fatality would mean the company's garbage statistic of one fatality in 130 million miles would read two fatalities in 130 million miles, or roughly one fatality every 65 million miles, worse than the 94-million-mile US average Tesla itself cited.

Instead, Tesla hides behind "plausible deniability."

Had Tesla been diligent in probing this problem, or at least acknowledged that it could be an Autopilot problem, it would not have been able to mislead investors with impunity.

Should investors trust Tesla at this point? How many of the other Tesla fatalities are related to Autopilot?

While Mr. Musk continues to play fast and loose with corporate governance, he should realize that his actions are increasingly exposing Tesla to massive legal risk.

An investor or attorney doing due diligence on Tesla can find, on YouTube and in Tesla Motors Club forums, hundreds, if not thousands, of incidents where Autopilot has failed or consistently fails.

In spite of all these problems, Tesla is actively misleading customers by pretending that Autopilot is safe, or by claiming that it cannot be sure whether accidents and fatalities are Autopilot's responsibility.

In an interesting logical twist, Tesla holds the customer responsible for not intervening in a timely manner when an accident is imminent. Tesla also holds customers responsible for the very Autopilot shortcomings that delay their response and, in turn, lead to accidents.

But customers will eventually wise up to this chicanery. Tesla investors can rest assured that, as more and more Autopilot miles are driven, massive Autopilot-related lawsuits will engulf Tesla in short order.

While "beyond reasonable doubt" is a standard to meet in criminal cases, in civil cases, the burden of proof is far less rigorous. Courts across the US, and some international courts, will soon be finding that "preponderance of evidence" indicates Autopilot is to blame.

For the time being, Tesla's Autopilot program may continue to flicker behind the shield of plausible deniability, but, with ill-advised actions on many fronts, Mr. Musk is slowly but surely setting Tesla on a path to certain bankruptcy.

Note: Author is not an attorney. This is not a legal opinion.


Disclosure: I/we have no positions in any stocks mentioned, and no plans to initiate any positions within the next 72 hours.

I wrote this article myself, and it expresses my own opinions. I am not receiving compensation for it (other than from Seeking Alpha). I have no business relationship with any company whose stock is mentioned in this article.