“Accountability” and “Countability” – Misdirection in the “Autopilot” Safety Debate

7 Jul

To be clear, Tesla and Elon Musk were referring to “autopilot-assisted”, not “fully autonomous”, driving when claiming that only one Tesla fatality has occurred in 130 million autopilot-assisted miles driven. The same is true of Musk’s later (quite bizarre and scientifically unproven) statement that half a million lives would have been saved around the world had everyone been driving said Teslas. For this he presumably relied on the roughly 1 million worldwide auto fatalities per year, occurring at a rate of one for every 60 million miles driven.

At Musk’s insistence, I thought I had better do the math for myself. Right off the bat, there is a major issue with his claim. Musk had at his disposal a Tesla sample size (everything to date) capable of producing just one death, and this was then compared to a yearly sample size large enough to have produced one million deaths (in the case of the worldwide figure). After allowing for Tesla’s relatively higher number of miles per death (130 million divided by 60 million, or about 2.16), and then dividing the total worldwide deaths by this amount (1 million / 2.16), you find that the worldwide totals arose from a sample size roughly 460,000 times larger than Tesla’s! You can think of it this way – if Tesla had had just one more fatal accident in which two passengers were killed, it would be down to one fatality per 43.3 million miles driven. Would Musk then be proclaiming that if everyone around the world had been driving autopilot-assisted Teslas, there would have been an additional 384,000 people killed?
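The arithmetic above can be checked in a few lines. This is a sketch using the essay’s round numbers (130 million Tesla miles, 1 million worldwide deaths per year, one death per 60 million miles); the figures are illustrative, not audited data.

```python
# Reproduce the sample-size comparison using the essay's round numbers.
tesla_miles = 130e6            # autopilot-assisted miles at the time of the claim
tesla_deaths = 1
world_deaths_per_year = 1e6    # rough worldwide annual auto fatalities
world_miles_per_death = 60e6   # one fatality per ~60 million miles

# Total worldwide miles driven per year implied by those two figures.
world_miles_per_year = world_deaths_per_year * world_miles_per_death  # 6.0e13

# How many times larger is the worldwide yearly sample than Tesla's?
sample_ratio = world_miles_per_year / tesla_miles
print(f"worldwide sample is ~{sample_ratio:,.0f}x Tesla's")   # ~461,538x

# One more crash killing two passengers: 3 deaths in 130M miles.
adjusted_miles_per_death = tesla_miles / 3                    # ~43.3M miles
implied_world_deaths = world_miles_per_year / adjusted_miles_per_death
extra_deaths = implied_world_deaths - world_deaths_per_year
print(f"implied extra deaths worldwide: ~{extra_deaths:,.0f}")  # ~384,615
```

With one additional two-fatality crash, the implied worldwide toll rises to about 1,385,000 per year – the roughly 384,000 “additional” deaths described above.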

And, what exactly were the “controls” in place for this comparison to be valid? Are Teslas equally distributed among the countries included in the worldwide figure? Is Tesla’s autopilot currently capable of handling all of the locally specific traffic laws and infrastructure as they currently exist around the world? In the U.S., we have 50 separate states doing their own thing when it comes to passing legislation. What about these further distinctions around the world?

STOP !!! ……. (Note to Self) ……. Enough already with the careful and literal translation of Tesla and Musk’s statements concerning the safety level of their system! I need to now continue writing this under the assumption that what most people actually heard was “self-driving” car, not “autopilot assisted”. Why do I feel this way? Because I am a pretty smart guy with a background in quality assurance and a demonstrated interest in road safety – and even I made this mistake before I carefully reread the claims.

Most troubling is the ease with which these claims were made, and the lack of any proactive clarification on the part of Tesla. This is likely symptomatic of a greater problem, as put forth in my earlier essay “Testing ‘Self-Driving’ Cars – The Buck Stops Where?!”, also located here at www.edmundmiller.wordpress.com

To a most useful end, I would now like to explain how far off, and how scientifically invalid, these safety claims would be if a person were to (incorrectly) apply them to the idea of “hands off”, “fully autonomous” operation.

To misapply the data would be to neglect all of those instances in which Tesla drivers quickly corrected a potentially fatal mistake made by autopilot. Some examples can be found on YouTube, and I suspect there have been more than reported publicly. Examples include the car suddenly attempting to exit the highway at the last second, or the car following the vehicle ahead out of its lane instead of holding its own. If even a few of these (otherwise fatal) events have occurred to date, it would represent a massively lower “fully autonomous” safety level. What is so disturbing is that Tesla is not openly disclosing this type of data. Not collecting it would be even worse. Does anyone out there know the answer? If this data has been neglected, it would fit expectations, considering that the Federal D.O.T. announced it would be relying on the self-reporting of auto manufacturers when verifying “driverless-car” safety. (Again – see my earlier essay.)
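To see just how sensitive the headline figure is to these driver interventions, here is a small hypothetical. The intervention counts below are invented purely for illustration – no such data has been disclosed – and the baseline numbers are the essay’s round figures.

```python
# Hypothetical sensitivity check: if N driver interventions had each averted
# an otherwise-fatal autopilot mistake, what would the "fully autonomous"
# fatality rate look like?  The values of N are invented for illustration.
tesla_miles = 130e6
reported_deaths = 1

for averted in (0, 2, 5, 10):
    deaths = reported_deaths + averted
    miles_per_death = tesla_miles / deaths
    print(f"{averted:>2} averted -> one death per {miles_per_death / 1e6:.1f}M miles")
```

Even two averted events would drop the implied “hands off” rate to one death per 43.3 million miles – already worse than the worldwide baseline of one per 60 million – and ten would put it near one per 12 million.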

In fact, even if the above (calculated) considerations were added in at this point, the adjusted safety level would still be massively underestimated. Currently, Tesla’s autopilot is recommended only for less complicated scenarios such as highway driving. Not included are the greater complexities encountered in city driving, pedestrian traffic, construction sites, police-directed situations, emergency maneuvers, and more. A person need only imagine the potential for these to change the overall safety rating.

So, as your average idiot can now see – there is absolutely no way to obtain an accurate guesstimate of the answer to “How safe (more specifically, how ‘deadly’) does the ‘hands off’ operation of Tesla’s autopilot appear to be at this point in time?” using only the data at hand. If Tesla does in fact have the answers to this more fully considered question, it should be willing to discuss them with any reporter who inquires. If it doesn’t, then we should all be pointing our fingers at the Federal D.O.T., asking “Just exactly what is going on with the testing of this ‘Self-Driving’ stuff?!” (Again – read my earlier essay.) ……… So there you go ……. Reporters, get to work!
