Friday, July 1, 2016

Opinion: My thoughts on the first-ever Tesla Autopilot fatality

Today I received the following question on my Tumblr account:


Yep, I heard about it all right, and the media backlash is, as per usual, negative. And now you have the inevitable "see, it's safer to drive the car yourself" people coming out, or the "ha! Look how dangerous it is" people, or the "self-driving cars are the devil" people. Just go on Facebook and you'll see it.

But the fact of the matter is, THIS WAS THE FIRST TIME SOMEONE HAS EVER DIED DRIVING A TESLA ON AUTOPILOT.

In the two years since Autopilot was introduced, it has helped avoid DOZENS of crashes that would otherwise have been caused by sheer human negligence. Ironically enough, THIS VIDEO of Autopilot avoiding an accident was taken by the same man who was killed.

This other video shows Autopilot coming to a screeching halt when a driver decides to make a U-turn right in front of it.

Like I said, Autopilot was introduced in 2014. In that time, exactly ZERO accidents have been deemed Autopilot’s fault. In all cases, it was the fault of the other driver OR the fault of the person using the car and *claiming* it was on Autopilot. Thing is, Tesla can easily check your car’s data logs and see if Autopilot was on.

Prior to this incident, the only other known instance of a Tesla not stopping while in "Autopilot" was a few weeks ago, when one rolled into a raised trailer. The car wasn't in Autopilot, per se, but in "Summon Mode," which drives at about 1 mph, usually just to park itself or pull into your driveway. The raised section of the trailer was above the car's sensors, so it was visible only to the front-facing camera, which is used for detecting speed limit signs.

Let’s look at the picture of the incident:


As you can see, the car's sensors have no way of detecting the trailer bed. The car more than likely detected the wheels and lower bed of the truck instead, and had the trailer not been hanging over, the car WOULD HAVE STOPPED exactly 12 inches from the back of the truck.

To add more suspicion to the story, the car can only be in Summon Mode if you are standing nearby. The man at the center of this particular story claims he only stuck around his car for 20 seconds or so because a passerby asked him about it, but that he never activated Summon. Tesla's data logs show that he did, in fact, activate it. And what would you do if someone curious about your Tesla came up to you? You'd do a quick demo, right? This is most likely what happened, but of course he's denying it because he doesn't want the repercussions and embarrassment.

Anyway, what this expressly shows is that Tesla clearly needs to add more sensors capable of detecting higher objects like this trailer bed.

So what does this have to do with the death?

Everything…and nothing.

Obviously, the man who died had owned his car for a long time, and he had proven that it can avoid accidents…under normal conditions. The conditions of this accident were similar to those of the scenario I just described: the trailer was too high for the sensors to detect it as a tractor trailer crossing the highway. Instead, it gave Autopilot's radar the impression that it was nothing more than, and I'm quoting Elon Musk, "an overhead road sign." Had the impact been near the front or rear of the truck, Autopilot would have noticed the large obstruction and come to a halt. One popular reaction to this situation points out that Elon Musk once denounced the use of LIDAR on Tesla vehicles, a detection system similar to radar that uses lasers instead and more than likely would have detected the tractor trailer. Maybe after this incident, LIDAR will be considered as a new detection system.

The fact of the matter is, this accident happened in broad daylight, and there are dozens of factors that come into play: lighting, speed, when the driver of the trailer decided to cross the highway, and of course, the attentiveness of the man driving the Tesla. CLEARLY he was not paying attention. The brakes were never applied, either manually or autonomously. I, for one, would slam on the brakes no matter what in that situation, so clearly he was preoccupied.

And this is what Tesla keeps warning about. DO NOT LET AUTOPILOT DO ALL OF THE DRIVING. You NEED to pay attention no matter what you’re doing. It’s still in its early years, and what is happening right now is that Autopilot and Tesla are both LEARNING. There are some situations you simply cannot prepare for until they happen. Then you know, okay, we need to have something that will prevent that in the future. It’s impossible to think of every situation. You can think of a thousand different ways to avoid crashes, and there will ALWAYS be one you miss. And until every car on the road is autonomous, Autopilot will only ever be a driving assistant – NOT a driver.

In the US alone, there are 32,000 driving-related deaths every year. EVERY YEAR. (For perspective, only gun-related deaths claim more lives in the US: 35,000 people per year.) That's nearly 100 people who die in a car accident every single day. Tesla's Autopilot has been around for two years, and this is the first time someone has EVER died while using it. In fact, only 5 people are known to have died while at the wheel of a Tesla. Besides this incident: 2 people drove off cliffs, 1 person was killed when a dump truck slammed into him, and 1 person who STOLE the car was killed after hitting a pole at 100 mph. Please note that Autopilot was not active in ANY of those four cases. They were ALL human error.

So yes, I think because of Tesla's rising status in the tech and automotive industries, there is going to be a huge backlash of naysayers and conspiracy theorists and people who just don't trust new tech anyway. But it's unqualified backlash. Elon and Tesla were investigating the crash before the NHTSA began its own investigation, and Tesla is already working to fix the problem so that Autopilot becomes more reliable and understands more situations like this.

Tesla updates its software and hardware through continuous data collection. They improve, improve, improve. Autopilot and Tesla vehicles will only become safer because of this crash, despite its unfortunate result.

Some important numbers:

  • 1.2 billion miles: the total distance owners have driven in Tesla vehicles.
  • 130 million miles: the total distance Tesla owners have driven using Autopilot, with only 1 death.
  • 94 million miles: the average distance driven in the US per fatal car accident.
  • 60 million miles: the average distance driven worldwide per fatal car accident.
  • 88: how many people die in car accidents in the US every day.
  • 3,200: how many people die in car accidents worldwide every day.
  • 5: the number of deaths recorded in Tesla vehicles in the ten years since the first Tesla arrived on the road.
  • 1: the number of deaths recorded while a Tesla vehicle was in Autopilot since its introduction two years ago.
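The figures above can be sanity-checked with a few lines of arithmetic. This is a quick back-of-the-envelope sketch using only the numbers quoted in this post, not a rigorous statistical comparison:

```python
# All figures are taken directly from the numbers quoted above.
US_DEATHS_PER_YEAR = 32_000          # US driving-related deaths per year
AUTOPILOT_MILES = 130_000_000        # total miles driven on Autopilot (1 death)
US_MILES_PER_FATALITY = 94_000_000   # average US miles per fatal accident

# Deaths per day in the US: 32,000 / 365
deaths_per_day = US_DEATHS_PER_YEAR / 365
print(round(deaths_per_day))  # 88

# One death in 130 million Autopilot miles vs. one death per
# 94 million miles for US driving overall:
ratio = AUTOPILOT_MILES / US_MILES_PER_FATALITY
print(round(ratio, 2))  # 1.38 -- Autopilot went ~1.4x farther per fatality
```

By these (admittedly rough) numbers, Autopilot's single fatality came after noticeably more miles than the US average between fatal accidents.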
