
Man Dies in Collision While Being Driven by Tesla’s Autopilot System

A Tesla blog post describes the first fatality involving a self-drive system. A Tesla was driving on autopilot down a divided highway when a truck making a left turn off the highway crossed into the Tesla’s lane, perpendicular to the car. A white truck body against a bright sky is not something the Tesla’s camera system perceives well. The truck’s clearance was high, so when the Tesla did not stop, it went under the truck, and the windshield was the first part of the Tesla to hit the truck body, with fatal consequences for the “driver.”

Tesla points out that the autopilot system has driven 130 million miles, while human drivers in the US average about one fatality every 94 million miles (though the interval is longer on highways). The Tesla is a “supervised” system: the driver must agree that they are monitoring it and will take control in the event of any problem, but this driver, Joshua Brown, did not hit the brakes. As such, the fault for this accident may reside with Brown, although the investigation is just beginning. (Tesla also notes that had the front of the car hit the truck, the crumple zones and other safety systems would probably have saved the driver; hitting a high target is the worst-case situation.)
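Tesla’s mileage comparison can be put on a common per-100-million-mile footing with a couple of lines of arithmetic. This is a trivial sketch using only the two figures quoted above; a single event is, of course, a very thin basis for a rate:

```python
def fatalities_per_100m_miles(fatalities, miles):
    """Normalize a fatality count to a rate per 100 million miles driven."""
    return fatalities * 1e8 / miles

# One fatality in Autopilot's 130 million miles vs. the US human average
# of roughly one per 94 million miles (figures from the Tesla blog post).
print(round(fatalities_per_100m_miles(1, 130e6), 2))  # 0.77
print(round(fatalities_per_100m_miles(1, 94e6), 2))   # 1.06
```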

It is worth noting that Brown was a major Tesla fan; in fact, he is the person in a video that circulated in April, in which he credited autopilot with saving him from a collision with a smaller truck that cut into his lane.

Any commentary here is preliminary until more facts are established, but here are my initial impressions:

Cameras vs. LIDAR and Maps

I have often written about the big question of cameras vs. LIDAR. Elon Musk is famously on record against LIDAR, while almost all other robocar projects in the world rely on it. Current LIDARs are too expensive for production automobiles, but many companies, including Quanergy (where I am an advisor), are promising very low-cost LIDAR systems for future generations of vehicles.

This is a clear situation where LIDAR would have detected the truck. A white truck against the sky would be no issue at all for LIDAR; in fact, a big white target like that would be detected beyond the normal range of a typical unit. Range is an issue here: most LIDARs detect other cars only about 100m out, though a big white truck would be detected a fair bit further. Either way, that’s not quite far enough to stop in time for an obstacle like this at highway speeds. However, the car would have braked to make the impact vastly less severe, and a clever car might even have had time to swerve, or to deliberately hit the wheels of the truck rather than slide underneath the body.
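To see why detection range matters, here is a rough back-of-the-envelope braking sketch. The 75 mph (~33.5 m/s) speed, 5 m/s² deceleration, and 0.5 s system latency are illustrative assumptions, not figures from the accident:

```python
import math

def impact_speed(v0, detection_range, decel, latency):
    """Speed (m/s) remaining at the obstacle after full braking, given
    detection at `detection_range` metres and `latency` seconds before
    the brakes engage. Returns 0.0 if the car stops in time."""
    braking_distance = detection_range - v0 * latency  # distance left once braking starts
    if braking_distance <= 0:
        return v0  # obstacle reached before the brakes even engage
    v_sq = v0**2 - 2 * decel * braking_distance
    return math.sqrt(v_sq) if v_sq > 0 else 0.0

# 100 m detection (typical LIDAR): can't stop, but impact speed is roughly halved.
print(round(impact_speed(33.5, 100, 5.0, 0.5), 1))  # ~17.0 m/s
# 200 m detection (longer-range sensor): a full stop.
print(round(impact_speed(33.5, 200, 5.0, 0.5), 1))  # 0.0
```

This is the sense in which braking from 100m out makes the impact “vastly less” even when a complete stop is impossible.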

Another sensor that is problematic here is radar. Radar would have seen this truck with no problem, but since the truck was moving perpendicular to the car’s travel, it was neither approaching nor receding, and thus had the Doppler speed signature of a stopped object. Radar is great because it tracks the speed of obstacles, but because there are so many stationary objects, most radars have to simply disregard such returns; they can’t tell a stalled vehicle from a sign, bridge or berm. A map of where all the stationary radar reflectors are located can help with that: if you get a sudden bright radar return from a truck or car somewhere the map says no big object is present, that’s an immediate sign of trouble. (At the same time, it means you can’t easily detect a stalled vehicle next to a bridge or sign.)
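The geometry can be sketched in a few lines: radar measures only the radial (line-of-sight) component of a target’s velocity, so a perpendicular crossing target looks stationary, and a naive filter discards it unless a map of known reflectors says something is out of place. The function names and the 1 m/s threshold below are illustrative assumptions:

```python
import math

def radial_speed(target_speed, crossing_angle_deg):
    """Doppler radial speed (m/s) a radar measures for a target moving at
    `target_speed`, where `crossing_angle_deg` is the angle between the
    target's velocity and the radar's line of sight.
    0 deg = head-on, 90 deg = perpendicular crossing."""
    return target_speed * math.cos(math.radians(crossing_angle_deg))

def is_flagged(radial, map_expects_static_object, threshold=1.0):
    """Naive filter: keep a return only if it is worth reacting to.
    Low-Doppler returns are discarded unless a prior map says no
    stationary reflector should be at that spot."""
    if abs(radial) > threshold:
        return True  # clearly moving toward/away from us: track it
    return not map_expects_static_object  # static-looking return where the map is empty

truck = radial_speed(10.0, 90)  # truck crossing perpendicular at 10 m/s
print(round(truck, 2))          # 0.0 -- same Doppler signature as a sign or bridge
print(is_flagged(truck, map_expects_static_object=True))   # False: discarded
print(is_flagged(truck, map_expects_static_object=False))  # True: map anomaly
```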

One solution to this is longer-range LIDAR or higher-resolution radar. Google has said it has developed longer-range LIDAR. Even regular-range LIDAR, or radar combined with a good map, would likely have noticed the truck in this case.


This article was first published on Brad’s blog. Go here to read the original article.

Image credit: Tesla
