When you drive a car, there are quite a few things that can get in your way. Some of them are safer to simply ignore: birds, flying plastic bags, newspapers. Others you learn to respect: bison, moose, that sort of thing.
Where does a pedestrian fall on this scale? Truth be told, most of the time they will try pretty hard not to be in your way, especially if you are traveling at 43 mph. Sometimes they don't, though: they will avoid eye contact 1 and go full bison, and most drivers will adjust their trajectory rather than kill them.
The AI in a self-driving car must make the same decision, and as it turns out in the just-published report on the death of the pedestrian struck by the Uber self-driving car, it can easily make a mistake in a border case:
According to data obtained from the self-driving system, the system first registered radar and LIDAR observations of the pedestrian about 6 seconds before impact, when the vehicle was traveling at 43 mph. As the vehicle and pedestrian paths converged, the self-driving system software classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle with varying expectations of future travel path.
I can sympathize with the classification model: the probability that the pedestrian will do something to avoid the crash is likely greater than the probability that they would just let it happen. But then maybe that uncertainty should have led to some slowing down, or a lane change, just in case. That did not happen, and then this:
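To make the point concrete, here is a minimal sketch of what such a caution policy could look like: when the perception system keeps changing its mind about an object in the vehicle's path, react conservatively instead of trusting the latest guess. The function, thresholds, and labels are my own invention for illustration, not anything from Uber's actual system.

```python
# Hypothetical caution policy: react to unstable classifications.
# Names and thresholds are invented for illustration.

def caution_action(classifications, speed_mph):
    """Pick a conservative action when an object's classification
    keeps changing across perception cycles."""
    distinct = len(set(classifications))
    if distinct == 1:
        return "maintain"          # stable classification: keep the plan
    if distinct >= 3 or speed_mph > 40:
        return "brake_and_alert"   # very unstable or fast: slow down, warn operator
    return "slow_down"             # mildly unstable: shed some speed

# The report's sequence: unknown object -> vehicle -> bicycle, at 43 mph.
print(caution_action(["unknown", "vehicle", "bicycle"], 43))
```

Even a crude rule like this would have traded a few seconds of travel time for a safety margin in exactly the border case described above.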
At 1.3 seconds before impact, the self-driving system determined that an emergency braking maneuver was needed to mitigate a collision. According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior.
When the AI realized that this was a bison situation, it did nothing. It didn't honk or notify the operator; it just crashed into the person at full speed. Uber then released a video that made it look as if nobody could have seen the person until the last moment, but as critics pointed out at the time, and as is clear from the report, the system knew about the pedestrian for more than enough time to avoid this. I guess you might say that the problem was that the AI didn't care.
-
Contrary to popular belief, avoiding eye contact might be safer when crossing. ↩