Thursday, May 24, 2018

Unless You Don't Program Them To Do That

I've long said safety isn't the real issue with self-driving cars: if they work at all, they'll be safe enough, and programming them not to hit things has to be the bare-minimum easiest part of the job. Even this isn't *that* easy, as there's a bit of a problem at high speeds; they don't actually see that far ahead at the moment. Still. "If see object, brake or turn." Not hard.
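Here's a toy sketch of that rule, purely my own illustration and no relation to any real vehicle's code. It also shows the high-speed catch: stopping distance grows with the *square* of speed, so the faster you go, the farther out the sensors have to reliably see.

```python
# Toy illustration only (my own sketch, no relation to any real AV stack).
# The rule itself is trivial; the hard parts this hand-waves away are
# perception and sensor range.

def stopping_distance_m(speed_mps: float, decel_mps2: float = 6.9) -> float:
    """Distance needed to brake to a stop: v^2 / (2a).
    6.9 m/s^2 is ~0.7 g, an assumed hard-braking figure for dry pavement."""
    return speed_mps ** 2 / (2 * decel_mps2)

def react(speed_mps: float, object_range_m: float | None) -> str:
    """The 'if see object, brake or turn' rule, with a 1.5x safety margin."""
    if object_range_m is not None and object_range_m < 1.5 * stopping_distance_m(speed_mps):
        return "brake"  # or swerve, if a clear lane exists
    return "continue"

print(react(43 * 0.44704, object_range_m=20.0))  # "brake": 20 m is inside the margin at 43 mph

# The high-speed catch: stopping distance grows with the square of speed,
# so the sensors have to reliably see much farther out at highway speeds.
for mph in (25, 43, 70):
    mps = mph * 0.44704
    print(f"{mph} mph -> needs ~{stopping_distance_m(mps):.0f} m of sight distance to stop")
```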

Unless, of course, you don't tell them to do that.

> Uber’s vehicle used Volvo software to detect external objects. Six seconds before striking Herzberg, the system detected her but didn’t identify her as a person. The car was traveling at 43 mph.

> The system determined 1.3 seconds before the crash that emergency braking would be needed to avert a collision. But the vehicle did not respond, striking Herzberg at 39 mph.
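For a sense of scale: six seconds at 43 mph is an enormous amount of road. A quick back-of-envelope check (the ~0.7 g braking figure is my assumption, a typical hard-stop number for dry pavement; the speeds and times are from the report above):

```python
# Back-of-envelope check of the reported numbers. The 0.7 g deceleration
# is my assumption; the 43 mph and 6 / 1.3 second figures are from the
# NTSB account quoted above.
MPH_TO_MPS = 0.44704
v = 43 * MPH_TO_MPS           # ~19.2 m/s
a = 6.9                       # m/s^2, roughly 0.7 g (assumed)

detection_range = v * 6       # road covered in the 6 s after detection: ~115 m
stop_distance = v**2 / (2*a)  # road needed for a full stop: ~27 m
time_to_stop = v / a          # ~2.8 s

print(f"Road available after detection: {detection_range:.0f} m")
print(f"Road needed for a full stop:    {stop_distance:.0f} m")
print(f"Time needed for a full stop:    {time_to_stop:.1f} s (it had 6)")

# Even braking only at the 1.3 s mark would have shed real speed:
v_impact = max(0.0, v - a * 1.3)
print(f"Impact speed with 1.3 s of hard braking: {v_impact / MPH_TO_MPS:.0f} mph")
```

A full stop needed less than half of the available time and about a quarter of the available distance. Instead, nothing.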

And why was that? Oh.

> According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior. The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator.
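To be concrete about what that paragraph describes, here's a purely hypothetical sketch. The names and structure are invented and have nothing to do with Uber's actual software; it just shows what a decision pipeline looks like when the actuation path is gated off and nobody is told.

```python
from dataclasses import dataclass

@dataclass
class Decision:
    emergency_brake_needed: bool

# Hypothetical sketch of the failure mode described above -- invented names,
# not Uber's actual architecture. The planner correctly decides braking is
# needed; the gate below simply throws that decision away.
ENABLE_EMERGENCY_BRAKING = False  # "to reduce the potential for erratic vehicle behavior"
ALERT_OPERATOR = False            # "not designed to alert the operator"

def apply_brakes() -> None: ...
def sound_alarm() -> None: ...

def act(decision: Decision) -> None:
    if decision.emergency_brake_needed:
        if ENABLE_EMERGENCY_BRAKING:
            apply_brakes()
        elif ALERT_OPERATOR:
            sound_alarm()
        # else: do nothing and rely on the human to notice in time

# 1.3 seconds out, the system "knows" -- and the knowledge goes nowhere:
act(Decision(emergency_brake_needed=True))
```

The detection worked. The decision logic worked. The part where the car actually does something about it was switched off, and so was the part where it tells the human.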

There's a lot of chatter about where exactly the civil liability is going to fall for these things. What about the criminal liability?