Monday, September 03, 2018

Not Our Fault

Programming these cars to obey the laws, drive cautiously, and not run into things (though that last one seems to be not quite as easy as people think at higher speeds) is the easy (or easier) part. The "oh, they'll be safer than humans!" crowd (not necessarily true, but okay) has to grapple with the idea that the cars can be technically safe and legal themselves yet cause chaos around them. It's rarely your fault, legally, if you get rear-ended, but that doesn't mean it wasn't actually your fault.

The first car crash experienced by Apple's fleet of self-driving vehicles happened just last week and it was apparently caused by a human driver — not Apple's own technology.


Apple's vehicle was merging onto the Lawrence Expressway and was moving at less than 1 mph, according to the report, and the Nissan was moving at 15 mph when it hit the self-driving car. The self-driving car's speed seems quite slow for merging onto a high-speed expressway, but details are sparse in the report so we don't know for sure if its speed was reasonable — the only information we get is that the vehicle was "waiting for a safe gap to complete the merge" when it was struck.

I don't like that first paragraph. "Caused" is complicated. Throwing a bunch of these things on the road that behave legally but not necessarily like humans, and sometimes not sensibly, is a bit like throwing a lot more Vespas onto the road (though not for precisely the same reasons). I'd expect a lot more accidents to happen then, too, even if those Vespa drivers all behaved perfectly legally. That doesn't mean "ban Vespas" or "ban self-driving cars"; it just means that even if they behave safely and legally, they can still complicate the system.