...is apparently drivers!
Insurers have made every effort to encourage safer driving in their policyholders (no-claims bonuses being the biggest example of this), but you can easily be involved in an accident through no fault of your own. Clearly the problem is the same for computer-controlled vehicles...
As a motorcyclist for several years, and having had my fair share of scrapes, I often pointed the finger at other people when an accident happened. I stick to four wheels now that the roads have become more crowded, but I am willing to admit that had I been more vigilant and expected the worst of everybody else, I would probably have had fewer repair bills.
So is the next challenge to programme cars to believe everyone is out to get them...?
The self-driving car, that cutting-edge creation that's supposed to lead to a world without accidents, is achieving the exact opposite right now: the vehicles have racked up a crash rate double that of those with human drivers. The glitch? They obey the law all the time, as in, without exception. This may sound like the right way to program a robot to drive a car, but good luck trying to merge onto a chaotic, jam-packed highway with traffic flying along well above the speed limit. It tends not to work out well. As the accidents have piled up (all minor scrape-ups for now), the arguments among programmers at places like Google Inc. and Carnegie Mellon University are heating up: should they teach the cars how to commit infractions from time to time to stay out of trouble?