Over the last few years, as the mainstream media has cottoned on to the previously quiet revolution in self-driving-car technology that began with DARPA's Grand Challenges in 2004, we've seen a flood of arguments insisting that ethics and philosophy are needed to deal with the supposedly dangerous ramifications of cars that drive themselves, namely issues like the trolley problem.
I'll be blunt: there is no need to address any moral dilemma at all.
Self-driving cars don't need to be that intelligent; all they need to do is know and relentlessly follow the law.
Traffic laws define what constitutes legal *action* in any given scenario involving other cars and pedestrians. Acting entirely within those laws means one cannot violate them, so knowing the laws and behaving to their letter, *even if that means killing people*, will at least keep you free of litigation.
See China.
In China, a backward insurance-payout philosophy coupled with laws that enforce it means that in some cases it is better for the driver if a struck pedestrian is killed rather than merely injured. As a result, drivers in that country have reportedly made sure that if they do hit anyone, they kill them, to be free of the potential litigation that would follow if the victim survived.
I'm not making this up, read:
That is how powerful law is over moral machinations in this regard. Here in the West, where the laws are not so anti-pedestrian, they are still just as inflexible to being retroactively gamed.
Remember what happened with Caitlyn Jenner: thousands of law-abiding citizens hit and kill pedestrians on the roadways every year and get away with it scot-free. Why? Because they were found to be following the law as the incident unfolded, and that is all one needs. So you don't teach the car to swerve to avoid person A when group B is on the detour path; you teach it to follow the law. If it swerves, stays on the road, and kills the single pedestrian, it followed the law and is not responsible. If it stays in its lane and kills the group, it still followed the law and is not responsible.
Of course it should make an effort to reduce speed in either circumstance to demonstrate that it *tried* (not doing so could be construed as culpability for the dangerous act), but that doesn't require morals, just a simple response heuristic: slow down when an object presents itself immediately ahead of the car and there is no ability to turn. SDVs already execute this simple heuristic very well, and far, far faster and more accurately than any human is capable of.
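That "slow down for an obstacle dead ahead" heuristic really is as simple as the paragraph above suggests. Here is a minimal sketch of the idea in Python; the function name, parameters, and the 6 m/s² deceleration cap are all illustrative assumptions of mine, not any real vehicle's control stack:

```python
def brake_command(distance_m: float, speed_mps: float,
                  max_decel_mps2: float = 6.0) -> float:
    """Deceleration (m/s^2) to request when an object is directly ahead.

    Hypothetical sketch: distance_m is the gap to the obstacle,
    speed_mps the car's current speed, max_decel_mps2 an assumed
    hardware braking limit.
    """
    if distance_m <= 0.0:
        return max_decel_mps2  # contact imminent: brake as hard as possible
    # Constant-deceleration kinematics: to stop within distance d
    # from speed v, you need a = v^2 / (2d).
    required = speed_mps ** 2 / (2.0 * distance_m)
    return min(required, max_decel_mps2)

# At 10 m/s with 20 m of clearance, a gentle 2.5 m/s^2 suffices;
# at 30 m/s with only 5 m left, the request saturates at the cap.
print(brake_command(20.0, 10.0))  # 2.5
print(brake_command(5.0, 30.0))   # 6.0
```

No ethics module appears anywhere in that logic; it is pure kinematics plus a hardware limit, which is the point.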
Now can we stop sharing these silly moral-argument articles?