Fatal Flaw Edition – The Problem of Autonomous Car Ethics


THE CAR COACH REPORTS:

October 09, 2015

 


Google car, photo by Becky Stern: https://www.flickr.com/photos/23243094@N00/8612888280/

Aside from robbing us of the joy of driving, one of my chief complaints about autonomous cars is an ethical one. How does a computer decide who lives and who dies in an accident? Can a computer be programmed to break the law to save a life? Is this even navigable, or is it the fatal flaw of autonomous drive technology?

 

Stanford professor Chris Gerdes aims to find out: http://www.autonews.com/article/20151009/OEM06/151009815/a-stanford-professors-quest-to-fix-driverless-cars-major-flaw

 

I must say, this is pretty cool. And here’s why: I resent the auto industry telling us (in my best Han Solo impression) “everything’s perfectly all right now. We’re fine. We’re all fine here now, thank you. How are you?” when it comes to autonomous cars. At least someone with a capable brain is thinking through all the angles of what having self-driving cars on the roads means. It’s about time.

 

Oh, and there is this: Professor Gerdes is a driver, folks. That’s the other perk. He grew up in Charlotte, NC, near the Speedway and trained to be a racecar driver. He requires his students to race, too, because, as the article says, he believes that to program a car for good conditions you have to know how it behaves in extreme conditions.

 

Since he is a driver and is pro-driverless (here we disagree, but this is America, so it’s fine), he is able to help an industry that insisted the tech was ready (it isn’t) see that there are still unanswered questions, and he is capable of asking the right ones.

 

The answers to those questions are slightly more complex than a computer would like. For instance, how do you program a vehicle to break the law for safety reasons, but not for anything else (in the article, one test involved crossing a double yellow line to avoid hitting a simulated road crew)? Can you even do that? Now insert myriad other scenarios, each paired with a machine that can only do as it’s programmed.
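
To make the hurdle concrete, here is a deliberately naive sketch in Python (my own illustration, not anything from the article or from any real self-driving software) of what a hard-coded “break the rule only for safety” check might look like, and why it gets messy fast:

# Hypothetical illustration only; not from the article or any real
# autonomous-driving stack. It shows how quickly "break the law, but
# only for safety" turns into a pile of hand-written exceptions.

def may_cross_double_yellow(hazard_ahead: bool,
                            oncoming_lane_clear: bool,
                            hazard_is_road_crew: bool) -> bool:
    """Decide whether crossing a double yellow line is allowed.

    The default answer is "never", because that's the law. The
    exceptions are the hard part: every new scenario means another rule.
    """
    if not hazard_ahead:
        return False  # no reason to break the rule
    if not oncoming_lane_clear:
        return False  # swerving into traffic trades one danger for another
    # A road crew in the lane? Cross. A plastic bag? Probably not.
    # But how does the car know which one it is, and what about the
    # thousand scenarios nobody thought to write a rule for?
    return hazard_is_road_crew

if __name__ == "__main__":
    # The article's test case: a simulated road crew blocking the lane.
    print(may_cross_double_yellow(hazard_ahead=True,
                                  oncoming_lane_clear=True,
                                  hazard_is_road_crew=True))  # prints True

Even this toy version hand-waves the hard part: recognizing the hazard correctly in the first place, and covering every scenario nobody wrote a rule for.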

 

With automakers pledging autonomous drive vehicles by 2017, this research couldn’t be more important right now. I still want to drive my car, but at least I know a neutral party is showing the auto industry that there is still work to be done.

 

But what do you think? Can you program a computer to make ethical decisions and respond to a dynamic environment like the open road while considering the safety of both its occupants and everyone else on the road?

 

Post your comments!

 

 

 

My Final Thought:

 

A Ferrari is a Ferrari because it’s a Ferrari, and American hot rods are American hot rods because of that throaty rumble we love. The two should never meet. Ever.

 

The owner of this 1963 GTE somehow didn’t get the memo:

http://www.autoblog.com/2015/10/08/anti-purist-1963-ferrari-gte-sports-hot-rod-chevy-v8/

 

Some people just want to watch the world burn, I guess.

 

Post your comments below!

 

Love Your Car! See you next week!

 

–Lauren Fix

 
