Your future self-driving car might kill you

We always knew we’d see self-driving cars on the road eventually, perhaps by 2030 or so, but over the past year things have ramped up significantly. A sort of ‘arms race’ is emerging between vehicle manufacturers such as Tesla, BMW and Mercedes-Benz, each with its own suite of ‘self-driving tech’ which is gradually taking the task of driving away from the person sitting behind the wheel. Right now we have cars which can drive themselves in stop-start traffic, judge their speed from radar readings reflected off the cars around them, and stick to their own lane based on feedback from cameras around the vehicle. Better still, when you get to your destination they can park themselves, too. Soon all you’ll have to do is turn the thing off, step out and walk away.

Technologies like these will help make our driving safer, cleaner and more fuel-efficient. But before the technology can progress further and hopefully result in cars which can handle every aspect of driving, with zero help or input required from the driver, vehicle manufacturers must first solve a seemingly impossible ethical dilemma of algorithmic morality. If you’re not sure what that means, think of it as the question of what the car should be programmed to do in the event of an unavoidable accident.


Picture this hypothetical situation. Your new self-driving car is transporting you along a regular suburban road with an 80 km/h (50 mph) speed limit, when an oncoming truck suddenly swerves across into your lane at the very last second. The self-driving system in your vehicle is faced with a dilemma, and in just a few milliseconds it must decide what to do. The vehicle is about to be involved in an accident which is most likely going to be fatal for you, and the only alternative is to swerve off the road to avoid the collision. But it has already detected two pedestrians walking there, so what should it do?

Should it minimize the loss of life and simply apply the brakes, sending you straight under the front of the truck and sacrificing your life in order to save others? Or should it protect your life at all costs and swerve anyway, potentially killing the two pedestrians? Either way, you’d be sitting there as a passenger, probably browsing the internet on your smartphone, with nothing you could do about it. As humans we’re wired to look after ourselves, but self-driving cars probably won’t think the same way. Who would buy a car that has been programmed to potentially sacrifice you, the owner?
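
To make the dilemma concrete, here’s a minimal sketch of the two competing policies, written in Python. It’s purely illustrative: the casualty figures come from the hypothetical scenario above, not from real crash predictions, and no manufacturer has published control code like this.

```python
from dataclasses import dataclass

@dataclass
class Outcome:
    """One possible manoeuvre and its predicted casualties (hypothetical figures)."""
    action: str
    occupant_deaths: int
    pedestrian_deaths: int

    @property
    def total_deaths(self) -> int:
        return self.occupant_deaths + self.pedestrian_deaths

# The two options from the scenario above: brake and take the collision,
# or swerve off the road where two pedestrians are walking.
OUTCOMES = [
    Outcome("brake", occupant_deaths=1, pedestrian_deaths=0),
    Outcome("swerve", occupant_deaths=0, pedestrian_deaths=2),
]

def utilitarian_choice(outcomes: list[Outcome]) -> Outcome:
    """Minimize the total loss of life, even if that sacrifices the occupant."""
    return min(outcomes, key=lambda o: o.total_deaths)

def self_protective_choice(outcomes: list[Outcome]) -> Outcome:
    """Protect the occupant at all costs, breaking ties on total deaths."""
    return min(outcomes, key=lambda o: (o.occupant_deaths, o.total_deaths))

print("Utilitarian car chooses:", utilitarian_choice(OUTCOMES).action)          # brake
print("Self-protective car chooses:", self_protective_choice(OUTCOMES).action)  # swerve
```

Stripped of all the sensor fusion and physics, the whole ethical debate comes down to which of those two one-line decision rules a manufacturer chooses to ship.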

To get some answers, Jean-Francois Bonnefon at the Toulouse School of Economics in France got together with a couple of colleagues and set out to find out. Bonnefon says that while there is no right or wrong answer to the question, public opinion will play a strong role in how future self-driving vehicles react. So the team posed a series of ethical dilemmas to a large number of people and collated the responses, reasoning that the public is much more likely to accept a scenario that aligns with its own views.

The participants were given scenarios similar to the one above, in which one or more pedestrians could be saved if the car were to swerve into a barrier instead, killing its occupant. At the same time, the researchers varied details such as the number of pedestrians that could be saved, whether the driver or an on-board computer made the decision to swerve, and whether the participants were asked to imagine themselves as the occupant or as an anonymous person.

In general, the participants were comfortable with the idea that self-driving vehicles should be programmed to minimize the death toll. That, after all, is surely the whole point of designing self-driving vehicles in the first place – to engineer human error out of the equation. But the participants were willing to go only so far. “Participants were not as confident that autonomous vehicles would be programmed that way in reality—and for a good reason: they actually wished others to cruise in utilitarian autonomous vehicles, more than they wanted to buy utilitarian autonomous vehicles themselves,” Bonnefon and his team concluded.

So essentially, people are generally in favour of cars that sacrifice the occupant to save other lives, as long as they don’t have to drive one themselves. Bonnefon and co. say these issues raise many important questions: “Is it acceptable for an autonomous vehicle to avoid a motorcycle by swerving into a wall, considering that the probability of survival is greater for the passenger of the car, than for the rider of the motorcycle? Should different decisions be made when children are on board, since they both have a longer time ahead of them than adults, and had less of a choice to be in the car in the first place? If a manufacturer offers different versions of its moral algorithm, and a buyer knowingly chose one of them, is the buyer to blame for the harmful consequences of the algorithm’s decisions?”
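
The first of those questions can be read as an expected-harm calculation. Here’s a hypothetical sketch of what that might look like: the survival probabilities below are invented purely for illustration, and any real system would have to estimate them from speed, impact angle and crash data.

```python
# Hypothetical expected-harm comparison for the motorcycle question above.
# All survival probabilities are invented for illustration only.

def expected_deaths(parties: dict[str, float]) -> float:
    """Sum of (1 - survival probability) over everyone a manoeuvre affects."""
    return sum(1.0 - p_survive for p_survive in parties.values())

options = {
    "hold course, hit motorcycle": {"car passenger": 0.95, "motorcyclist": 0.20},
    "swerve into wall":            {"car passenger": 0.70, "motorcyclist": 1.00},
}

for name, parties in options.items():
    print(f"{name}: expected deaths = {expected_deaths(parties):.2f}")

# hold course, hit motorcycle: 0.85 expected deaths
# swerve into wall:            0.30 expected deaths
# A purely utilitarian algorithm swerves into the wall, even though that is
# the worse outcome for the car's own passenger.
```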

These problems cannot be ignored, say the team: “As we are about to endow millions of vehicles with autonomy, taking algorithmic morality seriously has never been more urgent.”


5 comments

the guy with no name October 24, 2015 at 1:08 am

I don’t drive yet, but I’d never let a robot decide my fate or put my life in its hands. Also, people who like driving will never buy a car like that.

There’s also a problem with the scenario above: how would that truck swerve into your lane if it too is on autopilot?

Paul J Santos October 24, 2015 at 3:31 am

There are many reasons why the autonomous truck could swerve into your lane: perhaps it is trying to avoid pedestrians, school children or babies in strollers, or it simply had a flat tire or hit a patch of ice.

I think the only autonomous vehicles should be on RAILS so that they can’t get loose and kill us all.

Sean@MotoringBox October 24, 2015 at 5:23 pm

There are going to be many variables, such as slippery road surfaces or tyre blow-outs like you mentioned, but also flaws in the autonomous driving code caused by human error, dirty or malfunctioning sensors, or the prospect of vehicles being hacked.

Sean@MotoringBox October 24, 2015 at 5:16 pm

There’s going to be considerable overlap when it comes to self-driving vehicles and manually driven vehicles co-existing on the roads. For example, let’s say that by 2040 every single new car and truck on the market is completely autonomous: what’s going to happen to all the manually driven vehicles which have been sold up until that point?

Unless laws are passed to remove those older vehicles from the road, we could easily be faced with the prospect of shared roads until the end of the century at a minimum.

Vasudev August 2, 2016 at 11:23 pm

Simply removing old vehicles may not be an option. Who will part with a car bought only a few years ago and in pristine condition, just because the government says so? Provision to add self-driving capability should come independently of the cars themselves.

