It's Tuesday and you're looking for a way to waste some time between Facebook posts. Maybe you should check out Moral Machine, a new game from the MIT Media Lab that puts you in the role of a self-driving car.
Why would you want to do that? To experience some of the many moral dilemmas autonomous vehicles will face.
Autonomous cars are on the tip of everyone's tongue these days, and according to automakers, they'll soon be parked in garages around the globe. But first, software developers have to "teach" self-driving cars a few things, like how to drive in rain and fog, and how to drive like a human.
And of course: when to kill.
There's been plenty of discussion about that last point in recent years, as everyone has come to realize that sometimes, self-driving cars will be forced into unwinnable situations. The human driver of another vehicle will veer out of their lane, unmapped construction zones will present unforeseen obstacles, or a horde of puppies will bound into the road.
In cases like these, self-driving cars will have to make split-second ethical decisions. But unlike humans, who might sometimes make the selfish choice and save themselves instead of the van full of retirees, computers in autonomous cars will need to abide by strict moral codes.
This is, of course, the kind of stuff that keeps people like Elon Musk and Stephen Hawking awake at night. If computers aren't endowed with the right kind of artificial intelligence, they could make some awful decisions on our behalf (cf. I, Robot or Ex Machina, among many, many other films).
Are humans equipped to take on that programming task? If Microsoft's recent foray into AI chatbots is any guide, the answer is a resounding, flat-out, absolute no.
And that's why MIT's Media Lab launched Moral Machine. In it, you're the AI behind a self-driving car, and your task is to determine who lives, and who dies, in a given scenario.
Don't find the preset scenarios complicated enough? You can even design your own.
MIT doesn't suggest that responses to the online game will have any impact on AI or autonomous car development, but it's interesting to see the dilemmas they're attempting to work through.
You can learn more and play a round (or seven) of Moral Machine here.