Should a driverless car kill its passenger to save five strangers?
Programming self-driving cars isn't just about teaching them to avoid obstacles and navigate intersections. It also involves teaching them how to make tough ethical choices.
"A driverless car is on a two-way road lined with trees when five kids suddenly step out into traffic. The car has three choices: to hit the kids, to hit oncoming traffic or to hit a tree.
The first risks five lives, the second risks two, and the third risks one. What should the car be programmed to choose? Should it try to save its passenger, or should it save the most lives?"
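To see why this is a programming question and not just a philosophical one, here is a minimal sketch of what a purely "save the most lives" rule might look like. It assumes the car can only compare estimated lives at risk for each option; the option names and counts are taken from the scenario above and are illustrative, not how any real self-driving system actually decides.

```python
# Illustrative only: a toy utilitarian decision rule, not a real
# self-driving system. Risk counts come from the scenario above.
options = {
    "hit the kids": 5,           # five children at risk
    "hit oncoming traffic": 2,   # oncoming driver plus the passenger at risk
    "hit a tree": 1,             # only the car's own passenger at risk
}

# Pick the option that puts the fewest lives at risk.
choice = min(options, key=options.get)
print(f"Choice: {choice} ({options[choice]} lives at risk)")
```

Even this tiny rule makes the dilemma concrete: the "best" answer it computes is the one that sacrifices the car's own passenger.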
"Would you be willing to get in a car knowing it might choose to kill you?"
Should driverless cars be programmed to make this type of decision?
Who would be responsible if a driverless car killed someone?