With the exception of "The Back Alley", CIVIL DISCUSSION IS EXPECTED.
Should a driverless car kill its passenger to save five strangers?
Programming self-driving cars isn't just about teaching them to avoid obstacles and navigate intersections. It also involves teaching them how to make tough ethical choices.
"A driverless car is on a two-way road lined with trees when five kids suddenly step out into traffic. The car has three choices: to hit the kids, to hit oncoming traffic or to hit a tree.
The first risks five lives, the second risks two, and the third risks one. What should the car be programmed to choose? Should it try to save its passenger, or should it save the most lives?"
"Would you be willing to get in a car knowing it might choose to kill you?"
SHOULD driverless cars be programmed to make this type of decision?
Who would be responsible if a driverless car killed someone?
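The scenario in the opening post boils down to picking the maneuver that puts the fewest lives at risk. A minimal sketch of that "save the most lives" rule, using the three options and counts from the quote above (this is a toy illustration, not how any real autonomous-vehicle system actually decides):

```python
# Toy sketch of a purely utilitarian rule: pick the maneuver that
# risks the fewest lives. Options and counts come from the scenario
# quoted above; no real self-driving stack works this simply.

def choose_maneuver(options):
    """Return the option with the lowest 'lives_at_risk' count."""
    return min(options, key=lambda o: o["lives_at_risk"])

scenario = [
    {"name": "hit the kids", "lives_at_risk": 5},
    {"name": "hit oncoming traffic", "lives_at_risk": 2},
    {"name": "hit a tree", "lives_at_risk": 1},  # only the passenger
]

print(choose_maneuver(scenario)["name"])  # prints "hit a tree"
```

Note that this rule always sacrifices the passenger whenever that minimizes the count, which is exactly the outcome the question "Would you be willing to get in a car knowing it might choose to kill you?" is probing.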
In that situation, a driverless car should obey the law: stay on the road in its own lane and brake hard. The five kids were breaking the law, and it seems unethical to risk the lives of law-abiding people to save the lives of lawbreakers.
An electric driverless car has no engine up front, so the hood can be low and soft. It could hit pedestrians low and cause them to tumble over the car, which would be safer than being struck by a vehicle with a higher hood.
[Photos: an EV with a low hood and an SUV with a higher hood. The EV looks like it would be safer in a pedestrian collision.]