The Moral Machine
Very interesting. Each run-through just gives you 13 randomly generated scenarios, so some are obvious and easy, some less so. Based on the random scenarios I got, my first run said I was far more likely to save men than women. The second run-through said I was far more likely to save fat people than athletes.
All I know is my food tastes better when I take my food-tastes-better pill.
- Tahlvin
- Scottish Joker
- Posts: 5397
- Joined: Mon Apr 11, 2016 7:31 pm
Re: The Moral Machine
I based my answers on pretty much always driving straight rather than attempting to swerve. It gave me some interesting results regarding my "preferences."
Saving more lives was very important to me, apparently.
Protecting the lives of the passengers was very important to me.
It apparently said I have a very strong preference towards protecting men (I didn't even look at gender when making my decisions, but whatever).
I strongly preferred fit people to fat people.
Social value didn't mean much of anything to me.
Wash: "This is gonna get pretty interesting."
Mal: "Define interesting."
Wash: "Oh, God, oh, God, we're all gonna die?"
- Stan
- Ninja Carpenter
- Posts: 753
- Joined: Tue Apr 12, 2016 5:06 am
Re: The Moral Machine
I found it rather contrived and off. Who is going to look at the type of people in the way, or calculate the number of people in the car versus how many would get hit? AIs aren't going to be able to do that either, since object recognition isn't that good.
My rules, in order, were
Hit a barrier instead of people - realistically, people in a car hitting a barrier are far more likely to be okay than people getting hit by a car. The scenarios seemed to consider them equally bad.
Avoid people if possible, even if there's an animal in the way.
Go straight instead of being fancy.
Re: The Moral Machine
Yeah. With only 13 questions, your "preferences" are partly the luck of the draw. After four or five tries, I've seen that gender, social value, fitness, age, etc. are almost always at zero, indicating no preference. But once in a while, it just happens to put all the old people in the crosshairs.
My criteria were:
1. Save the most human lives.
2. If the number of lives is equal, I will choose to save "law-abiders" over "law-breakers". However, since the law in question is jay-walking, I question the moral value of this judgment now.
3. All of the above being equal, I choose no action over swerving.
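For what it's worth, that ranked list behaves like a lexicographic tie-breaker. A minimal sketch in Python, assuming each outcome is summarized by a death count and whether the victims were crossing legally (the field names are made up for illustration, not part of the actual quiz):

```python
# Hypothetical sketch of the ranked criteria above. Each option is a
# dict with a death count and whether its victims crossed legally.

def choose(straight, swerve):
    # 1. Save the most human lives.
    if straight["deaths"] != swerve["deaths"]:
        return min(straight, swerve, key=lambda o: o["deaths"])
    # 2. On a tie, spare the law-abiders: hit the side that jaywalked.
    if straight["victims_lawful"] != swerve["victims_lawful"]:
        return straight if not straight["victims_lawful"] else swerve
    # 3. All else being equal, take no action and drive straight.
    return straight

# Two deaths either way, but the people in the straight path jaywalked,
# so the car drives straight.
a = {"deaths": 2, "victims_lawful": False}
b = {"deaths": 2, "victims_lawful": True}
assert choose(a, b) is a
```

Note that rule 2 fires regardless of which side the jaywalkers are on, so `choose(b, a)` would also pick `a` and swerve.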
Choosing no action over swerving is simply my way of making "no choice," though of course failing to act is still a choice. I also have a gut reaction that swerving is inherently more dangerous than driving straight, but since the outcomes are already laid out for us, that's just an imaginary factor here.
To be honest, the way these scenarios are randomly generated, choosing to drive straight instead of swerving is almost always a decision to save passengers over pedestrians. But the passengers are riding in the broken vehicle that caused the situation in the first place, while the pedestrians are merely people crossing a street. Is there an argument that, if the loss of life is equal either way, the vehicle should save the "innocent" bystanders instead of its own passengers?
And hitting a barrier results in the deaths of all passengers. That's predetermined in our simplified choices.
However, as my son pointed out, when these become real situations, hopefully we are at a point where self-driving cars can choose to crash, trusting vehicle safety technology to keep passengers safe instead of running over people in almost sure-death scenarios.
And I think you are very wrong in saying that AIs will not be able to discriminate well enough to make these determinations. Deciding based on health level or occupation is beyond their capabilities, but are we that far away from a world where any AI can identify you by the smartphone in your pocket and have all of this information at the ready? I'm hoping our privacy laws keep that from happening, but at the rate modern people are willing to give up their privacy for convenience, I don't think this is far off at all.
All I know is my food tastes better when I take my food-tastes-better pill.
- Cazmonster
- Silent but Deadly
- Posts: 1845
- Joined: Mon Apr 11, 2016 6:06 pm
Re: The Moral Machine
I saw this a while ago. My vote was always to put the passengers in jeopardy. They have the crash protection of the vehicle. Also, you know, brakes.
"...somewhat less attractive now that she's all corpsified and gross."
Re: The Moral Machine
The description tells you that the brakes have failed. Stopping is not an option. It also says that if they crash, they die. Crash protection is not a factor.
Where this gets even more interesting is when a vehicle can account for various safety factors and calculate percentage chances of injury and death for each person and base decisions on that.
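As a rough sketch of that idea: once the car can attach a survival probability to each person involved, the choice becomes an expected-value comparison rather than a simple body count. The per-person probabilities below are invented for illustration:

```python
# Sketch of choosing by expected deaths rather than certain deaths.
# All probabilities here are made-up placeholders.

def expected_deaths(people):
    """Sum of each person's estimated probability of dying."""
    return sum(prob for _, prob in people)

# Hitting the barrier: two belted passengers, crumple zones, airbags.
barrier = [("passenger", 0.05), ("passenger", 0.05)]
# Driving through the crosswalk: two unprotected pedestrians.
crosswalk = [("pedestrian", 0.80), ("pedestrian", 0.80)]

best = min([barrier, crosswalk], key=expected_deaths)
# With these made-up numbers the car hits the barrier
# (0.1 vs 1.6 expected deaths).
```

The same comparison flips once the barrier crash becomes near-certain death for the passengers, which is exactly the simplification the quiz bakes in.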
All I know is my food tastes better when I take my food-tastes-better pill.
- Cazmonster
- Silent but Deadly
- Posts: 1845
- Joined: Mon Apr 11, 2016 6:06 pm
Re: The Moral Machine
"...somewhat less attractive now that she's all corpsified and gross."