From the episode, which unfortunately I haven't had time to finish but did happily get to start: the self-driving car that killed the pedestrian.
Here is a question for you: there's a driver in the car, and presumably that driver is supposed to notice when the car is not reacting appropriately to avoid the pedestrian and then seize the controls. But there are big problems with this, already explored in other occupations and industries. The less involved the human operator is, the harder it is to keep that person's attention focused on the driving just in case he or she has to take over. And the more partially involved the human operator is, the harder it is to know when it is the human's "turn," so to speak. Attention drift isn't eliminated that way either. So what's the answer? People are both bad drivers and bad monitors of robot drivers.
Re: Uber, I think this is their car and their self-driving tech that is responsible, and the driver is likely a juiceless turnip; imo they should settle as quietly as possible and give up their big dream of being first to this market.
Self-Driving Cars
Re: Self-Driving Cars
I saw the video and immediately realized that if I had been driving, I would have hit and killed that person too.
- mimekiller
- Standard Bearer
- Posts: 1713
- Joined: Mon Jun 27, 2016 6:16 pm
Re: Self-Driving Cars
Yeah, the details of this incident seem to be someone bolting from the shadows in a way that would be unlikely to end well no matter who or what was behind the wheel.
- Phoebe
- Canned Helsing
- Posts: 7208
- Joined: Tue Nov 15, 2016 9:42 pm
Re: Self-Driving Cars
It may be that a human driver would have killed the person. Or maybe not, because what is visible through that camera is not the same thing as what would have been visible to a person driving down the street. For one thing, the street is far better illuminated than it appeared in that video, as other people have shown by posting images of what the street actually looks like at night. For another, if you're a decent driver and you're driving in an area with visible pedestrians anywhere, you're not going to be going 5 miles over the speed limit the way this car was; apparently neither the car nor the driver was capable of paying attention to the surroundings at the time. In addition, people deal with even more sudden situations with deer all the time, and sometimes they hit them and sometimes they don't.
But either way, the point is not that a person would have done better; the point is that one of the most important reasons we are asked to accept these cars is that they do better than people. But do they? People are not only poor drivers, they are notoriously poor monitors of autonomous systems like these self-driving cars. These are things the military, for instance, has had to learn to counteract, because of the ways humans do or do not pay attention. If there is any role for a human monitor, then we have to address both forks of that problem: minimal monitoring goes along with minimal attention paid to the monitoring task, and heavier monitoring goes along with confusion over when the autonomous system is responsible and when the human is supposed to take a turn at controlling it.
Tangentially, I was stuck for a very long time today in a traffic jam caused by an accident. I don't know whether it was the kind of accident a self-driving automobile would have prevented, handled the same as a human, or handled worse. Anyway, as an experiment, since I was in a position to look straight through their windshields at slow speed, I watched the drivers going past me. I noticed how many were either holding a phone in their right hand or periodically looking down at a phone or whatever else was taking their attention off the road in front of them. Approximately 40% of the drivers were either actively looking at a phone or having their attention drawn away from the road by something (it could have been the radio, a soda can, or a cell phone held below the dashboard where I couldn't see it). It's bad enough that these people are already driving while juggling other distractions. But it's a fair bet they will become even more distracted if any part of their task is taken over by an autonomous system. So it becomes very important that the autonomous system depend on them hardly at all. Is that possible with the present level of technology?
And are we demanding - by "we" I mean anyone who has oversight over these decisions - that the companies that manufacture these vehicles reveal the basis for their decisions about how the car makes "choices" in potential life-or-death scenarios? Somewhere a human being, or a group of human beings, decided what was right and wrong for the car to do and programmed that into what the car is capable of. Given that I almost totally reject utilitarianism as a basis for ethics and find that most people are completely ass-backwards in the whole realm of moral reasoning, that's a problem for me.
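To make that point concrete, here is a purely hypothetical sketch (not Uber's actual logic, which has never been published) of how value judgments end up baked into code. Every name, threshold, and class list below is invented for illustration, but each one is somebody's moral choice dressed up as an ordinary engineering parameter.

```python
# Hypothetical sketch only: shows how ethical choices get encoded as
# ordinary-looking engineering parameters. None of this reflects any
# real vendor's software.
from dataclasses import dataclass

@dataclass
class DetectedObject:
    kind: str               # e.g. "pedestrian", "cyclist", "vehicle", "debris"
    confidence: float       # classifier confidence, 0.0 to 1.0
    seconds_to_impact: float

# Someone chose these numbers. Why 0.6 and not 0.3? Why brake only for
# "pedestrian" and "cyclist"? Each value is a judgment about acceptable risk.
BRAKE_CONFIDENCE_THRESHOLD = 0.6
PROTECTED_CLASSES = {"pedestrian", "cyclist"}
HARD_BRAKE_WINDOW_S = 2.0

def plan_response(obj: DetectedObject) -> str:
    """Return an action for a single detected object."""
    if obj.kind in PROTECTED_CLASSES and obj.confidence >= BRAKE_CONFIDENCE_THRESHOLD:
        if obj.seconds_to_impact <= HARD_BRAKE_WINDOW_S:
            return "hard_brake"
        return "slow_and_alert_driver"
    # Low-confidence or "unprotected" objects are ignored -- also a choice.
    return "continue"

if __name__ == "__main__":
    crossing = DetectedObject(kind="pedestrian", confidence=0.55, seconds_to_impact=1.8)
    # With these thresholds the car does nothing, because 0.55 < 0.6.
    print(plan_response(crossing))  # -> "continue"
```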
- Tahlvin
- Scottish Joker
- Posts: 5397
- Joined: Mon Apr 11, 2016 7:31 pm
Re: Self-Driving Cars
From the latest I've heard, the Uber car had fewer sensors on it than many other self-driving cars. One of the supposed advantages of self-driving technology, with lidar and all that other stuff, is that darkness is not supposed to be a factor. If you ask me, Uber screwed the pooch on this one, and the human monitor could have done little to avoid this accident. Yes, the chances might have been a little better had they been paying attention, but if I were assigning blame, I'd put 75% on the technology and 25% on the human monitor.
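A toy illustration of the "darkness shouldn't matter" claim, with made-up numbers and function names (this is an assumption about how fusion might work in principle, not any real perception stack): lidar is an active sensor, so its returns don't depend on ambient light, while a camera's usable confidence does.

```python
# Hypothetical illustration: a lidar return does not depend on ambient light,
# while the camera's effective confidence is assumed to scale with it.
# Toy numbers only, not any real vendor's perception code.
def fused_pedestrian_detected(lidar_hit: bool,
                              camera_confidence: float,
                              ambient_light: float) -> bool:
    """Combine a lidar hit with a camera detection.

    ambient_light runs from 0.0 (pitch dark) to 1.0 (daylight); the camera's
    usable confidence is assumed to scale with it, the lidar's is not.
    """
    effective_camera_conf = camera_confidence * ambient_light
    return lidar_hit or effective_camera_conf >= 0.5

# At night (ambient_light=0.1) the camera alone misses the pedestrian,
# but a lidar return should still trigger a detection.
print(fused_pedestrian_detected(lidar_hit=True, camera_confidence=0.9, ambient_light=0.1))   # True
print(fused_pedestrian_detected(lidar_hit=False, camera_confidence=0.9, ambient_light=0.1))  # False
```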
Wash: "This is gonna get pretty interesting."
Mal: "Define interesting."
Wash: "Oh, God, oh, God, we're all gonna die?"
Mal: "Define interesting."
Wash: "Oh, God, oh, God, we're all gonna die?"