It broke Asimov's First Law. It should be scrapped.
It laughed. Menacingly.
Just read I, Robot. Great book, but it highlights a real dilemma. What if the car has to choose between killing you, say by driving off a cliff, and killing a pedestrian? Or several pedestrians? How do we program these cars? A survey found most people want their car to put them first. lol. But it raises genuine ethical questions: there are situations where humans will be harmed no matter what the car chooses. Should your car sacrifice you to save schoolchildren or other people? How do we weigh that? Do the needs of the many outweigh the few? Does it just pick whichever option has fewer casualties? I don't know.
Uber has suspended testing btw.
I still strongly feel self-driving cars are already safer than human drivers, both for the occupants and for everyone around them, and with more work they'll become safer still.
But stories like this suck. And they get attention far out of proportion to their share of serious injuries relative to human drivers. Hopefully it's used to push the autonomous-vehicle market to become ever more consistently safe, rather than putting people off development.
The difference is that when a person is operating the vehicle, there's someone to hold accountable. What do you do when it's in autonomous mode? Crush the car and say sorry about your luck? It's not good, as the news will pick up on this (as apparently will this forum).
I'd be interested to know the number of deaths per kilometer driven autonomously vs. deaths per kilometer driven traditionally. One incident is an outlier and a lot more data would be needed, but that would be the way to figure out whether this is a big problem or not.
Not much evidence of that
National average in America = 1.25 deaths per 100 million miles driven
Uber has reportedly driven only 2 million miles
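Taking the two figures quoted above at face value (the 1.25 deaths per 100 million miles national average and Uber's reported 2 million autonomous miles), a back-of-envelope comparison looks like this. One fatality is obviously far too small a sample to draw conclusions from, as the post above notes, but it shows how the per-mile comparison would be computed:

```python
# Figures taken from the posts above; both are rough numbers, and a
# single fatality is statistically almost meaningless.
US_DEATHS_PER_100M_MILES = 1.25   # quoted US national average
UBER_MILES = 2_000_000            # reported Uber autonomous miles
UBER_DEATHS = 1                   # the Tempe crash

# Normalize Uber's record to deaths per 100 million miles.
uber_rate = UBER_DEATHS * 100_000_000 / UBER_MILES   # 50.0
ratio = uber_rate / US_DEATHS_PER_100M_MILES         # 40.0

print(f"Uber: {uber_rate:.1f} deaths per 100M miles "
      f"({ratio:.0f}x the national average)")
```

On these (very noisy) numbers, one death in 2 million miles works out to 50 deaths per 100 million miles, about 40 times the quoted human-driver average; the point of the posts above is that many more autonomous miles are needed before such a ratio means anything.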
I believe it will always try to avoid the pedestrian. The driver/passenger is in a 2,000 lb piece of metal with multiple safety features, compared to a bag of flesh and bones.
The difference is that when a person is operating the vehicle, there's someone to hold accountable. What do you do when it's in autonomous mode? Crush the car and say sorry about your luck?
- Tempe police are investigating a deadly crash involving a self-driving Uber vehicle overnight.
The Uber vehicle was reportedly driving early Monday when a woman walking outside of the crosswalk was struck.
The woman was taken to the hospital where she died from her injuries.
Tempe Police says the vehicle was in autonomous mode at the time of the crash and a vehicle operator was also behind the wheel.
https://www.abc15.com/news/arizona-...-driving-uber-car-involved-in-crash-overnight
Bad news for Uber regardless.
Good point. The car's cameras may not have even seen the biker. I've actually thought about this a couple of times when I couldn't see shit out of my backup camera. I'm like, umm, what happens if this car is supposed to drive itself right now?
Obviously they're going to be better cameras, but at the end of the day they're still small cameras, and those can get mud, water, etc. on them.

Yep, that's how far hundreds of millions of dollars in autonomous-driving research and development have gotten us: the same experience as you looking through your backup camera.