Self-driving Uber car kills human.

The article said the bitch was jaywalking. She probably walked right out in front of the car.
 
It broke Asimov's First Law. It should be scrapped.


[GIF]



It laughed. Menacingly.


Just read I, Robot. Great book, but it raises a real problem. What if the car has a choice between killing you, say by driving off a cliff, or killing a pedestrian? Or many pedestrians? How do we program these cars? There was a survey done and most people want their car to put them first, lol. But it raises interesting ethical questions. There are situations where humans will be harmed no matter the choice. Should your car kill you to save school children or other people? How do we measure this? The needs of the many outweigh the few? Whichever option has fewer casualties gets picked? I don't know.
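Just to make the "whichever option has fewer casualties" idea concrete, here is a toy sketch of what such a rule might look like. Everything in it (the manoeuvre names, the outcome probabilities, the expected_casualties and choose_maneuver helpers) is invented for illustration; it is not how Uber or anyone else actually programs their cars.

```python
# Hypothetical sketch of a "minimise expected casualties" rule.
# All names, probabilities and numbers are made up for illustration only.

def expected_casualties(option):
    """Expected number of people harmed if this manoeuvre is chosen."""
    return sum(prob * victims for prob, victims in option["outcomes"])

def choose_maneuver(options):
    """Pick whichever manoeuvre has the lowest expected casualties."""
    return min(options, key=expected_casualties)

options = [
    # each outcome is (probability, number of people harmed)
    {"name": "brake hard",       "outcomes": [(0.3, 1), (0.7, 0)]},
    {"name": "swerve off cliff", "outcomes": [(0.9, 1)]},   # the occupant
    {"name": "hold course",      "outcomes": [(0.8, 3)]},   # the pedestrians
]

print(choose_maneuver(options)["name"])   # "brake hard" in this toy example
```

Of course this dodges the actual ethical question: whether the occupant counts the same as a pedestrian, or a child the same as an adult, is exactly the part nobody agrees on.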

Uber has suspended testing btw.
 
Just read I, Robot. Great book, but it raises a real problem. What if the car has a choice between killing you, say by driving off a cliff, or killing a pedestrian? Or many pedestrians? How do we program these cars? There was a survey done and most people want their car to put them first, lol. But it raises interesting ethical questions. There are situations where humans will be harmed no matter the choice. Should your car kill you to save school children or other people? How do we measure this? The needs of the many outweigh the few? Whichever option has fewer casualties gets picked? I don't know.

Uber has suspended testing btw.

Personally, I think Asimov's laws are outdated, and I think Skynet would agree. I say kill them all.

I saw that they suspended testing.
 
Just read I, Robot. Great book, but it raises a real problem. What if the car has a choice between killing you, say by driving off a cliff, or killing a pedestrian? Or many pedestrians? How do we program these cars? There was a survey done and most people want their car to put them first, lol. But it raises interesting ethical questions. There are situations where humans will be harmed no matter the choice. Should your car kill you to save school children or other people? How do we measure this? The needs of the many outweigh the few? Whichever option has fewer casualties gets picked? I don't know.

Uber has suspended testing btw.

I believe it will always try to avoid the pedestrian. The driver/passenger is in a 2,000 lb piece of metal with multiple safety features, compared to a bag of flesh and bones.
 
I still strongly feel that self-driving cars are already safer than human drivers, both for the occupants and for others around them, and with more work they will be even more so.

But stories like this suck, and they get far more attention than their share of serious injuries relative to human drivers would warrant. Hopefully it gets used to push the autonomous auto market toward becoming even more consistently safe, rather than putting people off development.

Not much evidence of that

National average in America = 1.25 deaths per 100 million miles driven
Uber has reportedly driven only 2 million miles
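
Putting those two figures together (and I haven't verified either of them beyond what's quoted above), the back-of-the-envelope arithmetic looks something like this:

```python
# Rough comparison using the figures quoted above.
human_rate = 1.25 / 100_000_000   # deaths per mile, US national average
uber_miles = 2_000_000            # miles Uber has reportedly driven

# Deaths you'd expect from average human drivers over the same distance:
print(human_rate * uber_miles)    # 0.025

# One actual fatality over those miles, as a ratio to the human rate:
print((1 / uber_miles) / human_rate)   # 40.0, i.e. ~40x the human average
```

A sample of one fatality obviously can't pin down the true rate, but it shows why 2 million miles is nowhere near enough data to claim parity with human drivers.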
 
It's not good, as the news will pick up on this (as apparently will this forum).

I'd be interested to know the number of deaths per kilometre driven autonomously vs. deaths per kilometre driven traditionally. One incident is an outlier and a lot more data would be needed, but this would be the way to figure out whether this is a big problem or not.
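
To put a rough number on how little one incident tells us: treating fatalities as a Poisson process (a simplification, and using the per-mile figures quoted elsewhere in the thread), you can ask how likely a single death in 2 million miles would be even for a fleet exactly as safe as average human drivers.

```python
import math

human_rate = 1.25 / 100_000_000   # deaths per mile, figure quoted earlier
miles = 2_000_000                 # miles reportedly driven autonomously

# Expected fatalities if the autonomous fleet were exactly as safe as humans:
expected = human_rate * miles                 # 0.025

# Chance of seeing at least one fatality anyway, purely by bad luck:
print(1 - math.exp(-expected))                # ~0.0247, roughly 1 in 40
```

So one death suggests the autonomous rate may be worse, but it's a long way from settling the per-kilometre comparison; you'd want hundreds of millions of miles before the two rates are really comparable.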
The difference being that when a person is operating the vehicle, there is someone to hold accountable. What do you do when it's in autonomous mode? Crush the car and say sorry about your luck?
 
Not much evidence of that

National average in America = 1.25 deaths per 100 million miles driven
Uber has reportedly driven only 2 million miles

Thank you for finding these numbers. I'll be very interested in keeping track of them.

I was basing my opinion on an article I read a couple of years ago about Google's 3 million miles driven and Waymo's 4 million, with them involved in no major accidents and most of the fender benders being the fault of the human drivers hitting them.

Could be a difference in tech and capabilities between the companies? Could be that this was the one serious incident and they'll go hundreds of miles without another now. With the rate I hear about deadly car crashes just on the highways around my 70,000-pop town, I would have thought the fatality rate was significantly higher than it apparently is. Your numbers check out and I learned something today. Thank you, and with this sad news about Uber's fatality it seems even these less-than-one-in-a-million mistakes (unavoidable situations?) from a computer can be worse than what humans manage.
 
I believe it will always try to avoid the pedestrian. The driver/passenger is in a 2,000 lb piece of metal with multiple safety features, compared to a bag of flesh and bones.

There are 8 billion flesh bags walking the earth. We can't blame the machines for hating us.
 
There will always be casualties, but as long as it isn't any more dangerous than manual driving, it shouldn't be an issue. Hell, it might even be worth accepting a slightly higher casualty rate for greater convenience; we do this all the time, e.g. we could save a few more lives by reducing the speed limit, but we don't because it would be massively inconvenient to the majority.
 
The difference being that when a person is operating the vehicle, there is someone to hold accountable. What do you do when it's in autonomous mode? Crush the car and say sorry about your luck?

I'm not sure about laws in the US, but in Australia you are required to remain behind the wheel and in control of the car. So although it is 'autonomous' at the time, you must be there with your hands basically on the wheel and in control.

I would imagine that if the law evolves it could be a liability issue for Tesla.
 
If self-driving vehicles kill enough people, we won't have as large an unemployment problem when self-driving vehicles put millions out of work. So there is an upside.
 
Tempe police are investigating a deadly crash involving a self-driving Uber vehicle overnight.

The Uber vehicle was reportedly driving early Monday when a woman walking outside of the crosswalk was struck.

The woman was taken to the hospital where she died from her injuries.

Tempe Police says the vehicle was in autonomous mode at the time of the crash and a vehicle operator was also behind the wheel.

https://www.abc15.com/news/arizona-...-driving-uber-car-involved-in-crash-overnight


Bad news for Uber regardless.

Sounds like 50 points.
 
Good point. The car's cameras may not have even seen the biker. I've actually thought about this a couple of times when I couldn't see shit out of my backup camera. I'm like, umm, what happens if this car was supposed to drive itself right now?

Yep, that's how far hundreds of millions of dollars in research and development of autonomous driving have gotten us. The same experience as you looking through your backup camera.
 
Yep, that's how far hundreds of millions of dollars in research and development of autonomous driving have gotten us. The same experience as you looking through your backup camera.
Obviously they are going to be better cameras, but at the end of the day they're still going to be small cameras, and those can get mud and water etc. on them.
 
Can't wait until they say the bicycle malfunctioned and the car was not at fault.
 