WHY PEOPLE KEEP REAR-ENDING SELF-DRIVING CARS

POWER2

SEE CHARTS FOR ARTICLE HERE: https://www.wired.com/story/self-driving-car-crashes-rear-endings-why-charts-statistics/


Self-driving cars have been involved in nearly 50 crashes so far in 2018, in California alone. Why are so many of them rear-ended?
ANDREI STANESCU/ALAMY
The self-driving car crashes that usually make the news are, unsurprisingly, either big and smashy or new and curious. The Apple car that got bumped while merging into traffic. The Waymo van that got T-boned. And of course, the Uber that hit and killed a woman crossing the street in Tempe, Arizona, in March.

Look at every robo-car crash report filed in California, though, and you get a more mundane picture—but one that reveals a striking pattern. In September of this year, for example, three self-driving cars were side-swiped. Another three were rear-ended. One of them by a bicycle. And that’s not even the strangest one: In June, an AV operated by General Motors’ self-driving arm Cruise got bumped in the back—by a human driving another Cruise.

The people developing self-driving cars pitch them as a tool for drastically reducing the nearly 40,000 deaths that occur on US roads every year. Getting there will take years at least, probably decades, and that means a lot more time spent testing on public roads. And so these sorts of crashes raise a few questions: What’s the best way to handle what could become a nationwide experiment in robotics and AI, where the public participants haven’t willingly signed on, and the worst-case scenario is death?

We don’t have the answers. But chipping away at these questions starts with understanding the problem. And that means looking at the data.

Data that’s quite limited in terms of what’s public. These are companies in a competitive field, and they don’t voluntarily share much in the way of details. They only invite the press or public officials into their vehicles in tightly controlled situations where they perform well. And anecdotal evidence of weaknesses—like The Information’s report that Waymo cars have trouble with left-hand turns into traffic and frustrate human drivers—is, well, anecdotal.

Of the states where most AV developers do their on-road testing—California, Arizona, Nevada, Pennsylvania, and Michigan—only the Golden State requires companies to publicly report details about their programs. Once a year, they must submit a report to the DMV explaining how many miles they’ve driven and how often the human sitting in the car took the wheel. Any time one of their cars is in any sort of collision, no matter how minor, the developer must submit a collision report explaining what happened within 10 business days.

Since these regulations took effect in 2014, the DMV has received (and published) 104 autonomous vehicle collision reports, including 49 so far in 2018, as more and more cars hit the streets. Most crashes are minor; few are newsworthy. But taken together, they present a picture of how these tests are progressing, and how well robots are sharing the road. And they hint at a conclusion similar to what anecdotal evidence suggests—that these vehicles drive in ways humans might not expect, and might not want them to.

As this chart shows, in 2018 so far, companies have filed 49 reports. GM’s Cruise has filed by far the most, but don’t read too much into that. If the pattern holds from 2016 to 2017 (we won’t have 2018 numbers until early 2019), Waymo has been dialing down its testing in California in favor of Arizona. Cruise has been ramping it up, and does its driving in the chaos of San Francisco. Waymo has the second-most collisions, followed by Zoox, a startup that also tests in the city.

These reports, written and filed by the companies running the cars, consist mostly of checkboxes, with a line or two explaining what happened. Some detail thankfully freaky, presumably rare incidents (“The Cruise AV was struck by a golf ball from a nearby golf course.”). Some reveal what we’ll call exasperation on the part of other road users (“The driver of the taxi exited his vehicle, approached the Cruise AV, and slapped the front passenger window, causing a scratch.”).

Other sorts of crashes happen more frequently.

Drilling down into the data shows that autonomous vehicles being rear-ended accounts for 28 of the 49 filed reports, well over half. Next come sideswipe collisions, with pedestrians, hitting objects, and “other” all trailing behind. (These categories are provided as checkboxes on the DMV report form. The two pedestrian impacts reported are people coming up to and hitting cars.)

So let’s look at those rear-end crashes. Under state law, if someone hits you from behind, it’s their fault. And yes, today’s drivers are dangerously distracted, and no, it doesn’t take much of a mistake to knock into somebody in stop-and-go traffic.

But combine that with the fact that the computer was in charge in 22 of those 28 rear-end crashes, and you have reason to believe the AVs are doing something that makes the cars behind them more likely to hit them. Maybe that’s herky-jerky driving (as we experienced in a Cruise car in San Francisco in November), or stopping for no clear reason (as we experienced in an Uber car in Pittsburgh last year). That’s not necessarily a bad thing. It indicates a conservative focus on safety: better to stop for a fire hydrant than run down a preschooler. But part of being a good driver is behaving in a way others expect, which doesn’t include constantly stamping on the brakes.
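
For readers who want to check the arithmetic behind that claim, here is a quick back-of-envelope tally in Python. The counts (49 reports, 28 rear-endings, 22 with the computer driving) are the ones cited above; the script itself is purely illustrative.

    # Counts from the 2018 California DMV collision reports cited above.
    total_reports = 49
    rear_endings = 28
    rear_endings_autonomous = 22  # computer was in charge when hit from behind

    # Share of all reported crashes that were rear-endings.
    rear_end_share = rear_endings / total_reports               # ~0.57
    # Share of rear-endings that happened in autonomous mode.
    autonomous_share = rear_endings_autonomous / rear_endings   # ~0.79

    print(f"Rear-endings: {rear_end_share:.0%} of all reports")
    print(f"Computer driving in {autonomous_share:.0%} of rear-endings")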

The runner-up in terms of crash types is sideswipes, many of which appear to involve human drivers frustrated at getting stuck behind a slow or stopped AV, trying to overtake it, and not quite making it.

It’s not possible to say definitively that the AVs are driving like jerks, because the reports don’t ask the companies what percentage of miles the cars have covered in each mode. It could be that there are fewer crashes in manual mode because the cars are hardly ever in that mode. “They’re autonomous vehicles, they spend most of their time being autonomous,” says Kyle Vogt, co-founder and CEO of Cruise, General Motors’ self-driving car division. But Cruise declined a request to give exact numbers.

Cruise is heavily represented in these reports because of the size of its fleet in San Francisco. That city is tough even for an experienced human driver to navigate: a land of knotty intersections, trolleys, cyclists, pedestrians, road works, steep hills, and aggressive drivers. Cruise says that helps it learn much faster than it does on the relatively simple, boring streets of Arizona, where it also tests. It says its cars encounter emergency vehicles 46 times more frequently in SF than in the Phoenix suburbs, and construction 39 times more often.

Researchers agree that AVs won’t get to a point where they can make driving safer without testing on public roads. But that brings up other questions, says Matthew Johnson-Roberson, who co-directs the Ford Center for Autonomous Vehicles at the University of Michigan. “Should they all be allowed to be on public roads before passing some level of baseline performance?” he says. California requires companies to apply for the right to test AVs in public, but that doesn’t involve any kind of exam. “My personal advice is to treat the vehicles incredibly cautiously.”


Vogt says the California crash reports make clear that humans expect other humans to bend or break traffic rules, rolling through four-way intersections, accelerating to make a yellow light, or cruising over the speed limit. But his robots won’t follow suit.

“We’re not going to make vehicles that break laws just to do things like a human would,” he says. “If drivers are aware of the fact that AVs are being lawful, and that’s fundamentally a good thing because it’s going to lead to safer roads, then I think there may be a better interaction between humans and AVs.”

So maybe the key there is awareness. The public would benefit from knowing more about these vehicles, how they work, how they’re tested, and how they’re likely to behave. That takes companies communicating openly and honestly about how development is going, and how capable their cars are, rather than releasing their usual fare: glossy, edited videos, or PR documents showing their tech at its best.

Short of requiring some sort of test, one easy change could be basic standards for how these vehicles are marked, alerting other road users to how they usually drive. Think of the stickers many countries require newer drivers to display in their vehicles (like the “L” for learners in the UK). Or the signs on vans and trucks saying “this vehicle makes frequent stops,” “this vehicle stops at all railroad crossings,” and “does not turn right on red.” The big tech companies might not like this kind of messaging, but it could help the other folks on the road adapt to their new robotic friends.

Self-driving cars aren’t necessarily going to be worse than your standard teenager, but they will certainly do things that humans wouldn’t. So if you do encounter one, don’t get distracted by your phone when you’re behind it. Give it a lot of stopping distance. Expect to see something weird. And hope to get a ride someday.
 
Dude, nobody is going to read all of that. That shit's longer than even my OPs.
 
Self-driving cars have been involved in nearly 50 crashes so far in 2018

That seems incredibly low. This experiment seems to pretty much be working so far.
 
And so these sorts of crashes raise a few questions: What’s the best way to handle what could become a nationwide experiment in robotics and AI, where the public participants haven’t willingly signed on, and the worst-case scenario is death?

Indeed
 
But everyone here at sherdog told me self-driving cars are the future and we won't have a choice. People don't seem to realize technology glitches and crashes all the time, and to trust your life to it is a big mistake. It won't be until the roads are redone with sensors and other things that autonomous vehicles will ever be a reality. To expect them to work in the world the way it is now is asking for problems.
 
That seems incredibly low. This experiment seems to pretty much be working so far.

On top of that, 28 of them were people rear-ending the vehicle, which means it's the other person's fault. You should always be leaving enough room to stop at any point behind another car. Including a random unexpected stop.

The sideswipes seem to be from people trying to get ahead of the car for going the speed limit as well. So basically people continue to cause wrecks because they can't follow basic driving laws/tips.
 
Dude, nobody is going to read all of that. That shit's longer than even my OPs.

Agreed. But people rear-end everybody, including self-driving cars, because nobody follows the two-second rule. Everyone thinks they’re special because they are such a good driver that the rules don’t apply to them. And then they rear-end people.
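
For anyone wondering what the two-second rule actually works out to, it's just two seconds' worth of travel at your current speed. A rough sketch in Python, with example speeds picked purely for illustration:

    # Following distance under the two-second rule: speed x 2 seconds.
    MPH_TO_FT_PER_SEC = 5280 / 3600  # 1 mph is about 1.47 ft/s

    for speed_mph in (30, 45, 65):  # example speeds, not from the thread
        gap_ft = speed_mph * MPH_TO_FT_PER_SEC * 2
        print(f"{speed_mph} mph -> keep roughly {gap_ft:.0f} ft of gap")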
 
Because jackasses don't know how to drive.

In a 30 mph zone, you don't go 40, you fucks.
 
But everyone here at sherdog told me self-driving cars are the future and we won't have a choice. People don't seem to realize technology glitches and crashes all the time, and to trust your life to it is a big mistake. It won't be until the roads are redone with sensors and other things that autonomous vehicles will ever be a reality. To expect them to work in the world the way it is now is asking for problems.
This points the fault at the human drivers, because the self-driving cars are being rear-ended. You have an illogical stubbornness regarding this issue. It's currently far from perfect but still exponentially safer than human drivers.
 
This points the fault at the human drivers, because the self-driving cars are being rear-ended. You have an illogical stubbornness regarding this issue. It's currently far from perfect but still exponentially safer than human drivers.
And you clearly didn't read my whole post. I see a solution, just not with the current infrastructure we have now. But no, I'm just stubborn. You chose to read my reply in one way when I clearly left the option open. You're the one being illogically stubborn.
 
I have a 90-minute commute, and have seen enough of humans driving. They're terrible at it. Bring on the self-driving cars.
 
And you clearly didn't read my whole post. I see a solution, just not with the current infrastructure we have now. But no, I'm just stubborn. You chose to read my reply in one way when I clearly left the option open. You're the one being illogically stubborn.

The problem with your statement is that the industry, as in the people actually developing this technology, completely disagrees. So whose word should we take: a random sherdogger not involved in this industry/technology at all, or the people actually working in this field?

I have a 90-minute commute, and have seen enough of humans driving. They're terrible at it. Bring on the self-driving cars.

Yep, I've seen the same during my 90-minute commute, which thank god I don't have to do anymore. I've been paying $10 for like 6 months cause I don't even want to drive to the gym out there to cancel it.
 
Most people are awful drivers. Makes sense most of the accidents are the cars getting rear-ended. A self-driving car isn't going to react the same way as a person when it gets tailgated. The self-driving car doesn't give a fuck; it's not gonna speed up.
 
I wonder if people get distracted by the fact that they see a self-driving car and forget to pay attention.
 
And you clearly didn't read my whole post. I see a solution, just not with the current infrastructure we have now. But no, I'm just stubborn. You chose to read my reply in one way when I clearly left the option open. You're the one being illogically stubborn.
My point is that even with the current infrastructure it is safer; the danger, as always, is the unpredictability of human drivers. And I say this as someone who loves driving: it is absurd for society to revolve around an activity that has such a high fatality rate.
 
My point is that even with the current infrastructure it is safer; the danger, as always, is the unpredictability of human drivers. And I say this as someone who loves driving: it is absurd for society to revolve around an activity that has such a high fatality rate.

Agreed, way too many people die in auto accidents. I don't really enjoy driving, but I understand that people do love it. At the end of the day it will be completely illegal to drive manually (other than in emergencies, I'm sure) and that's the way it should be.
 
I agree it's risky to release the T-1000 on the street for training, but you're statistically more likely to die by flying swordfish impalement.

Let 'em learn in small-scale testing... I don't think they will ever get a good enough AI to be used en masse.
 
Agreed, way too many people die in auto accidents. I don't really enjoy driving, but I understand that people do love it. At the end of the day it will be completely illegal to drive manually (other than in emergencies, I'm sure) and that's the way it should be.
Within a hundred years, we will look back on the 20th century and find it comical how society found it acceptable for everyone to operate 2-ton steel machines with the capability to kill a crowd. It will seem even more anachronistic than the horse and carriage does to us. And like horses, it will only exist as a hobby for the rich.
 