Uber self-driving car kills pedestrian in first fatal autonomous crash

Exactly. Even before cell phones, drivers messed with the radio, read books, put on make-up. Humans are imperfect.

I used to drink beer, roll joints and fool around with the ol' lady goin' down the road...

Now every busybody will call and turn you in....
 
The same way everybody else knows that any and every human would certainly have stopped in time, I suppose.

I have not seen one single person say anyone could have stopped in time. What are you reading and why are you coming here to bitch about it?
 
Unlikely anyone's reaction time was enough to swerve in time.

THE DRIVER IS RESPONSIBLE. From my earliest driver training.

The driver is always prepared to stop... at any obstacle...

"Failure to maintain control" is a common charge.

AI was driving, but it will be blamed on the schmuck that signed up, 'cuz he's a felon.
 
Sounds good in theory, doesn't it?



The thing is, the radar is liable to be optimized for spotting certain things like curbs, rather than for spotting things moving into the path of the vehicle, as suggested by where the units are mounted in that diagram. Nor is it optimal for spotting soft materials like human clothing and flesh, which are fairly stealthy to radar. Lidar requires a reflective surface to bounce laser beams off of, which black clothing is not, and I believe it sends pulses rather than a continuous sweep. And as I mentioned, any camera optimized to make a picture that looks good is not optimized to show objects that are either underexposed or overexposed compared to the bulk of the surroundings--which means that specialized cameras need to be developed, since existing cameras are designed to make pretty pictures.

Add in the fact that they don't seem interested in adding sound detectors and they certainly aren't stuffing IBM's Watson in the trunk of the vehicle, and I think it's safe to say these things are blind, deaf and dumb--especially at night.


A well-designed system can detect more than the human eye can. Obviously what is currently being offered to the public is not ready for prime time.

From what we have seen so far, the system appears to lack such additions as infrared and "night vision" type cameras.
 
http://ideas.4brad.com/it-certainly-looks-bad-uber

excerpts:
The road is empty of other cars. Here are the big issues:
1. On this empty road, the LIDAR is very capable of detecting her. If it was operating, there is no way that it did not detect her 3 to 4 seconds before the impact, if not earlier. She would have come into range just over 5 seconds before impact.
2. On the dash-cam style video, we only see her 1.5 seconds before impact. However, the human eye and quality cameras have a much better dynamic range than this video, and should have been able to see her even before 5 seconds. From just the dash-cam video, no human could brake in time with only 1.5 seconds of warning. The best humans react in just under a second; many take 1.5 to 2.5 seconds.
3. The human safety driver did not see her because she was not looking at the road. She seems to spend most of the time before the accident looking down to her right, in a style that suggests looking at a phone.
4. While a basic radar which filters out objects that are not moving towards the car would not necessarily see her, a more advanced radar should also have detected her and her bicycle (though triggered no braking) as soon as she entered the lane to the left, probably at least 4 seconds before impact. Braking could have been triggered 2 seconds before impact, in theory enough time.
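The reaction-time argument in point 2 is easy to sanity-check with basic kinematics. The speed, reaction times, and deceleration below are illustrative assumptions for a rough sketch, not figures from the investigation:

```python
# Rough sanity check of the reaction-time argument above.
# Speed and deceleration values are illustrative assumptions,
# not figures from the investigation.

MPH_TO_MPS = 0.44704  # miles per hour -> metres per second

def stopping_distance(speed_mph, reaction_s, decel_mps2=7.0):
    """Reaction-time travel plus braking distance, in metres.

    decel_mps2 of ~7 m/s^2 approximates hard braking on dry pavement.
    """
    v = speed_mph * MPH_TO_MPS
    return reaction_s * v + v ** 2 / (2 * decel_mps2)

for reaction_s in (1.0, 1.5, 2.5):
    d = stopping_distance(40, reaction_s)
    print(f"reaction {reaction_s:.1f} s -> needs about {d:.0f} m to stop from 40 mph")
```

With only 1.5 seconds of warning, a car at 40 mph covers roughly 27 m before the brakes are even applied, which is consistent with the excerpt's claim that no human could brake in time from the dash-cam view alone.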

To be clear, while the car had the right-of-way and the victim was clearly unwise to cross there, especially without checking regularly in the direction of traffic, this is a situation where any properly operating robocar following "good practices," let alone "best practices," should have avoided the accident regardless of pedestrian error. That would not be true if the pedestrian were crossing the other way, moving immediately into the right lane from the right sidewalk. In that case no technique could have avoided the event.
Much more in-depth write-up at the link.
 

Not using braking alone. But a car can change lanes in far, far less distance than is required to stop. Both those distances increase with speed, but the basic fact holds true at anything above jogging speed.
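The swerve-versus-stop claim follows from simple kinematics: braking distance grows with the square of speed, while the road consumed by a lateral manoeuvre grows only linearly. A minimal sketch, where the lateral offset, lateral acceleration, and braking deceleration are all illustrative assumptions:

```python
# Sketch of why swerving can need less road than stopping at speed.
# All limits here (7 m/s^2 braking, 5 m/s^2 lateral, 2 m offset)
# are assumed, round numbers, not measured vehicle data.
import math

MPH_TO_MPS = 0.44704

def braking_distance(v_mps, decel=7.0):
    """Metres needed to stop from v_mps at constant deceleration (m/s^2)."""
    return v_mps ** 2 / (2 * decel)

def swerve_distance(v_mps, offset_m=2.0, lat_accel=5.0):
    """Metres of road covered while shifting offset_m sideways.

    Models the swerve as two constant-lateral-acceleration halves,
    so the manoeuvre takes t = 2 * sqrt(offset_m / lat_accel) seconds.
    """
    t = 2 * math.sqrt(offset_m / lat_accel)
    return v_mps * t

for mph in (30, 45, 60):
    v = mph * MPH_TO_MPS
    print(f"{mph} mph: brake {braking_distance(v):5.1f} m, "
          f"swerve {swerve_distance(v):5.1f} m")
```

With these assumed limits the swerve advantage appears somewhere around 40 mph and widens quickly above it, since braking scales with v squared and swerving with v; the exact crossover depends entirely on the limits chosen.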
 
Going north past the Marquee Theater, to show how misleading the video from the Uber vehicle was. The point of the accident is the darker area I'm reaching at about 33 seconds in.

 
Disgraceful Dashcam Video Proves Uber Is the Theranos of Self-Driving
This much is clear: Uber is culpable in the death of Elaine Herzberg.

...Two weeks ago, The Drive published "The Human Driving Manifesto," in which I claimed there was absolutely no evidence self-driving cars were safer than humans—at least not yet—and that we have a moral obligation to improve human driving safety regardless.

Little did I know how prescient that would turn out to be.

Yesterday I wrote "Elaine Herzberg's Death Isn't Uber's Tragedy. It's Ours," in which I called out the hypocrisy of a country that tolerates 100 deaths by human drivers a day, but won't tolerate one by machine. I was referring, of course, to the tragic death of Elaine Herzberg, who was struck and killed by a self-driving Uber test vehicle this past Sunday in Tempe, Arizona, just one of ten pedestrians killed in that state last week.

I was trying to give Uber the benefit of the doubt. I was wrong.

Not only was I wrong, but The Human Driving Manifesto—which I jokingly wrote in response to the ever-increasing storm of self-driving clickbait—was more accurate than I ever could have guessed, because now that the Tempe police have released dashcam footage of the fatal crash, all of the following points are perfectly clear:

Uber is guilty of killing Elaine Herzberg.
Uber's hardware and/or software failed.
Many people at Uber need to be fired.
The Arizona officials who greenlit testing need to resign.
One or more people need to be prosecuted.
The SAE Automation Classification System is vague and unsafe.
Uber is the Theranos of self-driving.
Volvo—one of the few car makers that truly cares about safety—is innocent and shouldn't be in bed with their craven opposites.
Even if you believe self-driving cars may someday reduce road fatalities—and I do believe that—this dashcam video is an icepick in the face of the argument that anyone at Uber gives a damn about anyone's safety, including that of their own test drivers.

I've long suspected that 99% of claims from self-driving companies were BS, but I didn't think it was this bad:
...
A slow moving pedestrian at night—well beyond human line of sight—is precisely what radar and Lidar sensors are supposed to see. This is precisely the type of crash self-driving cars are designed to prevent.
...
What is the purpose of a safety driver? To take control—whether it's steering or braking—in order to prevent an impact the self-driving car cannot. That didn't happen here. Why not? Partially because it was at night and the headlights may not have illuminated Herzberg until it was too late, and partially because the safety driver wasn't paying attention. The safety driver doesn't appear to have applied the brakes until after the impact, further indicating lack of readiness. I'm not convinced this particular "safety" driver could have done better even in daylight. Her eyes are glued to whatever device is in her hand.

The safety driver certainly bears some moral responsibility, and depending on the nature of her employment contract, she may bear some legal responsibility as well.

And that's before we know anything about what kind of training, if any, Uber gives its "safety" drivers.

Oh, did I mention that the driver had a history of traffic violations dating back to 1998? And that Uber claimed she passed all background checks? Uber, you've got a minimum-standards problem.
...

More: http://www.thedrive.com/opinion/195...o-proves-uber-is-the-theranos-of-self-driving
 
Yep. The Police and Uber are sticking to their story. It was "unavoidable"...


I don't know anything about self driving cars or any details of the case but I just watched the dashcam video. It sure looks like it was the pedestrian's fault. Do you disagree? Is there something I'm missing?

The guy is wearing black at night and jaywalking. There have been multiple times where I have almost hit cyclists at night. The only person I would feel bad for in that situation is me, because I would have been punished because some idiot has terrible judgment biking on the side of the road at night.
 
Nobody is saying the pedestrian is not at fault. But does that mean you shouldn't try to avoid one if they get in your way at night?
 


Sure. Of course you avoid hitting someone otherwise it is vehicular manslaughter and you go to jail. My point is people have their lives ruined because cyclists are careless. I was going around a curve at probably 8 or 9 at night and I missed smacking a guy by a foot. I can tell you the only person I started to feel bad about was myself and what would have happened to me.
 
I don't know anything about self driving cars or any details of the case but I just watched the dashcam video. It sure looks like it was the pedestrian's fault. Do you disagree? Is there something I'm missing?

The guy is wearing black at night and jaywalking. There have been multiple times where I have almost hit cyclists at night. The only person I would feel bad for in that situation is me, because I would have been punished because some idiot has terrible judgment biking on the side of the road at night.

Did you watch the video that I quoted? Same street at night, filmed with a more realistic camera. The Police and/or Uber provided a very dark, low quality video. Why would they do that?
 

They couldn't afford an HD camera with color and better light sensitivity. :cool:
 
I know an expert in the field (a competitor of Uber using a lot of the same technology), and he says there was a failure: the car should have detected the pedestrian. Also a failure on the part of the safety driver. He is not impressed with Uber, and thinks they cut corners and exaggerate. There seems to be quite a race to be first into production with autonomous vehicles.

Something else to think about, though: if the safety driver had prevented the accident, we would never have heard about it. Think of how many times safety drivers have prevented accidents. Even beyond that, imagine how many times the driver prevented an accident, yet the company just basically says "the driver didn't need to intervene; the car would have avoided that if left alone."

In other words, this fatal accident may just be the tip of the iceberg.
 
Uber car's 'safety' driver streamed TV show before fatal crash: police

https://www.reuters.com/article/us-...driving-car-crash-police-report-idUSKBN1JI0LB

JUNE 22, 2018

SAN FRANCISCO/WASHINGTON (Reuters) - The safety driver behind the wheel of a self-driving Uber car in Tempe, Arizona, was streaming a television show on her phone until about the time of a fatal crash, according to a police report that deemed the March 18 incident “entirely avoidable.”

A report by the Tempe Police Department said the driver, Rafaela Vasquez, repeatedly looked down and not at the road, glancing up a half second before the car hit Elaine Herzberg, 49, who was crossing the street at night.

The report said police concluded the crash, which has dealt Uber Technologies Inc a major setback in its efforts to develop self-driving cars, would have been “entirely avoidable” if Vasquez had been paying attention.

Vasquez could face charges of vehicular manslaughter, according to the report, which was released late on Thursday in response to a public records request.

She could not immediately be reached for comment and Reuters could not locate her attorney.

Police obtained records from Hulu, an online service for streaming TV shows and movies, which showed Vasquez’s account was playing the TV talent show “The Voice” for about 42 minutes on the night of the crash, ending at 9:59 p.m., which “coincides with the approximate time of the collision,” the report said.

Police submitted their findings to local prosecutors, who will make a determination on whether to file criminal charges. The Maricopa County Attorney’s Office referred the case to the Yavapai County Attorney’s Office because of a conflict.

A spokeswoman for the Yavapai County Attorney’s Office said on Friday that “the matter is still pending review. We do not have a projected timeline for a decision.”

The Uber car was in autonomous mode at the time of the crash, but the company, like other self-driving car developers, requires a back-up driver inside to intervene when the autonomous system fails or a tricky driving situation occurs.

Vasquez looked up just 0.5 seconds before the crash, after keeping her head down for 5.3 seconds, the Tempe police report said. Uber’s self-driving Volvo SUV was traveling at just under 44 miles (71 km) per hour.

“We continue to cooperate fully with ongoing investigations while conducting our own internal safety review,” an Uber spokeswoman said. “We have a strict policy prohibiting mobile device usage for anyone operating our self-driving vehicles. We plan to share more on the changes we’ll make to our program soon.”

Last month, the Uber spokeswoman said the company was undergoing a “top-to-bottom safety review,” and had brought on a former U.S. federal transportation official to help improve its safety culture.

‘VERY SERIOUS CASE’
Police said a review of video from inside the Volvo showed Vasquez was looking down during the trip, and her face “appears to react and show a smirk or laugh at various points during the times that she is looking down.” The report found that Vasquez “was distracted and looking down” for close to seven of the nearly 22 minutes prior to the collision.

Tempe Police Detective Michael McCormick asked Hulu for help in the investigation, writing in a May 10 email to the company that “this is a very serious case where the charges of vehicle manslaughter may be charged, so correctly interpreting the information provided to us is crucial.” Hulu turned over the records on May 31.

According to a report last month by the National Transportation Safety Board, which is also investigating the crash, Vasquez told federal investigators she had been monitoring the self-driving interface in the car and that neither her personal nor business phones were in use until after the crash. That report showed Uber had disabled the emergency braking system in the Volvo, and Vasquez began braking less than a second after hitting Herzberg.

Herzberg, who was homeless, was walking her bicycle across the street, outside of a crosswalk on a four-lane road, when she was struck by the front right side of the Volvo.

The police report faulted Herzberg for “unlawfully crossing the road at a location other than a marked crosswalk.”

In addition to the report, police released a slew of audio files of 911 calls made by Vasquez, who waited at the scene for police, and bystanders; photographs of Herzberg’s damaged bicycle and the Uber car; and videos from police officers’ body cameras that capture the minutes after the crash, including harrowing screams in the background.

Uber shuttered its autonomous car testing program in Arizona after the incident, and says it plans to begin testing elsewhere this summer, although in some cities it will have to first win over increasingly wary regulators.
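The figures quoted in the police report above can be sanity-checked directly: the speed and the head-down/glance durations come from the report, and the arithmetic below simply converts them into road covered.

```python
# Back-of-the-envelope check of the Tempe police report figures quoted
# above: just under 44 mph, head down for 5.3 s, looking up 0.5 s before
# impact. The arithmetic is mine; the inputs are from the report.

MPH_TO_MPS = 0.44704

speed = 44 * MPH_TO_MPS       # just under 44 mph, per the police report
head_down = 5.3 * speed       # road covered while the driver looked down
last_glance = 0.5 * speed     # road covered after she looked back up

print(f"head down for 5.3 s   -> about {head_down:.0f} m of road")
print(f"final glance of 0.5 s -> about {last_glance:.0f} m of road")
```

Roughly a football field of road goes by unwatched in those 5.3 seconds, while the final half-second glance leaves only about 10 m, which squares with the report's conclusion that the crash was "entirely avoidable" for an attentive driver.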
 