
Why Uber’s self-driving car killed a pedestrian


Readers' comments

The Economist welcomes your views. Please stay on topic and be respectful of other readers. Review our comments policy.

guest-leaejee

I expect that the pedestrian killed by Uber's car was not the only one to die on America's roads that day. None of the several articles I have read on AV accidents has addressed what is surely one of the most critical questions: are current AVs statistically more or less dangerous than the average human-driven car? AVs should by now have logged enough test miles to provide preliminary data on fatalities per million miles or kilometres driven. Is the AV rate higher or lower than the rate for a normal human-driven car?

If current AVs are statistically safer than human-driven cars then, tragic as any fatality is, they are to be preferred, even in their present imperfect state.
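
For what it's worth, the comparison the commenter asks for can be sketched in a few lines of Python. The human baseline below (about 1.2 deaths per 100 million vehicle-miles in the United States) is a widely cited figure; the AV mileage is a pure placeholder, since no audited fleet totals were public:

# Back-of-envelope comparison of AV and human fatality rates.
# The human baseline is the widely cited US figure of roughly
# 1.2 deaths per 100 million vehicle-miles; the AV mileage is a
# placeholder assumption, not a published fleet total.

HUMAN_RATE = 1.2 / 100_000_000   # fatalities per mile driven

av_miles = 10_000_000            # assumed AV test miles logged (guess)
av_deaths = 1                    # the Tempe fatality

av_rate = av_deaths / av_miles
print(f"Human rate: {HUMAN_RATE * 1e8:.2f} deaths per 100M miles")
print(f"AV rate:    {av_rate * 1e8:.2f} deaths per 100M miles")

# The statistical catch: at ~1 death per 100 million miles, a fleet
# must log hundreds of millions of miles before its fatality rate can
# be distinguished from the human baseline with any confidence.
print(f"Expected human-driven deaths over {av_miles:,} miles: "
      f"{HUMAN_RATE * av_miles:.2f}")

With a single fatality and only millions, not hundreds of millions, of test miles, the AV estimate carries enormous error bars, which is why the commenter's question has no firm answer yet.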

Antidot Nyarlat

AI in cars to support you? Yes. Self-driving AI that makes you comfy and has you playing games or sleeping when you should be paying attention to your surroundings? NO!
We should never give up our role as actors and become mere bystanders to fully autonomous systems. It will 100% lead to big mistakes and terrible accidents.
Uber is guilty of negligence and manslaughter. At least TWO people should have been in a test car! An engineer who has to read outputs on computer or mobile screens can't pay full attention to the street. This woman was another victim of Uber's greed: they didn't want to pay for two people. Plain and simple. Profit over safety is what companies strive for.

guest-aaawwwmj

News of the future:

2,500 S. Koreans killed in 5,000 accidents yesterday
North Korean hackers are suspected

News of the present:
Exclusive: U.S. drones hacked by Russia in Ukraine | Reuters.com
https://www.reuters.com/video/2016/12/21/exclusive-us-drones-hacked-by-r...

NSFTL
Regards

econobanker

"self-driving" is still in it's infancy but we treat it as if it works. If you read Tesla's drivers manual, it states "Every driver is responsible for remaining alert and active when using Autopilot, and must be prepared to take action at any time". In other words keep your hands more or less on the wheel, watch the traffic, the rear view mirror etc and foot ready to brake AT ALL TIMES.

In other words, "self-driving", as it is today, is a dangerous illusion of safety and should not be allowed at all. This begs the question when will AVs be ready if we don't let potentially dangerous and fatal systems "loose" on the streets? AVs will be ready when Google, Tesla et al goes back to the lab and tinkers some more. It will take a lot longer than the companies want but it's better than Google/Tesla/Uber playing with our lives, and somehow convincing the public it's for our own good.

sikko6

A Tesla on auto-drive crashed into a parked police car!
By the records, Tesla is the most dangerous car company.
If you want to commit suicide, drive a Tesla.
You will have a good chance of dying violently!

Antidot Nyarlat in reply to sikko6

That's nonsense. Far more conventional accidents are caused by defective or faulty parts in ordinary fossil-fuel cars. Remember how often the big players in the car market have to recall vehicles!? There have been many deaths from brakes and other parts that didn't work. Stop the hyperbole.

guest-theritz

WUI - Walking under the influence - is a major cause of pedestrian deaths.

According to the CDC

Drivers and pedestrians who are alcohol-impaired

Almost half (48%) of crashes that resulted in pedestrian deaths involved alcohol for the driver or the pedestrian. One in every three (34%) fatal pedestrian crashes involved a pedestrian with a blood alcohol concentration (BAC) of at least 0.08 grams per deciliter (g/dL), and 15% involved a driver with a BAC of at least 0.08 g/dL.

tman9999

Why not just add an alarm that indicates when driver intervention is needed? Add color-coded visual indicators showing where the threat/risk is (front, back, left, right) and how imminent it is: red for "must act now", yellow for "must act within 5 seconds".
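
A minimal sketch of what the commenter proposes, in Python; the Threat structure and the 2- and 5-second thresholds are invented for illustration, not taken from any real driver-assistance stack:

from dataclasses import dataclass
from enum import Enum

class Direction(Enum):
    FRONT = "front"
    BACK = "back"
    LEFT = "left"
    RIGHT = "right"

class AlertLevel(Enum):
    RED = "must act now"
    YELLOW = "must act within 5 seconds"
    NONE = "no action needed"

@dataclass
class Threat:
    direction: Direction
    seconds_to_impact: float  # estimated time before the hazard is reached

def classify_alert(threat: Threat) -> AlertLevel:
    """Map a detected threat to the commenter's red/yellow scheme.

    Thresholds are illustrative: below ~2 seconds a human can hardly
    be expected to re-engage in time, so that is RED; up to 5 seconds
    is YELLOW.
    """
    if threat.seconds_to_impact < 2.0:
        return AlertLevel.RED
    if threat.seconds_to_impact <= 5.0:
        return AlertLevel.YELLOW
    return AlertLevel.NONE

# Example: a pedestrian ahead, roughly 4 seconds out
print(classify_alert(Threat(Direction.FRONT, 4.0)))  # AlertLevel.YELLOW

The human-factors catch, which the Uber crash illustrates, is that a driver given only a red "act now" alert a second or two before impact may not be able to re-engage in time.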

WT Economist

" Although it was dark, the car’s radar and LIDAR detected her six seconds before the crash. But the perception system got confused: it classified her as an unknown object, then as a vehicle and finally as a bicycle, whose path it could not predict."

If the perception system gave priority to not slowing down, then it was immoral.

When I see something that confuses me -- what is this, and what will this person/thing do? -- I slow down pre-emptively. If the vehicle had slowed as soon as the pedestrian was detected, even without emergency braking, the 1.3 seconds it was left with could have been 2.5 seconds, at a lower speed by that point. That is why most human drivers would not have hit her.
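
The commenter's arithmetic holds up under a rough kinematic sketch. The numbers below are assumptions for illustration: about 18 m/s (roughly the 40-odd mph reported for the Uber vehicle), detection six seconds before impact, and gentle non-emergency braking of 2 m/s^2:

# Rough check of the "slow down when confused" argument.
# Assumed figures: ~18 m/s (about 40 mph) initial speed, detection
# 6 seconds before impact, moderate (non-emergency) braking at 2 m/s^2.

v0 = 18.0        # initial speed, m/s (assumption, ~40 mph)
t_detect = 6.0   # seconds before impact when the object was first seen
decel = 2.0      # m/s^2, gentle pre-emptive braking (assumption)

# Distance to the impact point at detection, at constant speed:
distance = v0 * t_detect  # 108 m

# Speed after braking gently for the whole 6 seconds:
v_final = max(0.0, v0 - decel * t_detect)  # 6 m/s (~13 mph)

# Distance actually covered while decelerating over those 6 seconds:
t_brake = min(t_detect, v0 / decel)
covered = v0 * t_brake - 0.5 * decel * t_brake**2  # 72 m

print(f"Distance to impact point at detection: {distance:.0f} m")
print(f"Distance covered with gentle braking:  {covered:.0f} m")
print(f"Speed remaining after 6 s of braking:  {v_final:.0f} m/s")
# With even gentle braking the car is still 36 m short of the impact
# point after 6 seconds, and moving at only 6 m/s -- which is the
# commenter's point: slowing pre-emptively on an unclassified object
# would very likely have avoided the crash.
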
"Firstly we must question why the woman who was killed seemed perfectly happy to step into the path of an oncoming vehicle."

The initial video put out was highly misleading. It made it look as if she darted into the middle of a highway in a place with no lights. In fact, this is a well-lit, heavily used crossing.

https://www.youtube.com/watch?v=CRW0q8i3u6E

umghhh

What do the actual statistics for self-driving cars look like? These data should be publicly available, I think. Can we compare them to what humans do? If the rumours are true that Uber's self-driving cars are involved in accidents more often than human-driven ones, then some questions should be answered, especially since, as it appears, fault was attributed to the machine in none of these accidents. I wonder what that says about the whole process.

CA-Oxonian

Firstly we must question why the woman who was killed seemed perfectly happy to step into the path of an oncoming vehicle; perhaps she thought her bicycle would ensure that no harm could ever come to her under any circumstance whatsoever, this being an extreme example of the "I have a bike, nothing can harm me no matter what I do!" phenomenon.

Secondly we should note that on the same day she died, nearly one hundred other people were killed by clueless human drivers doing stupid things.

So the moral of the tale appears to be: as a species we're perfectly happy being killed and maimed in huge numbers by incompetent drivers but we expect absolute perfection from autonomous vehicles.

Yet logic shows that if autonomous vehicles were only 10% better than humans, on average, we'd save thousands of lives per year. Regardless of media sensationalism and the inevitable inability of humans to deal adequately with unrepresentative anecdotes, we should remember not to make the perfect the enemy of the very good.
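
The "thousands of lives" figure checks out against the roughly 37,000 road deaths the United States was recording annually at the time:

# Back-of-envelope check of the "10% better saves thousands" claim.
# Baseline: roughly 37,000 annual US road deaths (late-2010s figure).

us_road_deaths_per_year = 37_000
improvement = 0.10  # AVs assumed 10% safer than the human average

lives_saved = us_road_deaths_per_year * improvement
print(f"Lives saved per year at 10% improvement: {lives_saved:.0f}")
# ~3,700 -- i.e. "thousands of lives per year", as the commenter says,
# and that is before counting non-fatal injuries.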

HappyHubris

While it is important to make self-driving platforms as safe as possible, they should not be expected to have a perfect record. Human drivers are notoriously error-prone, and there are over 100 road fatalities on an average American day. Delaying the use of driving automation would be making the perfect the enemy of the good.

notbyintent

Yeah. The perception module is built on classifiers that suffer from the long-tail problem: rare object categories are barely represented in the training data. Object recognition is still only a partially solved problem.
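
The flip-flopping the article describes (unknown object, then vehicle, then bicycle) is exactly what a frame-by-frame classifier does on a long-tail input. One common mitigation is to smooth labels over time and treat persistent disagreement as a signal in its own right; here is a minimal sketch, with invented class names and thresholds:

from collections import Counter, deque

# Minimal sketch of temporal label smoothing for a per-frame classifier.
# Class names, window size and threshold are illustrative assumptions.

class SmoothedClassifier:
    def __init__(self, window: int = 10, min_agreement: float = 0.6):
        self.history = deque(maxlen=window)
        self.min_agreement = min_agreement

    def update(self, frame_label: str) -> str:
        """Feed one per-frame label; return a stable label or 'uncertain'.

        If no single class dominates the recent window, the tracked
        object is reported as 'uncertain' -- a state a planner should
        treat as 'slow down', not 'ignore'.
        """
        self.history.append(frame_label)
        label, count = Counter(self.history).most_common(1)[0]
        if count / len(self.history) >= self.min_agreement:
            return label
        return "uncertain"

clf = SmoothedClassifier()
# A Tempe-style sequence of flip-flopping per-frame labels:
for raw in ["unknown", "vehicle", "unknown", "bicycle", "vehicle", "bicycle"]:
    print(raw, "->", clf.update(raw))
# The smoothed output settles on 'uncertain' instead of oscillating
# confidently among contradictory classes.

The point is not the particular threshold but that "uncertain" becomes an explicit state the planner can respond to, for instance by slowing down, rather than an oscillation it silently re-plans around.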

Kremilek2

I think it really could be that other developers of self-driving cars have better-designed systems that would avoid such a collision. It is hard to understand why this system didn't react faster and start braking automatically. One would guess that braking is better than a possible collision.

Rulus

In my opinion, self-driving cars should never be allowed on general-use streets and highways. They could work in areas reserved exclusively for self-driving cars, such as downtown zones where no human-driven vehicles circulate. A good example of why this is the case is the decline of general aviation in the 1980s and 1990s due to product-liability costs. The big corporations behind the AV industry are a much more attractive target to sue in the event of an accident, and despite the promised decrease in accidents, the liability and moral issues may prove insurmountable.
And in addition, the whole system is vulnerable to hacking and/or monkey-wrenching, with fatal consequences.

guest-aaawwwmj

They haven't even tested them in winter.

Black ice is probably hard for a computer to spot.

Will the car know which way to turn the steering wheel when it fishtails on snow or ice?

NSFTL
Regards

Mtgolfer in reply to guest-aaawwwmj

A self-driving car would take into account temperature, humidity, dew point, etc., to gauge the possibility of black ice.
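
A crude version of that gauge is easy to sketch. The thresholds below are meteorological rules of thumb (ice can form when the surface is at or below freezing and the air is near saturation), not values from any production system:

# Crude black-ice risk heuristic from basic weather inputs.
# Thresholds are rules of thumb for illustration, not production values.

def black_ice_risk(air_temp_c: float, dew_point_c: float,
                   road_temp_c: float | None = None) -> str:
    """Estimate black-ice risk from temperature and dew point.

    Ice can form when the road surface is at or below 0 C and there
    is moisture to freeze, i.e. the dew point is close to the air
    temperature.
    """
    surface = road_temp_c if road_temp_c is not None else air_temp_c
    moist = (air_temp_c - dew_point_c) <= 2.0  # near-saturated air
    if surface <= 0.0 and moist:
        return "high"
    if surface <= 2.0 and moist:
        return "elevated"  # roads can be colder than the air, e.g. bridges
    return "low"

print(black_ice_risk(air_temp_c=1.0, dew_point_c=0.5))    # elevated
print(black_ice_risk(air_temp_c=-1.0, dew_point_c=-1.5))  # high
print(black_ice_risk(air_temp_c=8.0, dew_point_c=1.0))    # low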

Humans tend to learn little from their own mistakes and almost nothing from the errors of others. Why do you think the same stretches of road see collisions every time it rains or snows?

Even if autonomous vehicles developed no further than they have to date, they would still be safer than the vast majority of human drivers, and traffic would move more rapidly without the erratic moves of impaired, distracted, and just plain stupid drivers.

guest-aaawwwmj in reply to notbyintent

Perhaps.

Speaking of video games, perhaps hackers could hack the vehicle's program and set it off on a Death Race.

Arcade game: Death Race
http://www.youtube.com/watch?v=aBBtt72aJLA

It's the game where you, the driver, chase after pedestrians. When you run one over, it turns into a cross and you get points.

You have probably never watched the 1999 "Outer Limits" episode called "The Haven."

http://www.youtube.com/watch?v=pGycsg4kJvg

NSFTL
Regards

sikko6

Fatal accidents involving auto-driving cars are piling up. Tesla, Uber, Google: all have had accidents with auto-driving cars. Note that auto-driving cars are still very few, yet there are already so many accidents. Auto-driving cars will become weapons of mass fatal accidents. I don't think they can improve significantly. More accidents will happen.