The Self-Driving Car Industry Experiences Its First Pedestrian Death

This is sad technology news, the kind this industry isn’t used to. The self-driving car industry has experienced its first pedestrian death: an autonomous Uber vehicle struck and killed a woman in Tempe, Arizona.

There are many different questions to be asked. Were we tempting fate by thinking this could even be an option? Yet human-driven cars sadly experience traffic deaths all the time. Was this the fault of the Uber car technology? Or was this just a quirk of fate? How did this happen?

Autonomous cars are suddenly a big industry. Google, Apple, Uber, and others are all testing vehicles and their technology, racing to be the first to bring self-driving cars to the public.

Officials in Arizona invited these companies to test their self-driving cars on the state’s roads. Tempe seemed an ideal location because of its dry weather and wide roads.

In return, Arizona went easy on these companies when it came to regulation, reasoning that it had to do something to lure them away from California. The state positioned itself as an essentially regulation-free zone.

“We needed our message to Uber, Lyft, and other entrepreneurs in Silicon Valley to be that Arizona was open to new ideas,” said Arizona Governor Doug Ducey last year.


Ducey issued an executive order allowing the testing of autonomous vehicles with safety drivers at the wheel in case of an emergency. This month he decided to allow testing of cars without safety drivers, crediting a “business-friendly and low regulatory environment” with helping the economy.

The tech companies and automakers testing these cars believe their vehicles will be safer than human-driven cars because human distraction is taken out of the equation.

On Sunday night a self-driving Uber Volvo XC90 sport utility vehicle, with a safety driver behind the wheel, was on the road in Tempe. There were no passengers in the vehicle when it struck 49-year-old Elaine Herzberg.

A Tempe police spokesman, Sgt. Ronald Elcock, said in a news conference that the preliminary investigation showed the Uber car was traveling at around 40 miles per hour at the time of the accident.

Herzberg had been walking with her bicycle. It didn’t appear that the car even slowed down before it made contact. The weather was clear and dry, and the safety driver didn’t seem to be impaired.
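
For rough perspective on that speed, the short sketch below works out how far a vehicle traveling at about 40 mph covers during a driver’s reaction time and while braking. The 1.5-second perception-reaction time and the moderate dry-pavement deceleration are textbook-style assumptions used only for illustration; they are not figures from the investigation.

```python
# Back-of-envelope stopping-distance estimate (illustrative only; the
# reaction time and deceleration below are assumed values, not data
# from the Tempe investigation).

MPH_TO_FPS = 5280 / 3600          # feet per second per mile per hour

speed_mph = 40                    # approximate speed reported by Tempe police
reaction_time_s = 1.5             # commonly cited perception-reaction time (assumed)
deceleration_fps2 = 15.0          # moderate braking on dry pavement, about 0.47 g (assumed)

speed_fps = speed_mph * MPH_TO_FPS
reaction_distance = speed_fps * reaction_time_s               # distance covered before braking begins
braking_distance = speed_fps ** 2 / (2 * deceleration_fps2)   # v^2 / (2a)

print(f"Speed: {speed_fps:.1f} ft/s")
print(f"Distance covered during reaction: {reaction_distance:.0f} ft")
print(f"Braking distance: {braking_distance:.0f} ft")
print(f"Total stopping distance: {reaction_distance + braking_distance:.0f} ft")
```

Under those assumptions the car covers close to 60 feet every second, and the total stopping distance works out to roughly 200 feet, which is why how early the car’s sensors detect a pedestrian matters so much.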

“Our hearts go out to the victim’s family,” said Sarah Abboud, an Uber spokeswoman, in a statement. “We are fully cooperating with local authorities in their investigation of this incident.”


Uber has suspended its testing, a move that Tempe Mayor Mark Mitchell calls a “responsible step,” though he hopes people don’t draw premature conclusions about autonomous driving.

The National Transportation Safety Board is sending a team of investigators to Tempe to take a look at “the vehicle’s interaction with the environment, other vehicles, and vulnerable road users such as pedestrians and bicyclists.”

So where does this leave the autonomous car industry?

Lawmakers are working on a Senate bill that, if passed, would exempt self-driving car makers from existing safety standards and prevent states from creating their own vehicle-safety laws. Similar legislation has already passed in the House.

“This tragic incident makes clear that autonomous vehicle technology has a long way to go before it is truly safe for the passengers, pedestrians, and drivers who share America’s roads,” said Senator Richard Blumenthal.

When we return to the questions asked at the beginning of this article, we still have no answers, and perhaps even more questions. Sen. Blumenthal is right that this accident proves we still have a long way to go before trusting these cars out on the road if they can’t even be trusted when they have the benefit of both technology and a safety driver.

13 comments

    • I read a follow-up that said they believe the accident was unavoidable; whether it was an autonomous car or a human-driven one, the accident would have happened regardless. The pedestrian was not seen until the point of impact.

      • In a 2/24/2019 USA Today article, Raj Rajkumar, leader of Carnegie Mellon’s autonomous vehicle research team, is quoted as saying that neither the car’s LiDAR nor its radar performed the way they were supposed to. While the car’s camera would be useless at picking up obstacles in the dark, the LiDAR and radar should have picked up the victim long before they did. Rajkumar also said that “LiDAR systems have a blind spot of around 30 feet immediately surrounding the vehicle.”

        So, the accident was definitely avoidable had all the detection systems worked as intended.

    • Presumably not blind and definitely not asleep. And presumably she had the ability to take control of the car, but there was no time for her to do so and it wouldn’t have mattered if she did…with only 2 seconds notice, there was nothing anyone/anything could have done. Check out the video of the incident I linked to in another message.

  1. What this article doesn’t mention are the conditions of the collision:

    – it was late at night, thus very dark…dark enough that what is merely a few yards to the right of the road can’t be identified, although it is most likely trees or bushes of some sort.

    – the pedestrian is not in a crosswalk.

    – as far as I can see, the pedestrian is not wearing reflective material of any kind, and there appear to be no reflectors on the bicycle she’s pushing.

    Here’s video from the car that was released late on the 21st: https://globalnews.ca/video/rd/1191720515742/ (be advised that the next video auto-starts after about 10 seconds, but there’s a ‘cancel’ button to prevent that). The exterior camera shows that the pedestrian doesn’t become visible at all until about 2 seconds before the collision, so there’s no way that the collision could have been avoided. Had the car been equipped with an automatic braking or collision avoidance system, it might have been slowed down enough so that the pedestrian was just injured (possibly badly) rather than killed, but that’s about it. With only 2 seconds to react, the results wouldn’t have been any different had the car been under human control rather than autonomous control, so there’s no way this collision can be blamed on the self-driving vehicle.

    As to the “driver”, I have no idea what she was looking down at…do self-driving cars have a monitor that displays the exterior camera so the “driver” can see what the car “sees”, and was she checking that? It’s really irrelevant, though, as she was looking out the windshield in time to see the pedestrian as she became illuminated (as evident from the “driver’s” reaction), and with only 2 seconds’ notice, there’s nothing she could have done to prevent the collision.

    • “there’s no way this collision can be blamed on the self-driving vehicle.”
      Bovine excrement! If autonomous cars will rely on visual recognition systems or collision avoidance systems that “see” only several yards ahead and directly in front of the vehicle, then we and the autonomous cars are in deep doodoo. As any human driver knows, one has to look at least 50-100 yards down the road and be constantly scanning from side to side for any possible problems and to have time to react to those problems. If a self-driving car cannot do at least the same, it does not belong on the road. Technology is supposed to provide better detection than human senses.

      “As to the “driver”, I have no idea what she was looking down at”
      She had nothing to worry about, she was in an autonomous car that was supposed to be able to do everything a human driver can do, and more. She was in the car not to drive it but to provide a “human presence” which the State of Arizona required as a condition of issuing the permit to Uber.

      As long as there are $Billions in potential profits involved, autonomous car manufacturers will be cutting corners and putting unsatisfactory products on the road just to be the “firstest with the mostest”. The first company to market with a product usually winds up controlling it, or at least grabbing a big chunk of it.

  2. “There are many different questions to be asked. Were we tempting fate by thinking this could even be an option?”

    Absolutely not. It simply isn’t possible to fully account for human stupidity in any system. I’m reminded of the adage “If you make something idiot-proof, someone will just make a better idiot.” Lane departure systems, adaptive cruise control systems and collision avoidance systems go a long way to account for human stupidity, but it simply can’t be eliminated. If we’re going to worry that such advances in transportation will be “tempting fate”, we may as well remove all motor vehicles and bicycles from existence so that the humans that are too stupid to protect themselves will continue to be safe.

    “Was this the fault of the Uber car technology?”

    No. Absolutely not. Since the pedestrian wasn’t wearing reflective gear so she’d be easily seen in the dark, and was crossing the road where there wasn’t a crosswalk, this was entirely her fault. This may seem a little harsh, but at some point people have to start taking responsibility for their own actions. She chose to wear dark clothing without reflective strips in the middle of the night and cross the road where she shouldn’t have, and the inevitable happened…and it would have happened even if the “driver” was in control of the car or if it hadn’t been a self-driving car.

    “How did this happen?”

    Human stupidity.

    “This tragic incident makes clear that autonomous vehicle technology has a long way to go before it is truly safe for the passengers, pedestrians, and drivers who share America’s roads,” said Senator Richard Blumenthal

    It makes clear no such thing. What this incident *does* make clear is that some pedestrians are idiots…not that there was ever any doubt of that. I’ve seen pedestrians that were so involved in their phones (and even paper books) that they’ve walked up to and right out into an intersection without even looking up to see if it was safe to do so. If we’re going to wait until things are “truly safe” for pedestrians before we have improvement/advancements, we’re going to be waiting a very, very long time.

    “Sen. Blumenthal is right that this accident proves we still have a long way to go before trusting these cars out on the road if they can’t even be trusted when they have the benefit of both technology and a safety driver”

    I’m going to assume that this article was written, and Sen. Blumenthal’s comment was made, before it was known what actually happened, since trust in autonomous cars is not in question in this incident. It’s actually trust in pedestrians to be smart and obey the law that’s in question in this incident…and the pedestrian in question was neither. If autonomous cars are going to be rejected because they can’t avoid hitting pedestrians that appear “out of nowhere” a few feet in front of them, then we might just as well remove **all** vehicles from the roads, because human drivers can’t do that either.

    • You’re playing the old “blame the victim” gambit.

      Granted, the victim made several stupid mistakes but she paid the ultimate penalty. However, no matter how much blame is laid at the victim’s feet, the fact remains that that particular model of autonomous car is not ready for the open road, let alone to transport passengers.

    • ” It simply isn’t possible to fully account for human stupidity in any system.”
      That is a fatuous statement because fully accounting for human stupidity is what is required before autonomous vehicles are allowed out in the wild. There already are many autonomous systems in use in industry that seem to have fully accounted for human stupidity. I’m sure that if a human was killed or injured by an autonomous system on the assembly line or in a factory, the news would be splashed on the front pages of all the newspapers and be the lead story on nightly TV news.

  3. At WHAT point is this the fault of a person who sees a vehicle approaching WITH A DRIVER BEHIND THE WHEEL? In the state of NYC (where I’m from), you could be a pedestrian standing in the middle of the crosswalk wearing headphones with your eyes closed, and if a vehicle even TOUCHES you, the driver is hauled off to jail! Why? Because any judge with common sense will tell you that although the PEDESTRIAN is SUPPOSED to look where they are going, the PERSON BEHIND THE WHEEL OF A TWO TON VEHICLE IS SUPPOSED TO BE ABLE TO STOP THAT VEHICLE AT THE FIRST SIGN OF TROUBLE IN THE ROAD. Why is this concept so hard for people to grasp? The first reaction we have when driving along a street or highway and we see police lights is to SLOW DOWN…so when it’s late at night, why would this driverless car not “sense” that the area is dark and there is a high probability of hitting “something”, whether it’s a person, a rabbit, a fox, etc.?

    Driverless cars will NEVER replace the ability of a human to use their natural reflexes to prevent an incident. I can also tell you that an autonomous car, with all its fancy tech attached to it, doesn’t have a “Common Sense” program built in. Any REAL driver knows: when it’s dark and you CANNOT SEE A CERTAIN DISTANCE IN FRONT OF THE VEHICLE, YOU SLOW THE HELL DOWN!

    Do I blame the driver of the car in this instance? Yes and no. Uber might have told her she’s only there as a backup, but she could have just taken control of the vehicle and explained it later at the main office. I’ll bet right now Uber is WISHING that she did EXACTLY that, instead of having to deal with the impending lawsuit that you must KNOW will follow this incident.

    There will never be a time when a microchip can “feel” situations better than a human being. There’s no “gut reaction”, no “hunch” built in to a microprocessor. The quicker the ignorant people who keep trying to force this technology down our throats realize that, the easier life will be. Uber and any other company promoting this foolishness in the name of making profit better stick to test tracks for the long-term future, because if this happens again ANYWHERE else on the planet, they might get sued out of existence.

    Freekin’ Idiots.

  4. To point out further:
    Have you ever sat in on a rear-end accident case? While the judge does indeed question the driver who was rear-ended, trying to find out if they were texting, not looking ahead, etc., the judge comes down TWICE as hard on the person who hit the first driver, because he will state: yes, he might have stopped short, he might have been looking at his phone, but YOU’RE BEHIND HIM, and YOU SHOULD BE able to stop YOUR vehicle at the first sign of trouble! So this is in NO way the pedestrian’s fault. The DRIVER (or, if you want to get technical about it, UBER) should have sensed a human being in the road. (You mean to tell me that with all their fancy tracking/sensors they don’t have one for body heat? What about something that picks up a HEARTBEAT? No? Nothing like that exists for vehicles yet?)

    Then they don’t belong on ANY road.

    Period.

    • “UBER should have sensed a human being in the road”
      The tracking systems on autonomous cars are supposed to do just that. However, the visual recognition system is useless in the dark. The LiDAR and RADAR systems apparently malfunctioned.

      With $Billions at stake, do you think the death of a few humans will stop the mad race to be the firstest with a commercially viable autonomous car? The attitude will be “What’s a few deaths now if we can save thousands in the future?”
