...Two weeks ago, The Drive published "The Human Driving Manifesto," in which I claimed there was absolutely no evidence self-driving cars were safer than humans—at least not yet—and that we have a moral obligation to improve human driving safety regardless.
Little did I know how prescient that would turn out to be.
Yesterday I wrote "Elaine Herzberg's Death Isn't Uber's Tragedy. It's Ours," in which I called out the hypocrisy of a country that tolerates 100 deaths by human drivers a day, but won't tolerate one by machine. I was referring, of course, to the tragic death of Elaine Herzberg, who was struck and killed by a self-driving Uber test vehicle this past Sunday in Tempe, Arizona, just one of ten pedestrians killed in that state last week.
I was trying to give Uber the benefit of the doubt. I was wrong.
Not only was I wrong, but The Human Driving Manifesto—which I jokingly wrote in response to the ever-increasing storm of self-driving clickbait—was more accurate than I ever could have guessed, because now that the Tempe police have released dashcam footage of the fatal crash, all of the following points are perfectly clear:
Uber is guilty of killing Elaine Herzberg.
Uber's hardware and/or software failed.
Many people at Uber need to be fired.
The Arizona officials who greenlit testing need to resign.
One or more people need to be prosecuted.
The SAE Automation Classification System is vague and unsafe.
Uber is the Theranos of self-driving.
Volvo—one of the few car makers that truly cares about safety—is innocent and shouldn't be in bed with its craven opposite.
Even if you believe self-driving cars may someday reduce road fatalities—and I do believe that—this dashcam video is an icepick in the face of the argument that anyone at Uber gives a damn about anyone's safety, including that of their own test drivers.
I've long suspected that 99% of claims from self-driving companies were BS, but I didn't think it was this bad:
...
A slow moving pedestrian at night—well beyond human line of sight—is precisely what radar and Lidar sensors are supposed to see. This is precisely the type of crash self-driving cars are designed to prevent.
...
What is the purpose of a safety driver? To take control—whether it's steering or braking—in order to prevent an impact the self-driving car cannot. That didn't happen here. Why not? Partially because it was at night and the headlights may not have illuminated Herzberg until it was too late, and partially because the safety driver wasn't paying attention. The safety driver doesn't appear to have applied the brakes until after the impact, further indicating lack of readiness. I'm not convinced this particular "safety" driver could have done better even in daylight. Her eyes are glued to whatever device is in her hand.
The safety driver certainly bears some moral responsibility, and depending on the nature of her employment contract, she may bear some legal responsibility as well.
And that's before we know anything about what kind of training, if any, Uber gives its "safety" drivers.
Oh, did I mention that the driver had a history of traffic violations dating back to 1998? And that Uber claimed she passed all background checks? Uber, you've got a minimum-standards problem.
...