
View Full Version : Self Driving Cars Prove It's Impossible to Follow Traffic Laws




presence
09-03-2015, 08:05 AM
Google’s Driverless Cars Run Into Problem: Cars With Drivers

By MATT RICHTEL (http://topics.nytimes.com/top/reference/timestopics/people/r/matt_richtel/index.html) and CONOR DOUGHERTY (http://topics.nytimes.com/top/reference/timestopics/people/d/conor_dougherty/index.html), Sept. 1, 2015


A Google self-driving car in Mountain View, Calif. Google cars regularly take the most cautious approach, but that can put them out of step with the other vehicles on the road. Credit Gordon De Los Santos/Google




MOUNTAIN VIEW, Calif. — Google (http://topics.nytimes.com/top/news/business/companies/google_inc/index.html?inline=nyt-org), a leader in efforts to create driverless cars, has run into an odd safety conundrum: humans.

Last month, as one of Google’s self-driving cars (http://www.nytimes.com/2015/05/16/technology/google-to-test-bubble-shaped-self-driving-cars-in-silicon-valley.html) approached a crosswalk, it did what it was supposed to do when it slowed to allow a pedestrian to cross, prompting its “safety driver” to apply the brakes. The pedestrian was fine, but not so much Google’s car, which was hit from behind by a human-driven sedan.

Google’s fleet of autonomous test cars is programmed to follow the letter of the law. But it can be tough to get around if you are a stickler for the rules.




One Google car, in a test in 2009, couldn’t get through a four-way stop because its sensors kept waiting for other (human) drivers to stop completely and let it go. The human drivers kept inching forward, looking for the advantage — paralyzing Google’s robot.
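The paralysis described above falls out of the policy itself. A toy sketch (the function names, speeds, and threshold are all invented for illustration, not Google's actual logic) shows why a "proceed only when everyone else is fully stopped" rule never fires when the other drivers keep creeping:

```python
# Toy model of the four-way-stop deadlock: a strictly rule-following car
# waits until every other car is completely stopped, while human drivers
# keep "inching" forward at a small nonzero speed.

def robot_may_proceed(other_speeds_mps, stop_threshold=0.0):
    """The cautious policy: go only if every other car is at a dead stop."""
    return all(abs(v) <= stop_threshold for v in other_speeds_mps)

def human_inching_speeds(t):
    """Humans creep forward a little at every tick, never fully stopping."""
    return [0.2, 0.3, 0.1]  # metres per second; always slightly positive

# Simulate a minute of one-second decision ticks: the robot never gets a turn.
proceeded = any(robot_may_proceed(human_inching_speeds(t)) for t in range(60))
print(proceeded)  # False: the robot is paralyzed, as the article describes
```

A human driver resolves the same standoff by treating a slow creep as "effectively stopped" and claiming the gap; the strict policy has no such tolerance.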




[Those who would give up essential Liberty,
to purchase a little temporary Safety,
deserve neither Liberty nor Safety. -Ben]





It is not just a Google issue. Researchers in the fledgling field of autonomous vehicles say that one of the biggest challenges facing automated cars is blending them into a world in which humans don’t behave by the book. “The real problem is that the car is too safe,” said Donald Norman, director of the Design Lab at the University of California, San Diego, who studies autonomous vehicles.



Traffic wrecks and deaths could well plummet in a world without any drivers, as some researchers predict. But wide use of self-driving cars is still many years away, and testers are still sorting out hypothetical risks — like hackers — and real world challenges, like what happens when an autonomous car breaks down on the highway.


For now, there is the nearer-term problem of blending robots and humans. Already, cars from several automakers have technology that can warn or even take over for a driver, whether through advanced cruise control or brakes that apply themselves. Uber is working on self-driving car technology, and Google expanded its tests in July to Austin, Tex.


Google cars regularly take quick, evasive maneuvers or exercise caution in ways that are at once the most cautious approach, but also out of step with the other vehicles on the road.


“It’s always going to follow the rules, I mean, almost to a point where human drivers who get in the car and are like ‘Why is the car doing that?’” said Tom Supple, a Google safety driver during a recent test drive on the streets near Google’s Silicon Valley headquarters.


Since 2009, Google cars have been in 16 crashes, mostly fender-benders, and in every single case, the company says, a human was at fault. This includes the rear-end crash on Aug. 20, which Google reported on Tuesday. The Google car slowed for a pedestrian, then the Google employee manually applied the brakes. The car was hit from behind, sending the employee to the emergency room for mild whiplash.


Google’s report on the incident adds another twist: While the safety driver did the right thing by applying the brakes, if the autonomous car had been left alone, it might have braked less hard and traveled closer to the crosswalk, giving the car behind a little more room to stop. Would that have prevented the collision? Google says it’s impossible to say.


There was a single case in which Google says the company was responsible for a crash. It happened in August 2011, when one of its Google cars collided with another moving vehicle. But, remarkably, the Google car was being piloted at the time by an employee. Another human at fault.


Humans and machines, it seems, are an imperfect mix. Take lane departure technology, which uses a beep or steering-wheel vibration to warn a driver if the car drifts into another lane. A 2012 insurance industry study that surprised researchers found that cars with these systems experienced a slightly higher crash rate than cars without them.


Bill Windsor, a safety expert with Nationwide Insurance, said that drivers who grew irritated by the beep might turn the system off. That highlights a clash between how humans actually behave and how the cars interpret that behavior: the car beeps when the driver drifts into another lane, but in reality the driver intends to change lanes and simply hasn't signaled. Irked by the beep, the driver turns the technology off.


Mr. Windsor recently experienced firsthand one of the challenges as sophisticated car technology clashes with actual human behavior. He was on a road trip in his new Volvo, which comes equipped with “adaptive cruise control.” The technology causes the car to automatically adjust its speed when traffic conditions warrant.


But the technology, like Google’s car, drives by the book. It leaves what is considered the safe distance between itself and the car ahead. This also happens to be enough space for a car in an adjoining lane to squeeze into, and, Mr. Windsor said, they often tried.

http://www.nytimes.com/2015/09/02/technology/personaltech/google-says-its-not-the-driverless-cars-fault-its-other-drivers.html?_r=0
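The "safe distance" the article mentions is commonly framed as a time headway: keep a gap proportional to your own speed. A minimal sketch of that idea (the 2-second headway, the floor value, and the cut-in scenario are illustrative assumptions, not Volvo's actual control logic):

```python
# Toy time-headway rule behind adaptive cruise control: target gap grows
# with speed, so at highway speed the gap is wide enough to invite cut-ins.

def desired_gap_m(speed_mps, headway_s=2.0, min_gap_m=5.0):
    """Target following distance: speed times headway, with a minimum floor."""
    return max(speed_mps * headway_s, min_gap_m)

highway_speed = 30.0  # m/s, roughly 65 mph
gap = desired_gap_m(highway_speed)
print(gap)  # 60.0 metres -- room for a car from the next lane to squeeze into

# After a cut-in halves the gap, the controller slows to reopen it,
# which is the by-the-book behaviour Mr. Windsor kept running into.
gap_after_cut_in = gap / 2
must_slow = gap_after_cut_in < desired_gap_m(highway_speed)
print(must_slow)  # True
```

Every cut-in triggers a slowdown to restore the gap, which in dense traffic means the rule-following car is perpetually yielding ground.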

CaptUSA
09-03-2015, 08:14 AM
http://www.ronpaulforums.com/showthread.php?481485-Damned-Humans!

acptulsa
09-03-2015, 08:16 AM
So, why is it every time the mainstream media reports that humans are still more competent and efficient than machines, and computers are still stupid, it comes out sounding like we're all evil bastards and the only thing standing between us and utopia is us?

Which is more efficient--human Chicagoans slipping through a four way stop, each in his turn, right on the bumper of the car that was on our right, after only a 'slight tap on pedal' at the sign, or a bunch of computer cars sitting at the four way stop for five seconds each making sure nothing within sensor range is moving?

Ronin Truth
09-03-2015, 08:44 AM
And they will absolutely never ever get any better. :rolleyes:

nobody's_hero
09-03-2015, 10:58 AM
So in the future, instead of an annoying seatbelt alarm, I'm going to have a car that won't even move unless I put on my seatbelt.

The woods are getting full with all the things I throw in it.

Anti Federalist
09-03-2015, 11:18 AM
Ugh, ya'll can have it.

Anti Federalist
09-03-2015, 11:21 AM
So in the future, instead of an annoying seatbelt alarm, I'm going to have a car that won't even move unless I put on my seatbelt.

The woods are getting full with all the things I throw in it.

My woods are stacked 10 feet high.

No, you will have a car that will not go at all.

"Felony violation of mandatory safety restraint rules and Google Terms of Service.

You are now locked down.

Please sit in your Google brand people-pod quietly, and await SWAT, which will be on scene shortly."

Anti Federalist
09-03-2015, 11:22 AM
I dunno comrade, that human stuff doesn't sound very safe.



So, why is it every time the mainstream media reports that humans are still more competent and efficient than machines, and computers are still stupid, it comes out sounding like we're all evil bastards and the only thing standing between us and utopia is us?

Which is more efficient--human Chicagoans slipping through a four way stop, each in his turn, right on the bumper of the car that was on our right, after only a 'slight tap on pedal' at the sign, or a bunch of computer cars sitting at the four way stop for five seconds each making sure nothing within sensor range is moving?

Slave Mentality
09-03-2015, 07:41 PM
I can't wait until the kiddos figure out that they can paint stop bars across the road to fuck with these things. Remote controlled manbears in the intersection would be cool too.

2young2vote
09-03-2015, 08:04 PM
Ban human drivers. Hundreds of children die every year because of lazy human drivers. You have no right to put others' lives at risk just for the sake of being able to drive yourself. Can you even give one good reason for owning a manually driven car when studies have shown that self-driving vehicles are hundreds of times safer? Yeah, didn't think so..../s

Ahhh, I can see it now. The liberal cause of the future :)