Drive free or Die: how automated transportation technology may end up killing you.




Anti Federalist
12-12-2011, 10:26 PM
I'll run the risk, and keep my hand on the wheel, or the stick, or the rudder, as the case may be.

This is not the first Airbus (all of which are fly-by-wire) to crash due in part to a system failure.


The secret danger of safety technology

http://jalopnik.com/5866678/the-secret-danger-of-safety-technology

The chilling transcripts from Air France 447 reveal that the central conflict dooming the 228 people aboard the plane was not between two inexperienced pilots, but between a pilot and technology that was supposed to keep him safe and that he didn't understand.

There's a serious lesson in this, not only for the aviation industry, but for automakers who hope to make cars safer by taking the controls away from drivers: Don't do it.

An incredibly detailed analysis of the recently released transcripts, assembled by Popular Mechanics, makes a credible case that one of the two junior officers in charge of an Air France flight that crashed in June 2009 believed he couldn't crash the plane, and thus made poor decisions because he misunderstood the complex systems designed to protect the aircraft.

Junior officer Pierre-Cédric Bonin flew the Airbus A330 into the middle of an intense thunderstorm when, suddenly, the airspeed-measuring pitot tubes on the exterior of the airplane iced over. When this happens, the autopilot disengages and the flight-control system drops from its most protected mode ("normal law") to one that puts more control, and more responsibility, in the hands of the pilot ("alternate law").

Other than losing a sensor, the plane was completely capable of flying. What caused the crash wasn't the failure of the sensor, but Bonin's reaction to it. He pulled back on the side stick, which caused the plane to climb. A stall warning sounded, and kept sounding, but the pilots never reacted to it.

Assuming they were in a mode that made it nearly impossible to crash the plane, the pilots responded to the loss of sensor data not irrationally, but counterintuitively: trying to climb when they should have pointed the nose down to regain airspeed, and ignoring the fact that they were falling toward the ocean at high speed.
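To make that mode confusion concrete, here is a much-simplified sketch of the handoff described above. The mode names ("normal law", "alternate law") are real Airbus terms, but the function, the numbers, and the stall threshold are invented purely for illustration; this bears no resemblance to the real control laws.

# A much-simplified sketch of the mode change described above. The mode
# names are real Airbus terms; every number and function here is invented
# for illustration only.

STALL_AOA_DEG = 15.0  # illustrative stall angle of attack, in degrees

def next_angle_of_attack(current_aoa_deg, stick_back, airspeed_valid):
    """Return the angle of attack after one control step.

    stick_back: 0.0 (neutral) to 1.0 (full back stick)
    airspeed_valid: False once the pitot tubes ice over
    """
    demanded = current_aoa_deg + 10.0 * stick_back
    if airspeed_valid:
        # "Normal law": envelope protection caps the angle of attack,
        # so even full back stick cannot stall the airplane.
        return min(demanded, STALL_AOA_DEG - 3.0)
    # "Alternate law": the protections are gone. The same input now
    # passes straight through, and holding the stick back will push
    # the wing past the stall.
    return demanded

# The same habit, two very different outcomes:
aoa = 5.0
for _ in range(3):
    aoa = next_angle_of_attack(aoa, stick_back=1.0, airspeed_valid=True)
print(aoa)   # 12.0: capped safely below the stall

aoa = 5.0
for _ in range(3):
    aoa = next_angle_of_attack(aoa, stick_back=1.0, airspeed_valid=False)
print(aoa)   # 35.0: far past the stall, with nothing left to stop it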

Pop Mech ends the transcript with this important and unnerving question:


But the crash raises the disturbing possibility that aviation may well long be plagued by a subtler menace, one that ironically springs from the never-ending quest to make flying safer.

[...]

While the airplane's avionics track crucial parameters such as location, speed, and heading, the human beings can pay attention to something else. But when trouble suddenly springs up and the computer decides that it can no longer cope - on a dark night, perhaps, in turbulence, far from land - the humans might find themselves with a very incomplete notion of what's going on. They'll wonder: What instruments are reliable, and which can't be trusted? What's the most pressing threat? What's going on?

Aren't we doing the same thing with cars? Aren't automakers trying to take away the basic controls from the driver and, thus, situational awareness?

I'm not talking about self-driving vehicles, which aren't that far off, although it's worth noting that the first Google self-driving car crash occurred after the vehicle switched from self-driving to human driving.

The best example of this now is adaptive cruise control, which uses radar or laser sensors to let the driver "follow" other cars without adjusting throttle or brake inputs. Basically, the driver sets a desired speed and pulls in behind another vehicle. If they select 65 mph and pull in behind a vehicle traveling below 65, their car will slow to match the lead car's speed once it closes to a set following distance.

In practice, it works quite well 90% of the time. Unfortunately, because the sensor beams shoot straight ahead in most vehicles, a lead car going around a bend reads as "no car", and the system leaps forward. This leaves drivers thinking their adaptive cruise control is broken, and it makes the car accelerate dangerously (usually mid-curve).
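Here's a minimal sketch of that follow logic and its failure branch. It's an illustration under assumed names and numbers (including the 40 m following gap), not any manufacturer's actual control law.

# Minimal sketch of the adaptive-cruise-control behavior described above.
# Illustrative only: the function, parameter names, and following gap
# are assumptions, not any real system's logic.

def acc_target_speed(set_speed_mph, lead_speed_mph, gap_m, follow_gap_m=40.0):
    """Return the speed the cruise control will try to hold.

    lead_speed_mph is None when the sensor reports "no car" ahead.
    """
    if lead_speed_mph is None:
        # No lead car detected: hold the driver's set speed. This is the
        # dangerous branch: a straight-ahead beam loses the lead car in a
        # curve, reads "no car", and accelerates back to set speed mid-bend.
        return set_speed_mph
    if gap_m <= follow_gap_m:
        # Close enough to follow: match the lead car, never exceed set speed.
        return min(lead_speed_mph, set_speed_mph)
    # Lead car seen but still far away: keep closing at the set speed.
    return set_speed_mph

print(acc_target_speed(65, 55, gap_m=30.0))     # 55: following the lead car
print(acc_target_speed(65, None, gap_m=0.0))    # 65: lead "lost" in a bend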

Apple co-founder Steve Wozniak believed his Toyota Prius was faulty, and many thought he just didn't understand how the cruise control worked.

If that's the case, and I don't know for sure, what does it say that someone so technologically aware and advanced doesn't fully understand how the system works? I don't think it's the fault of the driver.

I think it's the fault of the company that makes a system that takes away basic information and control from the driver. The system works extremely well when it is fully in control, but when it loses a key piece of information and requires an input from a human — just like in the tragic Air France flight — things can go terribly wrong.

I'm not anti-technology. I think technology can enhance and improve safety. Driverless cars and similar tech are designed primarily as safety equipment. Great. But instead of worrying about whether we've designed an autonomous system smart enough to drive without crashing when engaged, we need to worry about whether we've designed autonomous systems smart enough not to make us dumb when they disengage.

Informed technology and uninformed users can be a deadly combination.

Kylie
12-12-2011, 10:48 PM
I will keep my manual-transmission, non-OnStar-having car, thank you very much. Not only will it not kill me, but most people won't ask to borrow it, since they can't drive a stick shift.

Winning!

Rael
12-13-2011, 01:20 AM
Their problem was using Otto The Inflatable Autopilot


http://www.youtube.com/watch?v=N4Ox4cyOxWA