Zombie Story
The story that just...will...not...die.
Controllers Were Slow to Notify Defense Command of Errant Jet
“Air traffic control supervisors delayed nearly an hour in notifying Norad, the military air defense command, that a Northwest Airlines jetliner was not responding to radio calls, the head of the Federal Aviation Administration said Friday.”
I really wonder: in 2-3 months, when the investigation is over, will the story gain any attention then? When it counts? I bet not.
Even Mr. Wald’s (the reporter who wrote the story) attempt to explain the unexplainable sounds weak.
“The incident did not result in any damage or injury. But in a period of worry over distracted driving, it has attracted widespread public attention.”
Whatever. When the report comes out and we are reminded (again) that we don’t have an effective system for passing critical information -- like the fact that an aircraft is NORDO (not talking to anyone on the radio) -- everybody will yawn and the editors will give it a pass. Until we put two of them together, that is. Everybody will care then.
It’s the age-old curse of the safety people. Nobody cares until it is too late. Or, as I used to say, it’s never a problem until it’s a problem. And then it’s really a problem.
From NTSB incident report CHI99IA100A:
“The aircraft came within an estimated 1/2 mile horizontal and 0 feet vertical separation ... ”
“ATC recognized that both airplanes were NORDO but were unable to reestablish radio....”
“...neither airplane was equipped with TCAS.”
Just a half mile and it could have really been a problem. A problem that we would have solved. For those who don’t know, the NTSB used this incident to push for the installation of TCAS on freighters. They -- not without cause -- thought it would add another safety layer to the system. And in a way, it did. But TCAS isn’t perfect. The problem with NORDO is growing and the solution is found where the problem starts -- at the controller’s position.
I hate quoting myself but....
“No person, no machine -- no system -- is infallible. All of them have limitations. You can't continuously push a machine past its limits and not expect it to fail. You can't give people an unusual number of tasks to accomplish on a regular basis and not expect them to fail. We can't make an unlimited number of errors -- even small ones -- and not expect the system to fail. The FAA, in the training video, notes 11 separate "links" (or errors) in the chain of events. Each error was what we tend to think of as a "small" error. But as the magnitude of this event sinks in you realize that there isn't any such thing as a "small" error in this business.”
If we allow the number of NORDOs to increase -- or even remain the same -- without developing a better way of handling and reacting to the information, the system will fail.
Don Brown
November 14, 2009