If you fly in the United States, you’ve likely heard of the Aviation Safety Action Program (ASAP) and the Aviation Safety Reporting System (ASRS). They are perhaps best known for offering robust legal protection to an employee who inadvertently violates a regulation.
But while many know about the legal protections these programs offer, immunity is only one of the many benefits provided to both frontline personnel and operators.
To truly appreciate these programs’ vital work, we should first look back at how reporting systems like these came to be. As with many other advances in aviation over the years, their birth, too, was written in blood.
It is the winter of 1974, and two separate flights encounter the same confusion. One aircraft narrowly misses a collision with a mountain and lands safely; the other crashes, killing everyone on board.
In October 1974, United Airlines management received a safety report that would later reshape the aviation landscape (they just didn’t know it at the time). One of their flight crews had inadvertently descended below the minimum safe altitude (MSA) on arrival at Washington Dulles Airport (KIAD). Despite flying dangerously close to terrain, the aircraft landed safely.
In the report, the crew explains that they were approaching Dulles Airport from the west. From this direction, the flight path takes the aircraft directly over Virginia’s Blue Ridge mountain range, which was obscured by night.
As the aircraft descends through 4000 feet, air traffic control changes the assigned runway and instructs the United crew to fly the approach to Runway 12 instead (Image 1.1). The captain reviews the chart for the new approach and comments that they must be at 1800 feet at the final approach fix. Complying with what they think is correct, they descend immediately to that altitude and land without incident.
However, it was not until after landing that they realized their error.
The crew had descended to 1800 feet before crossing the mountain range (depicted as “Round Hill” on the approach chart). They hadn’t noticed the text depicting the minimum crossing altitude of 3400 feet and had flown 1600 feet too low. It was by sheer luck that they did not hit terrain.
The crew followed company procedure by filing a formal internal safety report with their airline. In it, the captain explains that he believed he could descend immediately to the final approach fix altitude once cleared for the approach under radar control (a common misconception at the time, as radar was relatively new).
The airline took the case to the FAA, and after some investigation it was closed. A few reminders were sent out to other pilots within the airline, but no industry-wide policy changes or recommendations were ever issued.
Addressed as an isolated incident, it seemed harmless enough. But no one could have guessed that something more significant, more systemic, was going on.
It is now December, six weeks later, and TWA Flight 514 is approaching Washington, DC from the northwest. This time, low clouds and fog obscure the mountains.
Strong winds have battered Washington National (KDCA), forcing the flight to divert to Dulles. The aircraft is 40 miles northwest of the airport when ATC clears the crew for the approach to Runway 12. The first indication of trouble is this clearance itself, issued so far from the airport.
Just as with the United crew, the words “cleared for the approach” prompt the captain to erroneously descend to the final approach fix altitude. Noticing the 1800 feet on the chart, he says, “Eighteen hundred is the bottom,” to which the first officer replies, “Start down.” They are still on the wrong side of the mountain range.
The second clue that something is amiss comes when the flight engineer says, "We're out here quite a ways...better turn the heat off.” For a moment, it seems the crew will recognize that their descent is premature, but they convince themselves otherwise.
Captain: “You know, according to this dumb sheet, it says thirty-four hundred [feet] to Round Hill --- is our minimum altitude.”
But then all three crew members go on to agree that ATC wouldn’t have cleared them for the approach if it weren’t safe to descend straight to the final altitude.
Captain: "When he clears you, that means you can go to your ---"
Flight Engineer: "Initial approach.”
First Officer: "Yeah!"
Captain: "Initial approach altitude."
The aircraft levels off at 1800 feet, buffeted by turbulence.
Captain: "I had ground contact a minute ago."
First Officer: "Yeah, I did too."
At 1109:14, the CVR records the radio altimeter warning, indicating that they’re only 500 feet above ground level (AGL).
First Officer: "Boy!"
Captain: "Get some power on!!"
The radio altimeter warning horn sounds again, indicating only 100 feet AGL. Two seconds later, the sound of impact is recorded.
The aircraft flew through trees before crashing into a rocky outcrop on Mount Weather at a speed of 230 knots. No one survived.
According to the investigation, a major contributing factor was: “The failure of the FAA to take timely action to resolve the confusion and misinterpretation of air traffic terminology although the Agency had been aware of the problem for several years" (NTSB Report AAR-75-16).
When this accident occurred, ASRS and ASAP did not exist, and safety communication was predominantly vertical: safety information was passed up from the frontline to managers and regulators, and feedback was sent back down from managers and regulators to the frontline. What was missing was horizontal communication between neighboring organizations and peers.
The FAA realized that had there been the infrastructure to nurture horizontal communication and incident trend monitoring, perhaps the TWA accident could have been avoided with data gathered from the previous United Airlines flight.
This was the pivotal moment when the first confidential, voluntary reporting program in the US was created: ASRS. Years later, ASAP followed.
ASAP’s success can also be partly attributed to its being a trusted system. Filing an ASAP report on an occurrence in which federal regulations were violated can grant the reporter immunity (provided the action was not intentionally reckless or criminal). But it is not simply a get-out-of-jail program – individuals who self-disclose their errors or proactively report hazards may save lives or even influence policy change somewhere down the line.
While hangar talk may be slightly embellished, ASAP provides the most official form of storytelling there is. Under ASAP, each report is de-identified, and data can be shared within the flight department and with other operators through the Aviation Safety Information Analysis and Sharing (ASIAS) program.
This allows people at other operators to learn about dangerous problems in the system and fix them. Think of your report – no matter how small the event or how insignificant the close call may seem – as giving another crew or frontline worker a heads-up.
In a nutshell, the best way to utilize an ASAP (or ASRS) program in your flight department is simple: report every event, no matter how minor it seems.
The tragic and avoidable TWA accident illustrates that every close call has the potential to evolve into something more sinister. Aviation in the US is as safe as it has ever been, but that is precisely why safety efforts must remain unified.
New problems that have not yet been encountered will arise, and old ones we thought long forgotten may rear their ugly heads. Thankfully, through ASAP and the other voluntary reporting programs, we can continue to communicate and report, facilitating learning and improvement industry-wide.
To read more about the other FAA voluntary reporting programs, click here. Not sure what is right for your flight department? Read our blog ASAP or ASRS: Which Is Best For My Flight Department? For information on voluntary reporting programs in other countries or regions, read here.