As vehicles become increasingly autonomous, it is more critical than ever to capitalize on the lessons of accident analyses of aviation automation conducted between 1970 and 2000.
Valerie Gawron of MITRE wrote a series of seven articles on this topic. Each article covers a distinct topic, drawing on research and accident analyses conducted before 2000 to bring the lessons of the past to the present. The series is useful for anyone researching or designing the future of autonomy, whether for commercial aviation or surface transportation.
Below, we summarize the seven documents and their main takeaways so our members can easily find their way around the series.
Definition: In this article, Gawron surveys definitions of automation, starting with the seminal one from the human factors literature (Warren, 1956), moving through the 10 levels of automation from Parasuraman, Sheridan, and Wickens (2000), and ending with the basic definitions offered by the Society of Automotive Engineers (SAE, 2013) and the National Highway Traffic Safety Administration (NHTSA, 2014), along with the human factors issues they raise. https://www.mitre.org/sites/default/files/2021-11/pr-16-3426-lessons-lost-automation-in-aviation-definition-of-automation.pdf
Accident Analysis: Because automation is used so widely in aviation, it is critical to understand its relationship to accidents. This article covers accidents caused by automation, drawing on two reviews: one by the Federal Aviation Administration (FAA) Human Factors Team (1996) and one conducted under a National Aeronautics and Space Administration (NASA) contract as part of the Aviation Safety Reporting System (ASRS). In addition, a short article in Air Safety Week (2001) offers a partial list of aviation accidents involving human error related to automation. https://www.mitre.org/sites/default/files/2021-11/pr-16-3426-lessons-lost-accident-analysis.pdf
The Watching: This article reviews the vigilance research, beginning with the performance decrement observed in World War II radar operators over long periods of sustained attention. The decrement was greatest when signals were brief, infrequent, simple, or weak. Several theories were introduced to explain it, including the theory of signal detection (TSD) by Swets, Tanner, & Birdsall (1961); a brief illustration of TSD's sensitivity measure follows the article summaries below. Vigilance decrement has been used to explain the NTSB accident listed in the accident analysis article above. https://www.mitre.org/sites/default/files/2021-11/pr-16-3426-lessons-lost-the-watching-a-review-of-the-vigilance-research.pdf
The Doing: Maintaining a skill is difficult when one is expected to be able to perform it without ever needing it in daily life. This is a regular problem with automated systems, where manual skills are rarely, if ever, exercised and therefore atrophy. https://www.mitre.org/sites/default/files/2021-11/pr-16-3426-lessons-lost-the-doing-review-skill-retention-research.pdf
Is Something Wrong: This article reviews the failure detection research on human observers who passively monitor an automated system for malfunctions. The findings stand in contrast to the expectation that such an observer can jump in with critical decision-making skills in a crisis. https://www.mitre.org/sites/default/files/2021-11/pr-16-3426-lessons-lost-is-something-wrong-failure-detection-research.pdf
Nothing Can Go Wrong: This article reviews the research on automation-induced complacency. The ASRS defines complacency as “self-satisfaction, which may result in non-vigilance based on an unjustified assumption of satisfactory system state.” We have recently seen accidents involving semi-automated vehicles in which the driver overtrusted the car’s capabilities. https://www.mitre.org/sites/default/files/2021-11/pr-16-3426-lessons-lost-nothing-can-go-wrong-automation-induced-complacency.pdf
Guidelines: Automation has been used to increase safety, enhance productivity, and reduce workload, but it also has drawbacks. One of these is a system failure that suddenly forces the operator to take over from the automation. This article examines the vulnerabilities of automation and how to counter them. https://www.mitre.org/sites/default/files/2021-11/pr-16-3426-lessons-lost-automation-in-aviation-guidelines.pdf
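For readers less familiar with the theory of signal detection mentioned under The Watching, here is a minimal sketch of its core sensitivity measure, d′ (the standardized separation between the signal and noise distributions). The counts below are hypothetical and not drawn from Gawron’s articles; a drop in d′ from early to late in a watch period is one common way the vigilance decrement is quantified.

```python
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity index d' from signal detection theory:
    d' = Z(hit rate) - Z(false-alarm rate), where Z is the
    inverse of the standard normal CDF."""
    hit_rate = hits / (hits + misses)
    fa_rate = false_alarms / (false_alarms + correct_rejections)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Hypothetical counts for early vs. late in a monitoring session.
early = d_prime(hits=45, misses=5, false_alarms=5, correct_rejections=45)
late = d_prime(hits=30, misses=20, false_alarms=5, correct_rejections=45)
print(f"d' early in the watch: {early:.2f}, late in the watch: {late:.2f}")
```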
A big thank you to Valerie Gawron for sharing her recent research with us.
Do you have an interesting research project to share with ASTG members? Get in touch and we will feature it in a future article!