
ASRS Incident Report Analysis
Overview

Introduction

As the design of the modern flight deck evolves, more and more computers are being installed to aid the flightcrew. Accidents and incidents involving automated aircraft have drawn attention to potential problems and concerns with these systems, because safety-critical tasks that were once performed by the pilot are now performed by the automation.

The NASA Aviation Safety Reporting System (ASRS) (http://asrs.arc.nasa.gov) database contains reports of incidents submitted voluntarily and anonymously by pilots and others directly involved in aircraft operations. These reports cover a wide range of aviation-related issues, including many incidents involving an automated flight deck.



Purpose and Scope

The purpose of this research was to use ASRS reports to identify flight deck automation issues and, further, to identify information in the reports that could be considered evidence for those issues. The scope of this research includes ASRS reports submitted by pilots and classified as Federal Aviation Regulations (FAR) Part 121/135 crew/automation interaction incidents. FAR Part 121 covers domestic, flag, and supplemental air carriers and other commercial operators of large aircraft; Part 135 covers air taxi operators.



Procedure

In Phase 1 of our study we obtained incident reports from ASRS; 46,798 reports had been submitted to ASRS at the time of our query. To reduce the full database to incidents involving only automated flight decks, we submitted a query to ASRS requesting reports classified as Part 121/135 crew/automation interaction incidents. This allowed us to focus on incidents involving advanced technology aircraft while excluding those involving conventional and general aviation aircraft. The query returned 591 ASRS reports.
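The selection criterion behind the query can be sketched as a simple predicate over report records. This is an illustrative sketch only: the field names (`far_part`, `category`) and record layout are hypothetical and do not reflect the actual ASRS database schema.

```python
# Hypothetical sketch of the filtering criterion our ASRS query expressed.
# Field names ("far_part", "category") are illustrative, not the real ASRS schema.

def matches_query(report: dict) -> bool:
    """True if a report is a Part 121/135 crew/automation interaction incident."""
    return (
        report.get("far_part") in ("121", "135")
        and report.get("category") == "crew/automation interaction"
    )

# Toy records: only the first satisfies both criteria.
reports = [
    {"id": 1, "far_part": "121", "category": "crew/automation interaction"},
    {"id": 2, "far_part": "91",  "category": "crew/automation interaction"},
    {"id": 3, "far_part": "135", "category": "weather encounter"},
]

selected = [r for r in reports if matches_query(r)]
```

Applying both criteria together is what excludes general aviation (e.g., Part 91) reports and non-automation incidents in one pass.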

We reviewed the narrative sections of the 591 ASRS reports for citations of problems or concerns related to automation and recorded them in a database along with similar citations from other sources. The narrative section of an ASRS report is the part in which the reporter describes the incident in his or her own words. We looked specifically for excerpts in which the reporter claimed that some automation problem existed or expressed some concern about automation. In identifying these citations, we did not ensure that the reporter claimed the problem or concern contributed to the incident in question; in this phase of the research we were merely attempting to identify issues. This initial analysis identified 282 reports citing automation issues.

In Phase 2 we reviewed these 282 reports for excerpts clearly stating that the cited issue contributed to the incident, and recorded those excerpts in our database. We found that 282 of the incident reports cited an automation issue that contributed to the incident.

In Phase 3, we identified additional incident reports classified as Part 121/135 crew/automation interaction incidents that were submitted between January 1997 and December 1998, a total of 1,671 reports. We reviewed these 1,671 reports for excerpts clearly stating that the cited issue contributed to the incident, and likewise recorded those excerpts in our database. We found that 608 of those reports contained information related to automation issues.



Results

We found evidence related to flight deck automation issues in a total of 890 incident reports: the 282 reports reviewed in Phase 2 and the 608 reports reviewed in Phase 3. A total of 64 separate issues were supported by this evidence. Details are presented in the following table. For each issue, we list its issue identifier, the abbreviated issue statement, the number of the 890 incident reports supporting the issue, and the percentage that number represents. A complete set of supporting incident report narratives is available for each issue.

Abbreviated Issue Statement | Number of Reports Supporting Issue | Percentage of Reports Supporting Issue
automation behavior may be unexpected and unexplained (Issue #108) | 271 | 22%
pilots may be overconfident in automation (Issue #131) | 262 | 22%
automation may demand attention (Issue #102) | 67 | 6%
understanding of automation may be inadequate (Issue #105) | 41 | 3%
programming may be susceptible to error (Issue #170) | 41 | 3%
mode transitions may be uncommanded (Issue #044) | 35 | 3%
database may be erroneous or incomplete (Issue #110) | 31 | 3%
failure assessment may be difficult (Issue #025) | 28 | 2%
displays (visual and aural) may be poorly designed (Issue #092) | 28 | 2%
automation use may be vulnerable to cockpit distractions (Issue #171) | 28 | 2%
automation may adversely affect pilot workload (Issue #079) | 26 | 2%
controls of automation may be poorly designed (Issue #037) | 21 | 2%
mode selection may be incorrect (Issue #145) | 21 | 2%
crew coordination problems may occur (Issue #084) | 18 | 1%
situation awareness may be reduced (Issue #114) | 17 | 1%
training may be inadequate (Issue #133) | 17 | 1%
automation may lack reasonable functionality (Issue #109) | 16 | 1%
false alarms may be frequent (Issue #070) | 15 | 1%
flightdeck automation may be incompatible with ATC system (Issue #082) | 15 | 1%
data entry and programming may be difficult and time consuming (Issue #112) | 15 | 1%
crew assignment may be inappropriate (Issue #142) | 13 | 1%
pilots may under-rely on automation (Issue #146) | 13 | 1%
pilots may over-rely on automation (Issue #106) | 11 | 1%
pilots may be reluctant to assume control (Issue #026) | 10 | 1%
automation performance may be limited (Issue #126) | 10 | 1%
automation may not work well under unusual conditions (Issue #150) | 10 | 1%
pilots may lack confidence in automation (Issue #046) | 8 | 1%
both pilots' attention simultaneously diverted by programming (Issue #075) | 8 | 1%
mode awareness may be lacking (Issue #095) | 8 | 1%
standardization may be lacking (Issue #138) | 8 | 1%
failure recovery may be difficult (Issue #023) | 7 | 1%
monitoring requirements may be excessive (Issue #005) | 6 | <1%
manual operation may be difficult after transition from automated control (Issue #055) | 6 | <1%
cross checking may be difficult (Issue #072) | 6 | <1%
company automation policies and procedures may be inappropriate or inadequate (Issue #166) | 6 | <1%
insufficient information may be displayed (Issue #099) | 5 | <1%
workarounds may be necessary (Issue #107) | 5 | <1%
inadvertent autopilot disengagement may be too easy (Issue #123) | 5 | <1%
automation may be too complex (Issue #040) | 4 | <1%
manual skills may be lost (Issue #065) | 4 | <1%
data entry errors on keyboards may occur (Issue #071) | 4 | <1%
transitioning between aircraft may increase errors (Issue #130) | 4 | <1%
automation information in manuals may be inadequate (Issue #140) | 4 | <1%
automation use may slow pilot responses (Issue #161) | 4 | <1%
protections may be lost though pilots continue to rely on them (Issue #015) | 3 | <1%
behavior of automation may not be apparent (Issue #083) | 3 | <1%
task management may be more difficult (Issue #167) | 3 | <1%
new tasks and errors may exist (Issue #089) | 2 | <1%
automation level decisions may be difficult (Issue #103) | 2 | <1%
operational knowledge may be lacking in design process (Issue #121) | 2 | <1%
inter-pilot communication may be reduced (Issue #139) | 2 | <1%
automation requirements may conflict (Issue #160) | 2 | <1%
failure modes may be unanticipated by designers (Issue #024) | 1 | <1%
scan pattern may change (Issue #038) | 1 | <1%
interface may be poorly designed (Issue #039) | 1 | <1%
data re-entry may be required (Issue #049) | 1 | <1%
deficiencies in basic aircraft training may exist (Issue #063) | 1 | <1%
human-centered design philosophy may be lacking (Issue #100) | 1 | <1%
testing may be inadequate (Issue #115) | 1 | <1%
automation may use different control strategies than pilots (Issue #122) | 1 | <1%
automation skills may be lost (Issue #137) | 1 | <1%
instructor training requirements may be inadequate (Issue #143) | 1 | <1%
non-automated pilot tasks may not be integrated (Issue #153) | 1 | <1%
planning requirements may be increased (Issue #158) | 1 | <1%
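The per-issue counts above sum to more than 890 because a single incident report can supply evidence for several issues. The bookkeeping behind such a tally can be sketched as counting (report, issue) evidence pairs; the record format and identifiers here are hypothetical, not the study's actual database layout.

```python
from collections import Counter

# Hypothetical evidence records: one (report_id, issue_id) pair per
# narrative excerpt that supports an issue. A single report may appear
# under several issues, so per-issue counts can sum past the number of
# distinct reports. Identifiers are made up for illustration.
evidence = [
    ("ACN-100", "#108"),
    ("ACN-100", "#131"),  # same report supports a second issue
    ("ACN-101", "#108"),
    ("ACN-102", "#102"),
]

# Reports supporting each issue (the table's middle column).
reports_per_issue = Counter(issue for _, issue in evidence)

# Distinct reports with any automation-issue evidence (the 890 figure
# in the study corresponds to this kind of count).
distinct_reports = len({report for report, _ in evidence})
```

With these toy records, issue #108 is supported by two reports while only three distinct reports carry evidence, which mirrors how 64 issue counts can total well above 890 reports.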



Discussion

Our results indicate that while pilots are generally very accepting of flight deck automation, there is evidence to suggest that they do have concerns about its design, function, and use. Furthermore, these concerns are justified in that the characteristics of automation underlying them do contribute to incidents.

We recognize that ASRS reports are submitted on a voluntary basis and are subject to self-reporting bias. However, incident reports are perhaps the only source of information on line operations that is open to this kind of analysis, and as such they offer important insights into the operational environment.


  Last update: 12 July 2007 Flight Deck Automation Issues Website  
© 1997-2013 Research Integrations, Inc.