
Primary Taxonomy of Flightdeck Automation Problems and Concerns

Following is the primary taxonomy used to organize the perceived problems and concerns we identified in our study of flightdeck automation (see the Phase 1 Report). Each category is denoted by a J- (justification), D- (design), or U- (use) code and described by an assertion about automation. Each specific perceived problem or concern is denoted by a pc-code and described with a short phrase. Each category or problem/concern is followed by the count of the citations falling under that category or classified as that problem/concern, and the percentage of total citations that count represents, in the form [count %]. Statistics for higher level categories are inclusive of lower level categories.
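
To make the [count %] notation and the inclusive roll-up concrete, here is a minimal illustrative sketch (not part of the original study). It assumes a total of 2428 citations, inferred from the three top-level counts (3 + 1084 + 1341); the rounding of very small shares on this page is not perfectly consistent, so the formatting rule below follows the convention used in most pc lines.

    # Minimal sketch; assumption: total citations = J0 + D0 + U0 = 2428.
    TOTAL_CITATIONS = 3 + 1084 + 1341

    def pct(count: int) -> str:
        """Format a citation count the way the taxonomy does: [count %]."""
        share = 100.0 * count / TOTAL_CITATIONS
        return f"[{count} <0.1%]" if share < 0.1 else f"[{count} {share:.1f}%]"

    # A parent category's count is the sum of its descendants' counts,
    # e.g. D1.4 ("Automation may be too complex") rolls up four pc entries:
    d1_4_children = {"pc40": 46, "pc45": 17, "pc124": 4, "pc128": 6}
    d1_4_count = sum(d1_4_children.values())  # 73

    print("D1.4", pct(d1_4_count))  # -> D1.4 [73 3.0%]
    print("pc40", pct(46))          # -> pc40 [46 1.9%]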


J0: The justification of automation may be economy, not safety. [3 0.1%]

J1: Automation may exist primarily due to commercial incentives. [3 0.1%]
pc127 commercial incentives may dominate [3 0.1%]

D0: Automation may be poorly designed. [1084 44.6%]

D1: Automation function and logic may be poorly designed. [471 19.4%]

D1.1: Automation may lack the functionality or performance desired by pilots. [91 3.7%]
pc109 automation may lack reasonable functionality [16 0.7%]
pc107 workarounds may be necessary [10 0.4%]
pc120 automation operation may be based on few variables [8 0.3%]
pc125 design specifications may be inadequate [18 0.7%]
pc126 automation performance may be limited [13 0.5%]
pc150 automation performance may be reduced at margins of envelope [5 0.2%]
pc121 operational knowledge may be lacking in design process [10 0.4%]
pc115 testing may be inadequate [10 0.4%]
pc160 automation requirements may conflict [1 <0.1%]

D1.2: Automation may fail to perform according to pilot expectations. [131 5.4%]
pc108 automation behavior may be unexpected and unexplained [124 5.1%]
pc24 failure modes may be unanticipated by designers [7 0.3%]

D1.3: Automation may not control the aircraft the way pilots do. [3 0.1%]
pc122 automation may use different control strategies than pilots [3 0.1%]

D1.4: Automation may be too complex. [73 3.0%]
pc40 automation may be too complex [46 1.9%]
pc45 modes may proliferate [17 0.7%]
pc124 automation may be too complex and tightly coupled [4 0.2%]
pc128 complex automation may have overly simplistic interface [6 0.2%]

D1.5: Automation design may not be human-centered. [25 1.0%]
pc100 human-centered design philosophy may be lacking [23 0.9%]
pc153 non-automated pilot tasks may not be integrated [1 <0.1%]
pc117 function allocation may be difficult [1 <0.1%]

D1.6: Automation may usurp pilot authority. [74 3.0%]
pc22 communication between computers may be unsupervised [3 0.1%]
pc44 mode transitions may be uncommanded [32 1.3%]
pc58 envelope protections may limit pilot authority [6 0.2%]
pc43 disengagement may be impossible [5 0.2%]
pc12 pilots have responsibility but may lack authority [28 1.2%]

D1.7: Automation protections which pilots rely upon can be lost. [1 <0.1%]
pc15 protections may be lost though pilots continue to rely on them [1 <0.1%]

D1.8: Automation may not be standardized. [51 2.1%]
pc138 standardization may be lacking [49 2.0%]
pc134 software versions may proliferate [1 <0.1%]
pc149 similarity may be superficial [1 <0.1%]

D1.9: Automation may be poorly integrated. [14 0.6%]
pc11 automation integration may be poor [14 0.6%]

D1.10: Automation documentation may be inadequate. [8 0.3%]
pc140 printed media may be inadequate [8 0.3%]

D2: Pilot/automation interfaces may be poorly designed. [564 23.2%]
pc39 interface may be poorly designed [73 3.0%]
pc112 programming may be difficult [26 1.1%]
pc96 data input prompts may be poor [2 <0.1%]

D2.1: Automation controls may be poorly designed. [154 6.3%]
pc37 controls of automation may be poorly designed [66 2.7%]
pc71 data entry errors on keyboards may occur [71 2.9%]
pc94 data entry format may be inflexible [10 0.4%]
pc49 data reentry may be required [1 <0.1%]
pc123 inadvertent autopilot disengagement may be too easy [6 0.2%]

D2.2: Automation displays may be poorly designed. [309 12.7%]
pc92 displays may be poorly designed [74 3.0%]
pc47 data access may be difficult [6 0.2%]
pc9 information integration may be required [15 0.6%]
pc162 auditory displays may be poorly designed [5 0.2%]

D2.2.1: Automation may obscure its own mode (state) and behavior from pilots. [156 6.4%]
pc83 behavior of automation may not be apparent [41 1.7%]
pc51 feedback may be poor [30 1.2%]
pc30 sidesticks may not be coupled [4 0.2%]
pc53 vertical profile visualization may be difficult [3 0.1%]
pc95 mode awareness may be lacking [72 3.0%]
pc147 pilots may misunderstand automation intent [6 0.2%]

D2.2.2: Automation may obscure situation information from pilots. [22 0.9%]
pc32 trend information may be lacking [3 0.1%]
pc152 state prediction may be lacking [4 0.2%]
pc87 data may be too abstract [3 0.1%]
pc28 maintenance information may be inaccessible [3 0.1%]
pc99 insufficient information may be displayed [9 0.4%]

D2.2.3: Automation may provide too much information. [31 1.3%]
pc14 information overload may exist [31 1.3%]

D3: Automation may not be compatible with the ATC system. [49 2.0%]
pc82 flightdeck automation may be incompatible with ATC system [36 1.5%]
pc157 automation may conflict with ATC [11 0.5%]
pc148 traffic coordination requirements may increase [1 <0.1%]

D4: Cultural differences may not be considered in the design of automation. [1 <0.1%]
pc165 cultural differences may not be considered [1 <0.1%]

U0: The use of automation may lead to problems. [1341 55.2%]

U1: The fact that automation is used may lead to problems. [568 23.4%]

U1.1: The fact that automation is used may lead to problems for pilots. [547 22.5%]

U1.1.1: Pilots may not perform as well when using automation. [1 <0.1%]
pc161 automation use may slow pilot responses [1 <0.1%]

U1.1.2: Pilots may have difficulty assuming control from automation. [12 0.5%]
pc55 manual operation may be difficult after transition from automated control [12 0.5%]

U1.1.3: Pilots may have difficulty recovering from automation failures. [16 0.7%]
pc23 failure recovery may be difficult [16 0.7%]

U1.1.4: Pilot roles may be different in automated aircraft. [74 3.0%]
pc144 pilot's role may be changed [15 0.6%]
pc136 pilot selection may be more difficult [2 0.1%]
pc132 older pilots may be less accepting of automation [16 0.7%]
pc13 job satisfaction may be reduced [9 0.4%]
pc116 automation may be overemphasized in pilot evaluation [1 <0.1%]
pc89 new tasks and errors may exist [31 1.3%]

U1.1.5: Pilots may be out of the control loop when they use automation. [51 2.1%]
pc2 pilots may be out of the loop [51 2.1%]

U1.1.6: Pilots may place too much confidence in automation. [160 6.6%]
pc131 pilots may be overconfident in automation [26 1.1%]
pc3 pilots may become complacent [100 4.1%]
pc90 pilots may be uncritical of automation actions [22 0.9%]
pc26 pilots may be reluctant to assume control [12 0.5%]

U1.1.7: Pilots may abdicate responsibility to automation. [2 0.1%]
pc163 pilots may abdicate responsibility to automation [2 <0.1%]

U1.1.8: Pilots may use automation when they should not. [61 2.5%]
pc106 pilots may overrely on automation [61 2.5%]

U1.1.9: Pilots may not place enough confidence in automation. [67 2.8%]
pc46 pilots may lack confidence in automation [30 1.2%]
pc70 false alarms may be frequent [37 1.5%]

U1.1.10: Pilots may not use automation when they should. [8 0.3%]
pc146 pilots may underrely on automation [8 0.3%]

U1.1.11: Pilots of automated aircraft may not acquire or maintain manual skills. [91 3.7%]
pc7 manual skills may not be acquired [1 <0.1%]
pc65 manual skills may be lost [89 3.7%]
pc38 scan pattern may change [1 <0.1%]

U1.1.12: Pilots may lose automation skills if they do not regularly use automation. [2 0.1%]
pc137 automation skills may be lost [2 <0.1%]

U1.1.13: Pilots may experience more fatigue in automated aircraft. [2 0.1%]
pc156 fatigue may be induced [2 <0.1%]

U1.2: The fact that automation is used may lead to problems for airlines. [21 0.9%]

U1.2.1: Airlines may not adequately involve pilots in equipment selection. [2 0.1%]
pc141 pilots may not be involved in equipment selection [2 <0.1%]

U1.2.2: Airline automation policies and procedures may be inadequate. [8 0.3%]
pc101 automation use philosophy may be lacking [3 0.1%]
pc151 procedures may assume automation [1 <0.1%]
pc159 use may be required by company [4 0.2%]

U1.2.3: Airlines may assign two low-automation-time pilots to a crew. [11 0.5%]
pc142 crew assignment may be inappropriate [11 0.5%]

U2: The use of automation function and logic may lead to problems. [568 23.4%]

U2.1: The use of automation function and logic may lead to problems for pilots. [419 17.3%]

U2.1.1: Pilot workload may be increased by automation. [37 1.5%]
pc164 automation may increase workload [33 1.4%]
pc158 planning requirements may be increased [2 <0.1%]
pc119 information processing load may be increased [2 <0.1%]

U2.1.2: Pilot workload may not be optimized by automation. [46 1.9%]
pc79 automation may adversely affect pilot workload [46 1.9%]

U2.1.3: Pilots may focus too much attention on automation. [198 8.2%]
pc102 automation may demand attention [174 7.2%]
pc5 monitoring requirements may be excessive [24 1.0%]

U2.1.4: Pilots may have difficulty with automation complexity. [122 5.0%]

U2.1.4.1: Pilots may not understand automation adequately. [101 4.2%]
pc105 understanding of automation may be inadequate [85 3.5%]
pc41 automation interaction may be misunderstood [16 0.7%]

U2.1.4.2: Pilots may have difficulty deciding how much automation to use. [4 0.2%]
pc103 automation level decisions may be difficult [4 0.2%]

U2.1.4.3: Pilots may make mode selection errors. [17 0.7%]
pc145 mode selection may be incorrect [17 0.7%]

U2.1.5: Pilots may have difficulty transitioning between automated and conventional aircraft. [16 0.7%]
pc130 transitioning between aircraft may increase errors [16 0.7%]

U2.2: The use of automation function and logic may lead to problems for airlines. [149 6.1%]

U2.2.1: Airlines may not provide adequate non-automated operations training. [13 0.5%]
pc63 deficiencies in basic aircraft training may exist [13 0.5%]

U2.2.2: Airlines may not provide adequate automation training. [107 4.4%]
pc113 training requirements may neglect automation [40 1.6%]
pc67 training philosophy may be lacking [7 0.3%]
pc118 automation management training may be lacking [1 <0.1%]
pc133 training may be inadequate [53 2.2%]
pc143 instructor training requirements may be inadequate [2 <0.1%]
pc129 transitioning between aircraft may increase training requirements [4 0.2%]

U2.2.3: Airlines may not keep automation databases up to date. [29 1.2%]
pc110 database may be erroneous or incomplete [29 1.2%]

U3: The use of pilot/automation interfaces may lead to problems. [205 8.4%]

U3.1: Pilot situation awareness may be reduced by automation. [53 2.2%]
pc114 situation awareness may be reduced [53 2.2%]

U3.2: Pilots may have difficulty assessing automation failures. [60 2.5%]
pc25 failure assessment may be difficult [60 2.5%]

U3.3: Crew coordination may be more difficult in automated aircraft. [92 3.8%]
pc84 crew coordination problems may occur [51 2.1%]
pc72 cross checking may be difficult [14 0.6%]
pc104 pilot control authority may be diffused [11 0.5%]
pc139 interpilot communication may be reduced [4 0.2%]
pc75 PF may help PNF program automation [12 0.5%]

Perceived Flightdeck Automation Problems and Concerns

Our study of flightdeck automation yielded a list of 114 perceived problems and concerns, which are presented below in numerical order (our taxonomies -- standard or alternate -- present them in more logical orders). Each entry consists of an identification code (ID) for the P/C, a short descriptive text of the P/C, a complete statement of the P/C, and a representative citation (see the Phase 1 Report).

The citations are largely verbatim. In particular, citations from Aviation Safety Reporting System (ASRS) incident reports contain the authors’ original abbreviations and are presented in all upper-case letters, as they appear in the ASRS database. In the cases of citations from questionnaires, we have identified the sources only generically (e.g., B757 captain, avionics engineer, aviation safety analyst).

(Note: This list is appended to the taxonomy for efficiency reasons. Although placing it in the same file increases the file size, it reduces the number of file transfers necessary to accommodate cross-referencing.)
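
For readers who want to work with the list below programmatically, one possible way to model an entry is sketched here. The class and field names are illustrative assumptions, not part of this website or the Phase 1 Report; each field corresponds to one element of an entry as described above.

    from dataclasses import dataclass

    @dataclass
    class ProblemConcern:
        pc_id: str       # identification code, e.g. "pc2"
        short_text: str  # short descriptive text, e.g. "pilots may be out of the loop"
        statement: str   # complete statement of the perceived problem or concern
        citation: str    # representative citation, largely verbatim
        source: str      # e.g. "NTSB, 1986" or a generic questionnaire source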

pc2 pilots may be out of the loop

Pilots may be out of the control loop and peripheral to the actual operation of the aircraft and therefore not prepared to assume control when necessary.

"...the captain allowed himself to remain removed from the 'control loop' by leaving the autopilot engaged." (NTSB, 1986)

pc3 pilots may become complacent

When automation functions reliably, as it does most of the time, it may induce pilots to be less alert in monitoring its behavior, and less prepared to take immediate action when needed.

"...one disturbing side of automation has appeared i.e. a tendency to breed inactivity or complacency." (Wiener and Curry, 1980)

pc5 monitoring requirements may be excessive

Pilots are required to monitor automation for long periods of time, a task for which they are perceptually and cognitively ill-suited, so monitoring errors may be likely.

"THESE WHIZ BANG COMPUTERS AND FLT MGMNT SYS ARE GREAT, BUT YOU NOT ONLY HAVE TO WATCH THEM LIKE A HAWK, THEY ARE A ERROR WAITING TO SPRING." (ASRS #148853)

pc7 manual skills may not be acquired

Low-time pilots assigned to advanced technology aircraft may not acquire manual flying skills, which are still required.

"'Standard' approaches are rarely made on the Air Inter network (about three per pilot per year), and pilots receive much less training in these procedures[manual] than for ILS approaches. On this aircraft, the requirement to qualify on type is three VOR approaches and one NDB approach per crew. A VOR approach is required for the final check. Finally, the Air Inter route conversion instruction manual ... recommends to instructors that a VOR/NDB approach or an ILS approach without glide is practised each time it is compatible with the airport traffic. Statistics of some twentyfive trainees show that each trainee practises only five or six VOR or NDB approaches before entering airline service." (Ministere de L'Equipement, des Transports et du Tourisme, 1993)

pc9 information integration may be required

Pilots may need to integrate information spread over several parts of the interface, possibly creating additional pilot workload.

"There is quite enough information in current cockpits, and not enough integration despite advances in recent years." (Billings, 1991, p. 49)

pc11 automation integration may be poor

The lack of integration of automation systems may increase pilot workload.

"ROUTE WAS DOWNLINKED THROUGH ACARS INTO FMC BY THE CAPT IN OPERATIONS. THE ACARS PRINTER RECEIVED THE ROUTE ALTHOUGH IT DID NOT PUT DATA IN FMC. I MANUALLY INSERTED THE ROUTE. AFTER PASSING PXT ATC ADVISED US THAT WE WEREN'T ON THE PXT 238 DEG R. WE CHECKED THE FLT PLAN AND WE WERE FILED THAT WAY, HOWEVER, THE ROUTE SENT BY ACARS SENT US DIRECT TO HPW AFTER PXT. CENTER THEN CLRED US DIRECT HPW. THIS SITUATION COULD HAVE BEEN AVOIDED IF I HAD BEEN MORE THOROUGH IN CHECKING THE FMC AGAINST THE FLT PLAN." (ASRS report number 98662)

pc12 pilots have responsibility but may lack authority

Automation design may limit the authority of a pilot to perform a function even though he/she still has responsibility for it.

"Too much control taken away from the pilot. I believe automation should assist, not replace the pilot." (B747400 first officer)

pc13 job satisfaction may be reduced

Automation may reduce challenges that are the source of job satisfaction, which may adversely affect pilot performance.

"One threat posed by advanced automation is that it may make things too simple and may remove from flying the challenges that are the source of much of the egogratification and job satisfaction that the profession now offers to pilots..." (Billings, 1991, p. 65)

pc14 information overload may exist

Large amounts and/or poor formatting of information may increase pilot workload.

"Advances in technology now make it possible to generate and display, in an unlimited variety of formats, much more information than the human operator can assimilate and interpret." (Air Transport Association of America, 1989)

pc15 protections may be lost though pilots continue to rely on them

Reversion to lower levels of automation may disable built-in protections, possibly leading to unsafe conditions if pilots continue to rely on them.

"...and such flight path alterations are often more easily accomplished by reverting to a lower level of automation...this in itself may be a problem, because some of the protection provided by the fully automated configuration may be removed by such reversion." (Billings, 1991, p. 29)

pc22 communication between computers may be unsupervised

There may be potential hazards caused by computer-to-computer communication when human supervision and intervention is difficult or impossible.

"Much more important is the hypothetical (at this time) situation that would arise if it were decided that clearances arrived at by computer-to-computer negotiation need not require pilot or controller consent ..." (Billings, 1994, p. 196)

pc23 failure recovery may be difficult

When automation fails, pilots may have difficulty taking over monitoring, decision making, and control tasks.

"Two occasions going into Gatwick there were frequent reprogramings on the descent. The CDU went blank showing ‘FMS’ indicating the Flight Management Computers had gone out of synch and were in the process of reinterrogating each other. This takes a couple of minutes and requires the pilot navigate horizontally and Vertically by reference to the charts and raw data during this particularly busy time. When the computers come back they must be reprogramed and checked if they are to be used for the remainder of the arrival." (aviation safety analyst)

pc24 failure modes may be unanticipated by designers

Some possible failures may not be anticipated by designers, so no contingency procedures are provided to pilots, possibly increasing troubleshooting workload and the opportunity for error.

"Some of the potential issues arising around the new, automated aircraft include: ... 8. Introduction of unanticipated failure modes." (Air Transport Association of America, 1989, p. 5)

pc25 failure assessment may be difficult

It may be difficult to detect, diagnose, and evaluate the consequences of automation failures (errors and malfunctions), especially when behavior seems 'reasonable', possibly resulting in faulty or prolonged decision making.

"The difficulty lies in the lack of awareness on the part of the crew that an abnormal condition exists, if the onboard computers are compensating without the informing the crew." (Wiener, 1993, p. 34)

pc26 pilots may be reluctant to assume control

Pilots may be reluctant to assume control from automation. When automation malfunctions, this may lead to unsafe conditions.

"... some pilots remain reluctant to interfere with automated process, in spite of some evidence of malfunction." (International Civil Aviation Organization, 1992, p. 15)

pc28 maintenance information may be inaccessible

Maintenance information present in aircraft computers may not be available to pilots, possibly reducing pilot awareness of aircraft status.

"On the 747400 [CSD] synoptic charts are available to the crew while in flight (though some can only be accessed while on theground)." (B737300/500 captain)

pc30 sidesticks may not be coupled

Sidesticks may not be coupled with each other or the autopilot, possibly reducing awareness of the other pilot's or the autopilot's inputs and resulting in reduced situation awareness and/or improper control actions.

"2) The sidesticks aren't connected to each other. I don't know if unusual attitudes are being caused by turbulence or the other pilot." (A320 Captain)

pc32 trend information may be lacking

Automation may give no trend information which would be helpful in predicting future behavior.

" When the light [CSD temperature/pressure] illuminates (in the middle of the North Atlantic) a crew is now confronted with a situation to which they must react and make decisions without any history and very limited information." (B737300/500 captain)

pc37 controls of automation may be poorly designed

Automation controls may be designed so they are difficult to access and activate quickly and accurately, or easy to activate inadvertently.

"...The vertical speed and altitude selection knobs of the flight control unit (FCU) are close to each other, and instead of operating the vertical speed knob , the pilot CM.2 had inadvertently operated the altitude selection knob...the Court has specifically suggested a design change with respect to the two knobs..." (Ministry of Civil Aviation; Government of India, 1990, p. iv)

pc38 scan pattern may change

Display layout in automated flightdecks may change the traditional instrument scan pattern, possibly leading to loss of skills which may be needed upon transitioning to conventional aircraft.

"The only problem I can think of is dependence on it. A pilot's scan tends to slow down and narrow." (B737 captain)

pc39 interface may be poorly designed

The pilot-automation interface may be poorly designed with respect to human factors considerations, possibly resulting in poor pilot performance or pilot dissatisfaction.

"BECAUSE OF ACFT DESIGN I AM NOT ABLE TO SEE THE HORIZONTAL SITUATION INDICATOR WHEN PROPERLY SEATED AND ALIGNED AS IT IS POSITIONED BEHIND THE CONTROL COLUMN." (ASRS report number 60408)

pc40 automation may be too complex

Automation may be too complex to understand and use effectively.

"... as system complexity increases, and depending on the feedback mechanism available, predicting the system's behavior in context may be much more difficult." (Sarter and Woods, 1994, p. 23)

pc41 automation interaction may be misunderstood

Pilots may not effectively use automation because they do not understand interaction of automation devices or interaction of automation modes.

"It [the A320 accident at Bangalore] would seem to be a classic case of an interactively complex system, which the crew, despite their training, partly misunderstood, together with a counterintuitive logic of system interconnection and inadequate interface." (Mellor, 1994, p. 41)

pc43 disengagement may be impossible

Pilots may not be able to disengage automation, possibly resulting in limits to pilot authority.

"...the crew was unable to override the [braking system] lockout and to operate ground spoilers and engine thrust reversers." (Main Commision Aircraft Accident Investigation, 1994, p. 40)

pc44 mode transitions may be uncommanded

Automation may change modes without pilot commands to do so, possibly producing surprising behavior.

"As identified in recent research, unanticipated mode changes are a concern, particularly when transitioning from climbing/descending to level flight." (B757 captain)

pc45 modes may proliferate

The proliferation of automation modes may increase pilot decision making requirements and result in infrequently used and poorly understood modes.

"Modes proliferate as designers provide multiple levels of automation and various optional methods for many individual functions. The result is numerous mode indications distributed over multiple displays each containing just that portion of the mode status data corresponding to a particular system or subsystem." (Sarter and Woods, in press, p. 4)

pc46 pilots may lack confidence in automation

Pilots may lack confidence in automation due to their experience (or lack thereof) with it. This may result in a failure to use automation when it should be used.

"I AM BEGINNING TO DISTRUST THE ALT ARM MODE OF THE AUTOPLT TO THE POINT WHERE I'D RATHER FLY MOST APCHES MANUALLY!" (ASRS report number 62983)

pc47 data access may be difficult

It may be difficult to access data "hidden" in the architecture of the automation system, possibly increasing pilot workload.

"The opaqueness of the system also appears in the page architecture of the CDU which makes it difficult to find all relevant data." (Sarter, 1991, p. 1309)

pc49 data reentry may be required

Data entries may not propagate to related functions in automation devices. The same data may have to be entered more than once.

"One example I can think of is the need to reenter winds for a return to the airport immediately after takeoff. With data link, there will be a lot of examples." (Riley, 1995)

pc51 feedback may be poor

Automation may present little or no feedback, or the feedback may be inappropriate or misleading. As a result, pilots may be unaware of the mode (state) and behavior of the automation.

"...the culprit is not actually automation, but rather lack of feedback." (Norman, 1990, p. 141)

pc53 vertical profile visualization may be difficult

It may be difficult for pilots to visualize vertical profiles based on alphanumeric displays. This difficulty may increase pilot workload when flying, or planning to fly, these profiles.

"However, the commission is of the opinion that the presentation of information concerning guidance and piloting in the vertical plane, whilst meeting the needs of a crew fully aware of the flight path it is following, is not capable of providing adequate and effective warning to a crew with an erroneous perception in this regard, and especially when some type of conventional sources of sensorial feedback are absent on this type of aircraft." (Ministere de l'Equipement, des Transports et du Tourisme, 1993, p. 239)

pc55 manual operation may be difficult after transition from automated control

In some situations, flight control may be difficult after the transition from automated to manual flight.

"...The captain lost control of the airplane when, after disengaging the autopilot, he failed to make the proper flight control corrections to recover the airplane." (NTSB, 1986, p. 34)

pc58 envelope protections may limit pilot authority

Envelope protections may prevent necessary correction maneuvers in critical situations, such as when recovering from unusual attitudes.

"On board computers that override a pilot's input ... An unusual or abrupt control manuever may avoid an accident and potential loss of life." (B737300 captain)

pc63 deficiencies in basic aircraft training may exist

Training for automated aircraft may not adequately prepare pilots with basic (i.e., non-automation) system knowledge in that aircraft, and pilots may lack the knowledge and skill necessary to operate the aircraft manually.

"THIS IS A PRODUCT OF CURRENT TRNING DOCTRINE, WHICH IS HEAVILY WEIGHTED WITH EMPHASIS ON AUTOFLT OF THE ACFT. MORE AND MORE OF THE BASIC STICK AND RUDDER FLYING IS BEING DELEGATED TO AUTOFLT SYSTEMS, WHICH IN TURN IS SETTING THE STAGE FOR MORE OF THIS TYPE OF INCIDENT." (ASRS report number 135982)

pc65 manual skills may be lost

Pilots may lose psychomotor and cognitive skills required for flying manually, or for flying nonautomated aircraft, due to use of automation.

"Some crews tend to rely too much on automation once familiar & confident in the automation they may not back up the computers with manual crosschecking they lose proficiency in basic flying skills." (B737 captain)

pc67 training philosophy may be lacking

There may be no effective philosophy for training pilots for automated aircraft, possibly resulting in inappropriate and inadequate training.

"Training specialists have no place to turn for guidance on the question of training for automation........HF profession......has not been forthcoming much in the way of guidelines and assistance." (Wiener, 1989, p. 59)

pc70 false alarms may be frequent

Frequent false alarms may cause pilots to mistrust or ignore automation and therefore not use it or respond to it when they should.

"Poorly engineered systems on the A320 cause confusion and false alarms, false failures and mislead crews as to the actual condition of their aircraft. When a warning first appears on the A320, high time crews are conditioned to believe it's another false alarm." (B737 captain with A320 flight experience)

pc71 data entry errors on keyboards may occur

Keyboard alphanumeric data entry may be prone to errors, which may adversely affect safety.

"I INADVERTENTLY ENTERED MGM INTO THE FMC, RATHER THAN MGW." (ASRS report number 120705)

pc72 cross checking may be difficult

It may be difficult for one pilot to monitor what another is doing with automation, possibly reducing awareness of pilot intentions and cross checking for errors.

"ALSO, MODELS HAVE SEVERAL SETS OF COCKPIT CONFIGN WITH MCP SWITCHES BEING IN DIFFERENT PLACES SOME IN SUCH A LOCATION THAT PF MAY NOT SEE PNF MOVE THEM AND THEY HAVE NO INDICATION ON THE MODE ANNUNCIATORS OR HSI THAT INDICATES SWITCH HAS BEEN MOVED. RPTR FEELS THAT MODEL WITH 1 FMC AND ELECTRIC MECHANICAL INSTRUMENTS IS AN ACCIDENT WAITING TO HAPPEN BECAUSE OF LACK OF INDICATIONS ON INSTRUMENT PANEL WHAT FMC IS DOING." (ASRS report number 98109)

pc75 PF may help PNF program automation

The PF may help the PNF when programming problems occur, possibly diverting attention from safety-critical tasks.

"INSTEAD OF DISENGAGING THE PMS AND MAKING MANUAL INPUTS TO THE AUTOPLT, I ATTEMPTED TO ASSIST THE F/O IN REPROGRAMMING THE PMS TO A CORRECT DESCENT PROFILE. AT ANY RATE, THE BOTTOM LINE IS THAT DURING THE TIME WHILE MY ATTN WAS FOCUSED ON HELPING MY F/O WITH A SYSTEM HE WAS UNFAMILIAR WITH (AND USING FOR THE FIRST TIME), I NEGLECTED TO REALIZE THAT WE WERE TOO HIGH AT THE FILLMORE VOR." (ASRS report number 63592)

pc79 automation may adversely affect pilot workload

Automation may increase pilot workload at high workload times and reduce pilot workload at low workload times, possibly resulting in excess workload and boredom.

"...workload seemed to be reduced when it was not heavy or critical, may be increased by automation when it was already heavy or critical." (Wiener, 1989, p. 20)

pc82 flightdeck automation may be incompatible with ATC system

There may be incompatibility between advanced automation aircraft and existing ATC system capabilities, possibly increasing pilot workload and leading to unsafe conditions. It may also reduce the pilot's ability to use automation for the best results.

"...problems, not with the cockpit equipment per se, but with the ability of ATC to allow the crew to exploit it." (Wiener, 1989, p. 144)

pc83 behavior of automation may not be apparent

The behavior of automation devices may not be adequately apparent for pilot monitoring, possibly resulting in reduced pilot awareness of automation behavior.

"...since he was not 'hands on,' he was not aware of the deflection [being applied by the autopilot]." (NTSB, 1986, p. 30)

pc84 crew coordination problems may occur

The use of automation may adversely affect crew coordination, possibly leading to unsafe conditions.

"Crew coordination on flight decks which are automated is a problem because one pilot must fly while the other programs the automated systems." (B747 captain)

pc87 data may be too abstract

Data presented in integrated/abstracted/simplified forms may not fully support effective pilot decision making.

"As with flight directors, it is not difficult to lose sight of the raw navigation data. Map displays do not make it particularly easy to evaluate the raw data from which position is derived." (Billings, 1991, p. 37)]

pc89 new tasks and errors may exist

Automation may change and/or add pilot tasks, possibly making new (often more serious) errors possible.

"Technology change creates potential for new kinds of error and system breakdown as well as changing the potential for previous kinds of trouble." (Woods et al, 1994, p. 113)

pc90 pilots may be uncritical of automation actions

Pilots may accept or approve automation recommendations and actions without critical review, possibly resulting in errors.

"There is a tendency for crews to rely on computer generated information and not cross check using raw data information." (B747 first officer)

pc92 displays may be poorly designed

Displays may not be designed for adequate visibility, legibility, and readability, possibly resulting in misinterpretation of display information.

"FMAs (Flight Mode Annunciations) are cryptic and not well presented." (B737300/500 captain)

pc94 data entry format may be inflexible

There may be only one form in which data can be entered for system acceptance; other entries may be treated as errors. This may increase pilot workload.

"Logic errors usually involved the flight crew entering data in a format or form that the FMC would not recognize" (Eldredge et al, 1992, p. 41)

pc95 mode awareness may be lacking

Pilots may not be able to tell what mode or state the automation is in, how it is configured, what it is doing, and how it will behave. This may lead to reduced situation awareness and errors.

"Hypothesis No. 1:...the abnormally high rate of descent was the result of an unintentional command on the part of the crew because they believed the vertical mode selected on the autopilot to be other than that which was actually selected." (Ministere de l'Equipement, des Transports et du Tourisme, 1993, p. 204)

pc96 data input prompts may be poor

Prompts for data input may not indicate the correct format, possibly increasing pilot workload and making errors likely.

"...screen prompts are not always clear, when they are available." (Billings, 1991, p. 54)

pc99 insufficient information may be displayed

Not enough information may be provided for pilots to adequately perform their tasks.

"Not enough information is presented to the crew for them to make timely, informed decisions or corrective actions." (B737300/500 captain)

pc100 humancentered design philosophy may be lacking

Automation design may not be guided by a philosophy that gives adequate attention to the proper role and function of the human and to human capabilities and limitations. This may compromise system effectiveness and safety.

"Other costs have accrued because today's automated systems are not optimally designed to work cooperatively with their human operators." (Billings, 1994, p. 62)

pc101 automation use philosophy may be lacking

There may be no comprehensive, coherent philosophy provided to pilots for the use of automation, possibly resulting in inconsistencies and uncertainties in its use.

"In their operation of twoperson crew aircraft, many operators have not reflected [through revised procedures] the advances that have been made in flight deck technology and in the understanding of flight crew behaviour." (International Civil Aviation Organization, 1992, p. 24)

pc102 automation may demand attention

The attentional demands of pilot-automation interaction (e.g., "head-down time", distractions) may significantly interfere with performance of safety-critical tasks.

"The FMS workload was high ... crew could have lost awareness..." (Aircraft Accident Investigation Committee, 1992, p. 117)

pc103 automation level decisions may be difficult

It may be difficult to decide what levels of automation are appropriate in specific circumstances, possibly increasing pilot workload.

"The real world events change the plan. Then the decision has to be made whether to disconnect the automatic systems and go to raw data or to reprogram. " (former B757/767 captain)

pc104 pilot control authority may be diffused

The traditional distribution of workload between pilots (e.g., between PF and PNF, between C and F/O) may be modified under automated flight, possibly allowing safetycritical tasks to be neglected.

"The modern cockpit seems to produce a redistribution of authority from the captain to the first officer." (Wiener, 1989, p. 178)

pc105 understanding of automation may be inadequate

Pilots' understanding of automation may be inadequate for the performance of their duties.

"The flightcrew was not thoroughly knowledgeable of the aircraft's flight guidance and control system." (NTSB, 1980, p. 23)

pc106 pilots may overrely on automation

Pilots may use automation in situations where it should not be used.

"The National Transportation Safety Board determines that the probable cause of this accident was the flightcrew's ... overreliance on the autothrottle speed control system which had a history of recent malfunctions." (NTSB, 1984, p. 47)

pc107 workarounds may be necessary

Pilots may have to use automation in a manner not intended by designers to get desired results or to avoid undesirable consequences, possibly increasing pilot workload and opportunity for error. This may have unanticipated and undesirable side effects.

"The situation described above for the 757 results in missed crossing restrictions on virtually every descent ! Error can range from 50 to 200 feet and 10 to 30 knots. Many pilots compensate by building a 2 or more mile ‘pad’ into the LNAV course, i.e. creating a waypoint ahead of the crossing restriction to reach the altitude early." (B757 captain)

pc108 automation behavior may be unexpected and unexplained

Automation may perform in ways that are unexpected and unexplainable by pilots, possibly creating confusion, increasing pilot workload to compensate, and sometimes leading to unsafe conditions.

"FOR SOME REASON THE ALT HOLD DID NOT CAPTURE AS IT SHOULD HAVE AND WE LOST 350' OF ALT BEFORE IT WAS CORRECTED." (ASRS report number 50829)

pc109 automation may lack reasonable functionality

Automation design may prevent the device from performing a function that seems reasonable to the pilot, possibly requiring the use of alternative strategies which may increase workload and the opportunity for error.

"THE FMS WAS ANOTHER PROB, ONCE THE ILS TO RWY 11 WAS PUT INTO THE COMPUTER, THE COMPUTER WOULD NOT ACCEPT A PREAPCH INBND HOLD AT OR HAM BECAUSE IT WAS ALREADY IN THE COMPUTER AS THE MISSED APCH HOLDING FIX. WE HAD NEVER ENCOUNTERED THIS PROB BEFORE." (ASRS report number 160101)

pc110 database may be erroneous or incomplete

When the automation system database is incomplete or contains erroneous data, it may increase pilot workload and the opportunity for navigation or other errors.

"FAILED TO NOTICE DURING ARWY

PROGRAMMING THAT FMC DATA BASE WAS NOT CURRENT DUE TO MASSIVE E COAST ATC PROGRAM CHANGE EFFECTIVE 3/88." (ASRS report number 84480)

pc112 programming may be difficult

Procedures for programming automation may be overly difficult and complex. This may cause errors and delays that may lead to unsafe conditions.

"... The CVR shows that the crew spent considerable time and effort in inputting the ROMEO fix and the Simara NDB into the Flight Management Computer." (Aircraft Accident Investigation Committee, 1992, p. 118)

pc113 training requirements may neglect automation

Training and checking requirements may not take new flightdeck automation capabilities into account, and, because some pilots do not get the necessary training, they may lack necessary knowledge and skills.

"My major potential concern addresses the skills & abilities used to select & train pilots, & their relevance to the actual skills & abilities employed on highly automated aircraft. I believe that selection criteria & training curricula were developed for an earlier era & may not be pertinent to contemporary aircraft. ... New technology aircraft require skills that may be very different than those that underlie current pilot training programs." (aviation safety analyst)

pc114 situation awareness may be reduced

Reliance on automation may reduce pilots' awareness of the present and projected state of the aircraft and its environment, possibly resulting in incorrect decisions and actions.

"...since the autopilot effectively masked the approaching onset of the loss of control of the airplane." (NTSB, 1986, p. 32)

pc115 testing may be inadequate

Automation may not be thoroughly tested before use and therefore not perform correctly under certain conditions.

"I think the designers went too far without adequate pilot input (flexibility of function) and didn't do enough testing (hence all the flaws, which are slowly being weeded out)." (A320 captain)

pc116 automation may be overemphasized in pilot evaluation

In pilot evaluation there may be an overemphasis on automation skills to the extent that manual and non-automation-related cognitive skills are minimized. Pilots may therefore lack non-automated operations skills.

"Some pilots have complained that airlines may be substituting FMS and other automation training for some of the manual flight maneuvers that may be required to respond to emergencies." (Riley, 1995)

pc117 function allocation may be difficult

Automation designers may have difficulty in making good decisions about allocating functions to humans or to automation, possibly leading to poor function allocation decisions.

"There's a pretty extensive literature devoted to function allocation, and designers do have problems with it. The biggest problem is simply not doing it; just automating everything possible." (Riley, 1995)

pc118 automation management training may be lacking

Flightcrews may be trained in automation operation procedures but not in automation management, which can be critical for safe operations.

"Training of flightcrews in the MANAGEMENT of automated systems. Current training seems to only address the linear, procedural aspects of the use of automation. Operational experience indicates that this equipment is used in a highly nonlinear manner." (pilot and flight training professional)

pc119 information processing load may be increased

Information about the mode (state) and behavior of the automation itself may add to the pilot's information processing load, possibly resulting in increased workload and opportunities for error.

"It should be recognized that automation, which enables more information to be presented, also carries with it costs in terms of the amount of information required to monitor the automated functions..." (Billings, 1991, p. 33)

pc120 automation operation may be based on few variables

Automation recommendations or actions may be based on only a few key variables. An automation device lacks full situation awareness and may advise or act inappropriately, especially under unusual conditions.

"A Fokker F.100 provided a demonstration of such modality problems in November 1991. While attempting to land at Chicago O'Hare, the crew was unable to apply braking. Both the air and ground switches on its landing gear were stuck in the air position. Since the computers controlling the braking system thought the plane was in the air, not only was the crew unable to use reverse thrust, but, more controversially, they were also unable to use nosewheel steering or main landing gear brakingservices that would have been available on most other airliners in a similar situation." (Neumann, 1995, p. 43)

pc121 operational knowledge may be lacking in design process

Automation design may not take into consideration the operational knowledge of pilots. This may lead to designs that are counterintuitive to pilots, possibly increasing pilot workload and the opportunity for error.

"Engineers analyzing the requirements, reviewing the design, designing tests, and reviewing test results must be familiar with theories and procedures for flying aircraft. Something seemingly subtle and insignificant to a software engineer can cause significant procedural errors or make a pilot crazy." (avionics engineer)

pc122 automation may use different control strategies than pilots

Automatic flight control systems may use a different strategy of control than the pilot, possibly leading to the pilot's loss of situation awareness and pilot errors.

"... the logic which requires Airbus Industrie aircraft stabalisers to move to counteract pilotelevator input when the autopilot is engaged with GA or automaticlanding modes selected seems 'unnatural' and, therefore, potentially confusing." (Learmount, 1995, p. 34)

pc123 inadvertent autopilot disengagement may be too easy

It may be too easy for the pilot to disengage the autopilot. When this happens, control may be lost.

"I'M CONVINCED IN EFFORT TO COMPLY WITH SPEED CHANGE F/O MOVED IAS CONTROL WHEEL WHILE IN ALT CAPTURE PHASE, CAUSING ARMED ALT TO 'DROP OFF'. COMMON GLITCH WITH THIS AUTOPLT SYSTEM. WE'RE ALL AWARE OF IT, BUT IT CONTINUES TO CATCH US OCCASIONALLY." (ASRS report number 68232)

pc124 automation may be too complex and tightly coupled

Computer-based automation may make the aircraft more complex and tightly coupled (i.e., there are many components, they are strongly linked, and failures may propagate more readily). This may increase the likelihood of accidents.

"... the use of computers leads to interactively complex, tightly coupled systems, often where these features were not originally present and not necessary for the operation of the systems. They therefore contribute to 'system accidents'." (Mellor, 1994, p. 9)

pc125 design specifications may be inadequate

Although automation may do what it is designed to do, design specifications may not take into account certain unlikely but very possible conditions, possibly leading to unsafe automation behavior.

"..aircraft safety systems which worked according to their specifications ended up contributing to a fatal accident.." (Wiener, 1985, p. 42)

pc126 automation performance may be limited

The ability of the automation to perform correctly and quickly may be limited by design constraints, possibly increasing pilot workload and the opportunity for error.

"The FMS can be programmed to accept a series of waypoints and altitude restrictions which the FMS can not keep up with in terms of recalculation time, standard rate turns, flying by rather than flying over waypoints and undesired sequencing due to proximity to waypoints as the flight path passes in the outbound legs." (Wise et al, 1993, p. 231)

pc127 commercial incentives may dominate

The fundamental reason for the increasing use of automation may be to decrease the cost of flight operations. If these considerations receive excessive emphasis in the aircraft design process, safety may be compromised.

"The fundamental reasons, for the increasing use of programmable systems are commercial ... The problems arise when commercial pressure becomes such that the dependability of the delivered system is compromised." (Mellor, 1994, p. 62)

pc128 complex automation may have overly simplistic interface

The necessary simplicity of the pilot-automation interface may hide important complexities, possibly leading to unexpected behaviors and difficulty performing complex operations.

"Such systems appear on the surface to be simple because they lack large numbers of physical display devices and controls; however underneath the placid surface of the CRT workstation there may be a variety of characteristics which produce cognitive burdens and operational complexities." (Woods, 1994, p. 3)

pc129 transitioning between aircraft may increase training requirements

Transitioning back and forth between advanced technology aircraft and conventional aircraft may increase pilot training requirements.

"[Deficiencies in] transition training between types manual to automatics e.g. B737200 to B767, B747200 to B747400; automatics to automatics B747400 to B767." (B737 first officer)

pc130 transitioning between aircraft may increase errors

Transitioning back and forth between advanced technology aircraft and conventional aircraft may cause problems such as erosion of aircraft-specific skills, possibly leading to pilot errors.

"CONTRIBUTING FACTORS WERE THE CONFIGN OF SHIP XXX. IT HAS NO FMS LIKE OUR OTHER WDB ACFT, SO IT HAS NO CRUISE AUTOTHROTTLE CAPABILITY. THIS WAS THE FIRST TIME THAT I FLEW THIS SHIP AND THE FIRST IN A TRIPLE INS CONFIGN. ALL PAST EXPERIENCE HAS BEEN INS/FMS CONFIGN. I LET MYSELF BECOME DISTRACTED AND UPON SEEING LOW AIRSPD ADVANCED THROTTLES RAPIDLY." (ASRS report number 231227)

pc131 pilots may be overconfident in automation

Pilots may become overconfident in the capabilities of automation and fail to exercise appropriate diligence, possibly leading to unsafe conditions.

"The A320 has some new features which may have inspired some overconfidence in the mind of the Captain." (Investigation Commission, 1989, p. 56)

pc132 older pilots may be less accepting of automation

Older pilots may have trouble accepting and learning to use automation, possibly making them more prone to misusing it.

"Training ‘old’ pilots to use modern technology [is a problem]." (A320 first officer)

pc133 training may be inadequate

Training objectives, methods, materials, or equipment may be inadequate to properly train pilots for safe and effective automated aircraft operation.

"Lack of adequate software training ..." (B757 first officer)

pc134 software versions may proliferate

Different software versions running on the FMSs of different airplanes may create difficulty for pilots to use them safely and effectively.

"A large number of reports is related to the fact that different versions are simultaneously running on the FMS which sometimes makes it difficult for pilots to communicate their intentions to the system as they are not sure about the required data entry format." (Sarter, 1991, p. 1308)

pc136 pilot selection may be more difficult

The presence of automation may make pilot selection more difficult, possibly resulting in the selection of pilots not suited or not adequately prepared for their jobs.

"Some contend that automation will lead to less concern for crew selection. In reality, more attention will have to be devoted to selection procedures because of automation in advanced flight decks." (International Civil Aviation Organization, 1992, p. 17)

pc137 automation skills may be lost

Prolonged absence from advanced technology aircraft may result in a loss of automation use skills, possibly resulting in poor pilot performance when pilots return to advanced technology aircraft.

"It's harder to maintain proficiency with continued reliance on automation. [But] If you turn it off and fly then you lose your automation proficiency." (B747400 captain)

pc138 standardization may be lacking

There may be a lack of function and interface standardization between automation systems, possibly leading to increased training requirements, increased pilot workload, and poor pilot performance.

"I BELIEVE THE PRIMARY CONTRIBUTING FACTOR WAS THE DIFFERENCE IN AUTOFLT SYSTEMS IN THE SAME ACFT." (ASRS report number 138909)

pc139 interpilot communication may be reduced

The presence of automation may reduce interpilot communication, possibly resulting in less sharing of information.

"...there is a trend toward lower interpilot communication as the degree of cockpit automation increases..." (Wiener, 1989b, p. 3)

pc140 printed media may be inadequate

Some companies may not provide high quality printed media related to automation, such as manuals and checklists, possibly leading to poor pilot performance.

"Poorly organized, difficult to read manuals (FCOM 1,2 &3) ..." (A320 captain)

pc141 pilots may not be involved in equipment selection

Pilots may have little or no input when their companies select automation. This may result in failure to adequately consider pilot needs in the selection process.

"... the tragedy of ... Strasbourg ... could have been avoided if the aircraft had been equipped with a ground proximity warning system (GPWS)...This instrument ... had experienced ... problems ... caused Air Inter's technical department to delay their decision...and a number of pilots requested that the instrument be fitted to their aircraft." (Stefanovich and Thouanel, 1993)

pc142 crew assignment may be inappropriate

When two pilots with little automation experience are assigned to an advanced technology aircraft, errors related to automation use may be more likely.

"... the Commision noted:... the limited amount of experience acquired by both pilots on this type of aircraft, and the absence of regulations or national/international recommendations on this subject;..." (Ministere de l'Equipement, des Transports et du Tourisme, 1993, p. 316)

pc143 instructor training requirements may be inadequate

Automation training requirements for instructor/check pilots may not be well defined, possibly leading to inadequately qualified instructor/check pilots.

"Lack of adequate software training (understanding by instructors also)." (B757 first officer)

pc144 pilot's role may be changed

Automation may change the role of the pilot from that of a controller to that of a supervisor. Because most pilots are not adequately trained for and experienced in this role, errors may result.

"In these systems, the pilots' role has changed from active manipulator of the aircraft to supervisor of the automated systems." (Sarter and Woods, 1992, p. 17)

pc145 mode selection may be incorrect

Pilots may inadvertently select the wrong automation mode for unknown reasons, possibly causing the automation to behave in ways different than intended or expected.

"The autopilot system was in a vertical speed mode rather than an airspeed or mach command mode during the climb contrary to AEROMEXICO's procedures and contrary to the manufacturer's prescribed normal operating procedures and recommendations." (NTSB, 1980, p. 22)

pc146 pilots may underrely on automation

Pilots may not use automation when they should, possibly leading to unsafe conditions or reduced operating efficiency.

"The Safety Board concludes that if the autopilot had not been disengaged until the minimum authorized altitude ... the aircraft would have reached the runway safely." (NTSB, 1973, p. 6)

pc147 pilots may misunderstand automation intent

Pilots may not know the automation's intent, possibly allowing conflicting goals to go undetected and unsafe conditions to ensue.

"It is not unusual to see a pilot 'fighting' the automatics, especially autothrottles: he pulls them back, they advance, he pulls them back again and they advance once more. Obviously there is a goal conflict." (Curry, 1985, p. 34)

pc148 traffic coordination requirements may increase

Automation has freed the crew of newer aircraft from dependence on surface systems of navigation aids, but this freedom from defined route constraints may increase air traffic coordination requirements and complicate conflict prediction.

"Automation, in the form of inertial reference and flight management systems, has freed the crew of newer aircraft from dependence on surface systems of navigation aids, but this freedom from defined route constraints has increased air traffic coordination requirements and has complicated conflict prediction both by pilots and in control facilities." (Billings, 1994, p. Intro13)

pc149 similarity may be superficial

Although the pilot interface may be superficially similar across aircraft types, there may be significant deep differences that may confuse pilots and lead to unsafe conditions.

"While Airbus cockpits do have a very high degree of commonality, the computer architecture and FMS functionality across types are not really identical (though the differences are normally transparent to pilots). I am concerned, however, about the behavior of different software in these types at the margins, and the potential for surprises under difficult circumstances." (Billings, 1994, p. 258)

pc150 automation performance may be reduced at margins of envelope

Automation may work well under nominal conditions but not have the desired behavior close to the margins of its operating envelope, possibly leading to unsafe conditions.

" Improper use of automated settings is a real safety problem, because the system performs as requested, without alarm, until the aircraft is out of the envelope of manual recognition & recovery." (B747 captain)

pc151 procedures may assume automation

Some procedures may be designed under the assumption that automation will be used. If it is not, either by necessity or pilot choice, workload may be excessive and errors more likely.

"Work load & procedures are designed assuming automation will be used. When it is not, (pilots like to hand fly), things get too busy and bad things happen." (B757 captain)

pc152 state prediction may be lacking

Automation displays may show only current state and no estimate of future state. This may prevent pilots from anticipating and preparing for problems.

"We are long overdue in developing systems that can forecast trouble rather than merely waiting for it to occur." (Wiener, 1993, p. 74)

pc153 nonautomated pilot tasks may not be integrated

Automation designers may leave pilots to do the tasks that cannot be automated. The pilots may be left with a set of poorly integrated tasks that are difficult to perform well.

"...as it means that the operator can be left with an arbitrary collection of tasks and little thought may have been given to providing support for them." (Bainbridge, 1987, p. 272)

pc156 fatigue may be induced

Automation may induce fatigue, possibly leading to poor pilot performance.

"Contrary to Airbus Industrie, which claims that the workload of an A320 crew is significantly decreased, Mr. Sons emphasizes that it is far more tiring and stressful to pilot this ultramodern aircraft." (Stefanovich and Thouanel, 1993, p. 9)

pc157 automation may conflict with ATC

Onboard warning and advisory systems may cause conflicts with ATC clearances, possibly leading to loss of separation or other unsafe conditions.

"The primary concern for me (and probably most air traffic controllers) is the TCAS (Traffic Alert and Collision Avoidance System). The electronics have been refined to some extent and the system improved, however, in many instances, the TCAS would tell the pilot to execute a climb or descent contrary to air traffic instructions which has caused a ‘near miss.’" (air traffic controller)

pc158 planning requirements may be increased

Flying an automated aircraft may take more planning than flying a manual aircraft. Pilots may not plan far enough ahead to use automated systems, so safety may be compromised.

"To fly an automated aircraft takes more planning. Few pilots plan far enough ahead to use automated systems. " (B747400 captain)

pc159 use may be required by company

Companies may require pilots to use automation. In some cases, such use may be unsafe.

"It has been suggested that airline management sometimes encourage crews to use automation in situations where crews may feel more comfortable hand flying the aircraft." (Bureau of Air Safety Investigation, 1994, p. 15)

pc160 automation requirements may conflict

Satisfaction of one automation functional or certification requirement may lead to the violation of another.

"...Contradictory autopilot requirements appear as a key factor that contributed to the loss of control...the autopilot also had to simultaneously manage the combination of very low speed, an extremely high angle of attack, and asymmetrical engine thrust..." (Sparaco, 1994)

pc161 automation use may slow pilot responses

When using automation, pilot response to unanticipated events and clearances may be slower than it would be under manual control, possibly increasing the likelihood of unsafe conditions.

"Airspeed/ATC: When A/T engaged, may be delay in responding to speed clearances." (Funk et al, 1995, p. 7)

pc162 auditory displays may be poorly designed

Auditory displays may be poorly designed, possibly leading to increased pilot workload and increased likelihood that safety-critical information may not be correctly perceived.

"A number of audible information inputs we get while operating the A320, while designed with the intent of providing us useful info, have two problems: 1) we cannot ‘cancel’ the aural input after we are aware of it (much like pushing a master caution lite 'out' after understanding the cause) and 2) the volume level is not adjustable and it is too loud. I believe that the whole point of an audio callout is to give me information, but that once I receive this info, I should be able to cancel such input, and I should be able to control/modify/stop the transmission of this info at will." (A320 captain)

pc163 pilots may abdicate responsibility to automation

Pilots may abdicate responsibility to automation, assuming that it will perform correctly and therefore not monitoring it adequately. In some cases, the assumption may be invalid and safety may be compromised.

"I'm also concerned that highly automated aircraft foster an abdication of

responsibility to the automation, and blur the distinction between decision aid and decision maker." (human factors scientist)

pc164 automation may increase workload

Automation may increase pilot workload, possibly leading to unsafe conditions.

"ON A FLT SUCH AS THIS, THE ONLY THING COMPUTERS DO FOR PLTS IS DOUBLE THE WORKLOAD." (ASRS report number 132299)

pc165 cultural differences may not be considered

Cultural differences may not be adequately considered in automation design, training, certification, and operations. If they are not considered, they may nonetheless affect performance and how automation is used.

"My concern [with flightdeck automation] is that cultural differences are not adequately considered in design, training, certification, and operations, and because they are not considered, they have resulting effects on performance and how automation is used." (K.H. Abbott, 1995)

