
Evidence from Resource

5 pieces of evidence from this resource.

Hourizi, R. & Johnson, P. (2001). Beyond Mode Error: Supporting Strategic Knowledge Structures to Enhance Cockpit Safety. In A. Blandford, J. Vanderdonckt & P. Gray (Eds.), People and Computers XV - Interaction without Frontiers. Joint Proceedings of HCI 2001 and IHM 2001, Lille, 10-14 September 2001. Springer-Verlag, 229-246.

  1. Evidence Type: Excerpt from Accident Review Study
    Evidence: Further examples of both selectivity and confirmation bias can be found in the 1988 Air France crash at Mulhouse-Habsheim (Degani et al 1996), where the pilots continued to believe that they could avert disaster by fighting with the plane’s joystick, despite the fact that their actions were not affecting the flight path as expected and in the China Airlines crash during a descent into Nagoya, where the pilots continued to believe that they could safely land the plane using the joystick, whilst a mistaken engagement of full forward thrust made it practically impossible to do so. Both incidents support the view that knowledge gaps lie behind many examples of automation surprise. (page 8)
    Issue: manual operation may be difficult after transition from automated control (Issue #55)
    Strength: +1
    Aircraft: unspecified
    Equipment: automation

  2. Evidence Type: Excerpt from Accident Review Study
    Evidence: Further examples of both selectivity and confirmation bias can be found in the 1988 Air France crash at Mulhouse-Habsheim (Degani et al 1996), where the pilots continued to believe that they could avert disaster by fighting with the plane’s joystick, despite the fact that their actions were not affecting the flight path as expected and in the China Airlines crash during a descent into Nagoya, where the pilots continued to believe that they could safely land the plane using the joystick, whilst a mistaken engagement of full forward thrust made it practically impossible to do so. Both incidents support the view that knowledge gaps lie behind many examples of automation surprise. (page 8)
    Issue: failure assessment may be difficult (Issue #25)
    Strength: +1
    Aircraft: unspecified
    Equipment: automation

  3. Evidence Type: Excerpt from Accident Review Study
    Evidence: In order to find the common element which links the two failures, described in the previous section, we re-examined the failures themselves. We found that the combination of “confirmation bias” i.e. a tendency to confirm an existing world view in the face of contradictory evidence (e.g. the 'fixation' error, also described above) and “selectivity” i.e. a focus on only those factors which support the current world view (e.g. the Air-Inter pilots’ concentration on the VS/FPA entry dial to the exclusion of the underlying entry mode) were common occurrences in failures caused by a lack of operator knowledge (Reason 1990). Neither was common in examples of skill-level failure i.e. the failure to execute a well-rehearsed procedure (as would be implied by an explanatory account based on mode error alone). In this context, mode errors become a symptom of the underlying problem (the operators’ lack of knowledge of the current system state), rather than the cause. (page 8)
    Issue: mode awareness may be lacking (Issue #95)
    Strength: +1
    Aircraft: A320
    Equipment: automation

  4. Evidence Type: Excerpt from Accident Review Study
    Evidence: If we combine this understanding of the FCU with the events of flight F-GGED, we can identify two pivotal human 'failures' in the causal chain leading to the accident. Firstly, the pilot entered a seemingly correct parameter value on the correct entry dial, whilst the panel was in an unappreciated mode. Subsequently, both the pilot and co-pilot failed to notice the (unintended) rapid descent of the aircraft until shortly before impact. In other words, they were surprised by the performance of the system (plane) – an example of the phenomenon described in section 1. These errors are summarised in the schematic outline of events in Figure 2. (page 4)
    Issue: automation behavior may be unexpected and unexplained (Issue #108)
    Strength: +1
    Aircraft: A320
    Equipment:

  5. Evidence Type: Excerpt from Accident Review Study
    Evidence: If we combine this understanding of the FCU with the events of flight F-GGED, we can identify two pivotal human 'failures' in the causal chain leading to the accident. Firstly, the pilot entered a seemingly correct parameter value on the correct entry dial, whilst the panel was in an unappreciated mode. Subsequently, both the pilot and co-pilot failed to notice the (unintended) rapid descent of the aircraft until shortly before impact. In other words, they were surprised by the performance of the system (plane) – an example of the phenomenon described in section 1. These errors are summarised in the schematic outline of events in Figure 2. (page 4)
    Issue: mode transitions may be uncommanded (Issue #44)
    Strength: +1
    Aircraft: A320
    Equipment: autoflight FCU
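
The VS/FPA confusion running through the excerpts above turns on a single interface fact: the same dial entry is interpreted differently depending on the panel's active mode. The following is a minimal sketch in Python of that mechanism; the two-mode model, scaling factors, and function names are illustrative assumptions, not an exact model of the A320 FCU.

```python
from enum import Enum

class EntryMode(Enum):
    """Simplified model of the panel's two descent-entry modes (assumed)."""
    VS = "vertical speed"       # entry read in hundreds of feet per minute
    FPA = "flight path angle"   # entry read in tenths of a degree

def interpret_dial(entry: int, mode: EntryMode) -> str:
    """What the autoflight system commands for one and the same dial entry,
    depending on the panel's current (possibly unappreciated) mode."""
    if mode is EntryMode.VS:
        return f"descend at {abs(entry) * 100} ft/min"
    return f"descend on a {abs(entry) / 10:.1f} degree flight path"

entry = 33  # the pilot dials "33", intending a gentle 3.3-degree approach
print(interpret_dial(entry, EntryMode.FPA))  # descend on a 3.3 degree flight path
print(interpret_dial(entry, EntryMode.VS))   # descend at 3300 ft/min
```

The entry itself is "correct" in both branches; only the mode decides whether the command is a gentle approach or a rapid, unintended descent. This is why the excerpts treat mode error as a symptom of a knowledge gap about system state rather than as a slip in executing a well-rehearsed procedure.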
Flight Deck Automation Issues Website  
© 1997-2013 Research Integrations, Inc.