
Evidence for an Issue

28 pieces of evidence for this issue.

Data entry and programming may be difficult and time consuming (Issue #112) - Procedures for data entry and for programming the automation may be unclear, overly difficult, complex, and time consuming. This may cause errors and delays that may lead to unsafe conditions.

  1. Evidence Type: Excerpt from Experiment
    Evidence: "Other tasks during the navigation segment in which the voice and manual interfaces were compared included: 1) correcting system malfunctions, and 2) changing radio frequency channels to place radio calls. For changing radio frequencies—a task similar in nature to rerouting in that a string of alphanumerics was entered into the simulator, the voice interface improved pilots’ speed and accuracy to accomplish the task compared to the manual interface (F [1,11] = 7.63, p < .05). However, the improvement was less than what the voice interface provided for the mission rerouting task. For one system malfunction—flight control pitch fault—voice and manual interfaces demonstrated no differences for accomplishing the pitch fault reset task; for another malfunction—Global Positioning System (GPS) failure—the voice interface increased the time to complete the GPS failure task compared to the manual interface. Possible reasons for these results will be explained in the discussion section."
    Strength: +3
    Aircraft: unspecified
    Equipment: automation
    Source: Barbato, G. (1999). Lessons learned: Integrating voice recognition and automation target cueing symbology for fighter attack. In R.S. Jensen, B. Cox, J.D. Callister, & R. Lavis (Eds.), Proceedings of the 10th International Symposium on Aviation Psychology, 203-207. Columbus, OH: The Ohio State University.

  2. Evidence Type: Excerpt from Experiment
    Evidence: "The results verified the hypotheses and showed that the pilots were able to designate targets more quickly using voice control coupled with the ATC than with manual control coupled with the ATC (F [1,11] = 4.79, p < .05). Further, pilots reported a decrease in workload when using the voice versus manual interface in combination with the ATC (F [1,11] = 4.73, p < .05)."
    Strength: -1
    Aircraft: unspecified
    Equipment: automation
    Source: Barbato, G. (1999). Lessons learned: Integrating voice recognition and automation target cueing symbology for fighter attack. In R.S. Jensen, B. Cox, J.D. Callister, & R. Lavis (Eds.), Proceedings of the 10th International Symposium on Aviation Psychology, 203-207. Columbus, OH: The Ohio State University.

  3. Evidence Type: Excerpt from Experiment
    Evidence: "Other tasks during the navigation segment in which the voice and manual interfaces were compared included: 1) correcting system malfunctions, and 2) changing radio frequency channels to place radio calls. For changing radio frequencies—a task similar in nature to rerouting in that a string of alphanumerics was entered into the simulator, the voice interface improved pilots’ speed and accuracy to accomplish the task compared to the manual interface (F [1,11] = 7.63, p < .05). However, the improvement was less than what the voice interface provided for the mission rerouting task. For one system malfunction—flight control pitch fault—voice and manual interfaces demonstrated no differences for accomplishing the pitch fault reset task; for another malfunction—Global Positioning System (GPS) failure—the voice interface increased the time to complete the GPS failure task compared to the manual interface. Possible reasons for these results will be explained in the discussion section."
    Strength: -3
    Aircraft: unspecified
    Equipment: automation
    Source: Barbato, G. (1999). Lessons learned: Integrating voice recognition and automation target cueing symbology for fighter attack. In R.S. Jensen, B. Cox, J.D. Callister, & R. Lavis (Eds.), Proceedings of the 10th International Symposium on Aviation Psychology, 203-207. Columbus, OH: The Ohio State University.

  4. Evidence Type: Excerpt from Experiment
    Evidence: "Navigation Results. For the mission rerouting task, the main effect of Control Type was significant (F [1,11] = 6.77, p < .05). By using the voice interface, pilots completed the rerouting task almost 14 seconds quicker than when using the manual interface."
    Strength: -3
    Aircraft: unspecified
    Equipment: automation
    Source: Barbato, G. (1999). Lessons learned: Integrating voice recognition and automation target cueing symbology for fighter attack. In R.S. Jensen, B. Cox, J.D. Callister, & R. Lavis (Eds.), Proceedings of the 10th International Symposium on Aviation Psychology, 203-207. Columbus, OH: The Ohio State University.

  5. Evidence Type: Excerpt from Experiment
    Evidence: "Summary. Navigation segment results showed that voice recognition significantly improved the speed and accuracy of pilot data input during re-route when compared to manual input. Weapon delivery segment results showed that pilot performance was significantly improved by integrating auto-target cueing features with voice recognition when compared to the manual, throttle-mounted switches. In fact, in all but one of the voice interface cases, pilots were able to correctly designate all six of the tanker aircraft in a single pass on the airfield at significantly greater distances from the airfield than when they used the manual interface."
    Strength: -3
    Aircraft: unspecified
    Equipment: automation
    Source: Barbato, G. (1999). Lessons learned: Integrating voice recognition and automation target cueing symbology for fighter attack. In R.S. Jensen, B. Cox, J.D. Callister, & R. Lavis (Eds.), Proceedings of the 10th International Symposium on Aviation Psychology, 203-207. Columbus, OH: The Ohio State University.

  6. Evidence Type: Excerpt from Survey
    Evidence: Likewise, input procedures and dialogue structures have to follow the logic of the pilot’s task. Again, pilots reported problems with programming instructions to the FMS: P2: “You have to make sure that the departure clearance is linked up, it would just be flashing and say ‘no link’, so you may have to delete it and then it becomes a link.” (page 4)
    Strength: +1
    Aircraft: unspecified
    Equipment: automation & FMS
    Source: Bruseberg, A., & Johnson, P. (2004). Should Computers Function as Collaborators? In Proceedings of HCI-Aero 2004, Toulouse, France, 29 September to 1 October 2004.

  7. Evidence Type: Excerpt from Survey
    Evidence: Moreover, the functionality of being able to pre-program the FMS to carry out complex automated tasks requires pilots to convey these plans and instructions to the system. Since more complex instructions can be communicated, the communication becomes more complex, too. Rudisill [10] reports that pilots often express problems with entering instructions through the keypad into the FMS, particularly when under time pressure. Likewise, pilots have mentioned this issue to us as a particular problem: P1: “...during the high workload phases, operating the FMS, especially the tasks that you don’t do very often ... you might forget to put a slash or a stroke, whatever the format should be that you are typing into the scratchpad ... that is very distracting, getting the format correct.” (page 4)
    Strength: +1
    Aircraft: unspecified
    Equipment: FMS keyboard
    Source: Bruseberg, A., & Johnson, P. (2004). Should Computers Function as Collaborators? In Proceedings of HCI-Aero 2004, Toulouse, France, 29 September to 1 October 2004.

  8. Evidence Type: Excerpt from Observational Study
    Evidence: A widely reported problem in modern aircraft is entering instructions through the keypad into the FMS, when under time pressure (e.g. [12]). Pilots have mentioned this issue to us as a particular problem: “...during the high workload phases, operating the FMS, especially the tasks that you don’t do very often, and therefore you might forget to put a slash or a stroke, whatever the format should be that you are typing into the scratchpad, that is very distracting; getting the format correct, especially the format that you don’t often use”. (page 5)
    Strength: +1
    Aircraft: unspecified
    Equipment: automation & FMS
    Source: Bruseberg, A., & Johnson, P. (2004). Considering temporal aspects for the design of human-computer collaboration: identifying suitable foci. Department of Computer Science, University of Bath. Available at http://www.cs.bath.ac.uk/~anneb/chi%20time%20ws%202004.pdf.

  9. Evidence Type: Excerpt from Incident Study
    Evidence: "Many of the ASRS reports included the complaint that the FMC/CDU is difficult and time-consuming to program." (page 4.10)
    Strength: +1
    Aircraft: unspecified
    Equipment: FMS CDU
    Source: Eldredge, D., Mangold, S., & Dodd, R.S. (1992). A Review and Discussion of Flight Management System Incidents Reported to the Aviation Safety Reporting System. Final Report DOT/FAA/RD-92/2. Washington, DC: U.S. Department of Transportation, Federal Aviation Administration.

  10. Evidence Type: Excerpt from Incident Study
    Evidence: "The data presented in Table 4-1 suggest the same underlying problem: The crew fails to operate the FMS properly and, at the same time, fails to catch the error before an incident occurs." Table 4-1 shows the breakdown of the number of citations for various Flight Crew FMS Actions/Errors. Out of a total of 99 citations, 3 citations [3%] were logic errors which "usually involved the flight crew entering data in a format or form that the FMC would not recognize, or the pilot not understand[ing] the underlying limitations of the system when he or she tried to enter the data." (page 4.1-4.2)
    Strength: +1
    Aircraft: unspecified
    Equipment: FMS: CDU
    Source: Eldredge, D., Mangold, S., & Dodd, R.S. (1992). A Review and Discussion of Flight Management System Incidents Reported to the Aviation Safety Reporting System. Final Report DOT/FAA/RD-92/2. Washington, DC: U.S. Department of Transportation, Federal Aviation Administration.

  11. Evidence Type: Excerpt from Incident Study
    Evidence: "A small subset of tasks which are being performed either just before or during the occurrence of an incident appear repeatedly in the ASRS reports reviewed. This suggests that some tasks performed by means of the FMC/CDU are more difficult than others." ... ASRS reports were included to provide examples of each of these tasks ... "Developing and Entering a Crossing Restriction at a Distance From a Fix Along a Radial (126707) 'Cleared to cross 80 miles south of RIC VOR at FL270. We were leveled at FL330. The aircraft has been adapted with a new FMC. This particular restriction was difficult to get accepted into the FMC. It continuously showed down the scratch pad (invalid entry). Nevertheless, the procedure for the entry was correct. ATC called and queried us about it and we initiated the descent with idle power and full speed brakes and 330 knots. ATC asked if we were going to make it. We (I) acknowledge with an "affirmative" and continued with the steep descent. As I was doing so, the winds were showing higher than usual on the FMC Progress page. Upon realizing that the restriction was not going to be met, just when we were going to advise ATC and request vectors so as to meet the crossing restriction, DCA ATC informed us not to make a steep descent because there was not conflicting traffic involved. I understood what he meant by that statement that everything was okay and we did not request vectors, but continued the descent, crossing 80 DME about 1,000 feet high.' Entering a crossing restriction at a distance from a fix is one of the most common types of clearances received. Nonetheless, pilots do appear to have trouble implementing this clearance, as is shown in this example. What is especially interesting about the example is the response of the FMC to the pilot's entered data. When the entered data do not meet the requirements of the FMC, the only feedback received is 'Invalid Entry.' No clues are provided as to the nature of the problem. One would expect that this lack of informative feedback can only contribute to the programmer's frustration. This example also demonstrates a second common occurrence: The programmer's conviction that what he/she programmed in was correct. This conviction is common to many of the ASRS reports ..." (page 4.17-4.18)
    Strength: +1
    Aircraft: unspecified
    Equipment: FMS
    Source: Eldredge, D., Mangold, S., & Dodd, R.S. (1992). A Review and Discussion of Flight Management System Incidents Reported to the Aviation Safety Reporting System. Final Report DOT/FAA/RD-92/2. Washington, DC: U.S. Department of Transportation, Federal Aviation Administration.

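The Eldredge et al. (1992) excerpt above notes that the FMC's only feedback for a rejected crossing-restriction entry was "Invalid Entry", with no clue to the nature of the problem. As an illustration of the reason-specific feedback the report implies was missing, here is a minimal Python sketch; the FIX/RADIAL/DISTANCE/ALTITUDE entry grammar is invented for this example and is not real FMC/CDU syntax.

    import re

    # Hypothetical, simplified entry grammar: FIX/RADIAL/DISTANCE/ALTITUDE, e.g. "RIC/180/80/FL270".
    # This is NOT actual FMC/CDU syntax; it exists only to illustrate reason-specific feedback.
    ENTRY = re.compile(r"^(?P<fix>[A-Z]{2,5})/(?P<radial>\d{3})/(?P<dist>\d{1,3})/(?P<alt>FL\d{3})$")

    def validate_entry(text: str) -> str:
        """Return a specific rejection reason instead of a bare 'INVALID ENTRY'."""
        fields = text.split("/")
        if len(fields) != 4:
            return f"REJECTED: expected 4 fields FIX/RADIAL/DIST/ALT, got {len(fields)}"
        match = ENTRY.match(text)
        if match is None:
            return "REJECTED: check field formats (radial = 3 digits, altitude = FLnnn)"
        if not 1 <= int(match["radial"]) <= 360:
            return "REJECTED: radial must be between 001 and 360"
        return "ACCEPTED"

    print(validate_entry("RIC/180/80/FL270"))  # ACCEPTED
    print(validate_entry("RIC/80/FL270"))      # REJECTED: expected 4 fields FIX/RADIAL/DIST/ALT, got 3

The point is simply that a rejection message naming the offending field gives the crew something to act on, where a bare "Invalid Entry" does not.
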
  12. Evidence Type: Excerpt from Survey
    Evidence: 17 of the 30 (57%) respondents reported a 4 (= agree) or 5 (= strongly agree) with pc96, "data input prompts may be poor."
    Strength: +3
    Aircraft: unspecified
    Equipment: automation
    Source: Lyall, E., Niemczyk, M. & Lyall, R. (1996). Evidence for flightdeck automation problems: A survey of experts.

  13. Evidence Type: Excerpt from Survey
    Evidence: 21 of the 30 (70%) respondents reported a 4 (= agree) or 5 (= strongly agree) with pc112, "programming may be difficult."
    Strength: +3
    Aircraft: unspecified
    Equipment: automation
    Source: Lyall, E., Niemczyk, M. & Lyall, R. (1996). Evidence for flightdeck automation problems: A survey of experts.

  14. Evidence Type: Excerpt from Survey
    Evidence: 15 of the 30 (50%) respondents reported a 4 (= agree) or 5 (= strongly agree) with pc94, "data entry format may be inflexible."
    Strength: +3
    Aircraft: unspecified
    Equipment: automation
    Source: Lyall, E., Niemczyk, M. & Lyall, R. (1996). Evidence for flightdeck automation problems: A survey of experts.

  15. Evidence Type: Excerpt from Survey
    Evidence: 3 of the 30 (10%) respondents reported a 1 (= strongly disagree) or a 2 (= disagree) with pc94, "data entry format may be inflexible."
    Strength: -1
    Aircraft: unspecified
    Equipment: automation
    Source: Lyall, E., Niemczyk, M. & Lyall, R. (1996). Evidence for flightdeck automation problems: A survey of experts.

  16. Evidence Type: Excerpt from Survey
    Evidence: 3 of the 30 (10%) respondents reported a 1 (= strongly disagree) or a 2 (= disagree) with pc112, "programming may be difficult."
    Strength: -1
    Aircraft: unspecified
    Equipment: automation
    Source: Lyall, E., Niemczyk, M. & Lyall, R. (1996). Evidence for flightdeck automation problems: A survey of experts.

  17. Evidence Type: Excerpt from Survey
    Evidence: 1 of the 30 (3%) respondents reported a 1 (= strongly disagree) or a 2 (= disagree) with pc96, "data input prompts may be poor."
    Strength: -1
    Aircraft: unspecified
    Equipment: automation
    Source: Lyall, E., Niemczyk, M. & Lyall, R. (1996). Evidence for flightdeck automation problems: A survey of experts.

  18. Evidence Type: Excerpt from Survey
    Evidence: "The captain’s continued efforts to locate Tulua through the FMS, in addition to being ineffective, also precluded his using the little available time to employ an alternative navigation method and limited his ability to systematically analyze the nature of the difficulty as well. Because he did not understand the difficulty, he was unable to estimate the time and effort needed to rectify it. In fact, his efforts appeared to be not so much problem solving as rote repetition of keyboard interactions. Given the time pressure it would have taken extraordinary effort to carry out real problem solving." (page 197)
    Strength: +1
    Aircraft: Boeing 757
    Equipment: automation & FMS
    Source: Noyes, J.M. & Starr, A.F. (2000). Civil aircraft warning systems: Future directions in information management and presentation. International Journal of Aviation Psychology, 10(2), 169-188. Lawrence Erlbaum Associates.

  19. Evidence Type: Excerpt from Survey
    Evidence: "Likewise, it can also be assumed that they would have recognized, as demonstrated in the Kansas City accident, that last minute changes in the approach presented some risk as well. In fact, the first officer’s response to the captain’s question about accepting the offer to execute the approach indicated his concern, “Yeah. We’ll have to scramble to get down. We can do it.” In fact, the airplane had been in a descent before he made this comment, and the descent continued to the accident." (page 196)
    Strength: +1
    Aircraft: Boeing 757
    Equipment: FMS & ATC
    Source: Noyes, J.M. & Starr, A.F. (2000). Civil aircraft warning systems: Future directions in information management and presentation. International Journal of Aviation Psychology, 10(2), 169-188. Lawrence Erlbaum Associates.

  20. Evidence Type: Excerpt from Incident Study
    Evidence: In our review of 282 automation-related ASRS incident reports, we found 4 reports (1%) supporting Issue #112 (data entry and programming may be difficult and time consuming).
    Strength: +1
    Aircraft: various
    Equipment: automation
    Source: Owen, G. & Funk, K. (1997). Flight Deck Automation Issues: Incident Report Analysis. http://www.flightdeckautomation.com/incidentstudy/incident-analysis.aspx. Corvallis, OR: Oregon State University, Department of Industrial and Manufacturing Engineering.

  21. Evidence Type: Excerpt from Survey
    Evidence: Like the AH-64A pilots, many AH-64D pilots requested a moving map. Other comments also noted that the MFDs tend to make the pilot focus inside the aircraft and that the paging system often required too many button pushes. Representative comments of the AH-64D pilots were: … Too many menus/screen. Actions that used to take only the push of a button now take longer since we are forced to navigate through multiple "pages." (page 13)
    Strength: +1
    Aircraft: AH-64D
    Equipment: automation
    Source: Rash, C.E., Adam, G.E., LeDuc, P.A., & Francis, G. (May 6-8, 2003). Pilot Attitudes on Glass and Traditional Cockpits in the U.S. Army's AH-64 Apache Helicopter. Presented at the American Helicopter Society 59th Annual Forum, Phoenix, AZ. American Helicopter Society International, Inc.

  22. Evidence Type: Excerpt from Experiment
    Evidence: "There were slightly more (61 percent) advanced cockpit (EFIS and/or NAV control) than traditional cockpit aircraft in the data set...It was expected that advanced cockpit aircraft would be more likely to be involved in crossing restriction altitude deviations due to the greater complexity in programming descents and descent crossing fixes. While we did see this pattern, the difference in numbers between advanced and traditional cockpit aircraft was not large." (page 13)
    Strength: +3
    Aircraft: various
    Equipment: FMS & ATC
    Source: Riley, V., Lyall, E., & Wiener, E. (1993). Analytic Methods for Flight-Deck Automation Design and Evaluation, Phase Two Report: Pilot Use of Automation. FAA Contract Number DTFA01-91-C-0039.

  23. Evidence Type: Excerpt from Survey
    Evidence: "[Pilots] report that interactions with the FMS, in particular, are very complex. Pilots find FMS programming to be time-consuming and that the automation cannot deal adequately with ATC changes." (page 6)
    Strength: +1
    Aircraft: unspecified
    Equipment: FMS & ATC
    Source: Rudisill, M. (1995). Line Pilots' Attitudes About and Experience With Flight Deck Automation: Results of an International Survey and Proposed Guidelines. In R.S. Jensen & L.A. Rakovan (Eds.), Proceedings of the 8th International Symposium on Aviation Psychology, Columbus, Ohio, April 24-27, 1995, 288-293. Columbus, OH: The Ohio State University.

  24. Evidence Type: Excerpt from Multi-Method Study
    Evidence: "The results of the above described research activities [self-reports, training observations, and experimental study] complement each other. They provide input for the improvement of existing autoflight systems and for the design of future cockpit automation. They also indicate which aspects of FMS-related knowledge should be further emphasized in training. The majority of observed problems with the FMS are related to factors which are known to affect human-computer interactions in a variety of domains: ... E. Pilot-System Communication. Current FMS operations do not support pilots in communicating their intentions to the system. ... In case of inadequate input to the system, the message 'Invalid Entry' is displayed without providing any clue as to why the input is not acceptable." (page 1310)
    Strength: +1
    Aircraft: B737-300
    Equipment: FMS
    Source: Sarter, N.B. (1991). The Flight Management System - pilots' interaction with cockpit automation. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics, 1307-1310. New York: IEEE.

  25. Evidence Type: Excerpt from Survey
    Evidence: "Some found that in rare instances it was hard to get quickly by specific data such as runway-, route-, or way-point-changes or return to departure." (page 11.11)
    Strength: +1
    Aircraft: A310
    Equipment: FMS
    Source: Speyer, J.J., Monteil, C., Blomberg, R.D., & Fouillot, J.P. (1990). Impact of New Technology on Operational Interface: From Design Aims to Flight Evaluation and Measurement. Advisory Group for Aerospace Research and Development No. 301, Vol. 1.

  26. Evidence Type: Excerpt from Survey
    Evidence: "... many, perhaps most, of the crews reported that in times of heavy workload, they tended to 'click it off', that is, revert to manual modes of flight guidance because they did not have time to do the programming necessary to exploit the automation." (page 170)
    Strength: +3
    Aircraft: B757
    Equipment: FMS
    Source: Wiener, E.L. (1989). Human Factors of Advanced Technology ("Glass Cockpit") Transport Aircraft. NASA Contractor Report 177528. Moffett Field, CA: NASA Ames Research Center.

  27. Evidence Type: Excerpt from Observational Study
    Evidence: "After departing from SJC and completing the first part of the LOUPE departure (which in itself was at that time a tangled procedure creating a workload problem in any aircraft; it has since been somewhat simplified), the following clearance was issued: 'After Wilson Creek, direct 37 degrees 45 minutes north, 111 degrees 05 minutes west, direct Farmington, as filed.' ... When the crew attempted to create the waypoint by entering the coordinates (latitude, lat; and longitude, lon) into the Legs page of the CDU, they experienced considerable trouble due to the fact that the sequence of the clearance did not conform to the format required by the CDU. For example the clearance as transmitted places the hemisphere ('N' and 'W') after the coordinates; the CDU demands that it come first. The crew tried one format after another, with growing frustration. Both were 'heads down' in the cockpit for a considerable time trying various formats for data entry. At one point the crew's input of the coordinates had five errors of three different types. Finally, the captain arrived at a solution: he told the first officer to fly the plane while he searched through other pages in the CDU, hoping to find the correct format for a lat and lon waypoint to use as a model. His solution represented true 'resource management.' Information readily at hand, several CDU pages containing lat/lon formats for another purpose, was used to solve the problem. In brief, the unexpected and unfamiliar lat and lon waypoint created a high workload and a compelling demand for effective crew coordination. Just why the controller felt the need to issue a lat and lon waypoint, when he could have given bearing and distance off of a nearby VOR (which is easy to enter into the CDU), is not clear. In issuing such a complex clearance, the controller was not only burdening the crew but was also making trouble for himself." (page 223-224)
    Strength: +1
    Aircraft: unspecified
    Equipment: FMS
    Source: Wiener, E.L. (1993). Crew coordination and training in the advanced technology cockpit. In Wiener, E.L., Kanki, B.G., & Helmreich, R.L. (Eds.), Cockpit resource management, 199-229. San Diego, CA: Academic Press.

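The trouble in the excerpt above came from a format mismatch: the clearance gave the hemisphere after the coordinates, while the CDU required it first. Below is a minimal Python sketch of the kind of tolerant normalization that could absorb such a mismatch; the spoken-style coordinate pattern and the hemisphere-first output are simplified assumptions, not any real CDU entry format.

    import re

    # Hypothetical, simplified: accept coordinates phrased as "DD degrees MM minutes north/south/east/west"
    # (hemisphere last, as in the clearance) and emit a hemisphere-first string.
    # Neither format is taken from a real FMS specification.
    COORD = re.compile(r"(\d{1,3})\s*degrees\s*(\d{1,2})\s*minutes\s*(north|south|east|west)", re.IGNORECASE)

    def to_hemisphere_first(text: str) -> str:
        """Reorder 'degrees minutes hemisphere' into 'hemisphere letter + DDMM.M'."""
        parts = [f"{h[0].upper()}{int(d)}{int(m):02d}.0" for d, m, h in COORD.findall(text)]
        if len(parts) != 2:
            raise ValueError("expected exactly one latitude and one longitude")
        return " ".join(parts)

    print(to_hemisphere_first("37 degrees 45 minutes north, 111 degrees 05 minutes west"))
    # -> N3745.0 W11105.0
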
  28. Evidence Type: Excerpt from Survey
    Evidence: "... [a] problem leading to confusion and hence increased workload is the fact that the computer-produced flight plan provided to the crew may contain waypoints whose names are inconsistent with those in the FMC. This is particularly true of waypoints located at non-directional beacons (NDBs). An example is Carolina Beach. The computer-produced flight plan (KMIA to northeast airports) reads '...AR3 CLB...' and the crew quite naturally attempts to load 'CLB' into the FMC, only to receive 'not in database' error messages. The FMC stores this waypoint as 'CLBNB', which is on neither the flight plan nor the chart. ... The author has several times seen crews puzzling over their inability to load such a waypoint before discovering from their charts that the waypoint is an NDB, and recalling that the 'NB' must be added. It would seem a small matter to program the computers that furnish the flight plans to be consistent with the FMC designators, and it would also probably aid crews of conventional aircraft. Such inconsistencies generate increased workload and frustration, often leading to abandonment of the automation, and what is worse, they harbor the potential for serious error." (page 180)
    Strength: +1
    Aircraft: B757
    Equipment: FMS
    Source: Wiener, E.L. (1989). Human Factors of Advanced Technology ("Glass Cockpit") Transport Aircraft. NASA Contractor Report 177528. Moffett Field, CA: NASA Ames Research Center.
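
The naming mismatch described above (flight plan 'CLB' versus database 'CLBNB') is at bottom a lookup problem. The minimal Python sketch below shows a lookup that tries the NDB-suffixed variant before giving up; the database contents and the suffix rule are assumptions drawn only from the excerpt, not from any real navigation database.

    # Hypothetical navigation-database fragment; the single entry is for illustration only.
    NAV_DATABASE = {"CLBNB": "Carolina Beach (NDB)"}

    def find_waypoint(ident: str) -> tuple[str, str]:
        """Look up ident, falling back to the 'NB' (NDB) suffix variant described in the excerpt."""
        for candidate in (ident, ident + "NB"):
            if candidate in NAV_DATABASE:
                return candidate, NAV_DATABASE[candidate]
        raise KeyError(f"{ident}: not in database (also tried {ident}NB)")

    print(find_waypoint("CLB"))  # ('CLBNB', 'Carolina Beach (NDB)') rather than a bare 'not in database'

Wiener's own suggestion, making the flight-plan computers consistent with the FMC designators, is the cleaner fix; a fallback like this only softens the symptom.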