Comparison of two point-of-gaze measurement systems on a dynamic ATC simulation
H. David1, R. Mollard2, P. Cabon2 and B. Farbos2
1Eurocontrol Experimental Centre, Bretigny-sur-Orge, France
2Laboratoire d'Anthropologie Appliquée, Paris, France
In a study reported in a companion paper, eight Air Traffic Controllers carried out simulation exercises using the TRACON II Autonomous Air Traffic Control (ATC) simulator at the Eurocontrol Experimental Centre (EEC), Bretigny-sur-Orge, France, while wearing a SensoMotoric Instruments iView head-mounted eye-tracking device and electrophysiological recording equipment. Traffic density (high/low) and control mode (graphic/keyboard) were varied. An opportunity arose to employ an ASL Model 504 table-mounted eye-tracking system in the same experimental context. Four suitable controllers carried out exercises using the graphical interface. Two of them, who had participated in the previous study, carried out exercises under heavy traffic loading. The other two carried out one lightly and one heavily loaded exercise using the graphical interface, matched to two controllers in the previous experiment. All other measures were taken as before. The Observer (Noldus Information Technology) was employed on-line to record significant events.
The iView system required the controller to wear a lightweight bicycle helmet, firmly but not painfully strapped to the head. EEG electrodes could be worn under the helmet. The ASL system used a remote, motorised camera surrounded by a ring of infrared emitters, focussed on the controller's head. To assist in this process, a Polhemus head-position recorder was attached to the controller's head by a headband, transmitting a signal that allowed the camera to acquire the correct eye. No significant discomfort and no electrical interference were observed with either system.
The ASL system allowed the rectangular CRT display to be divided into specific zones, corresponding to the simulated strips, radar and communications areas. The system could identify fixations and record their position and duration. From these figures, it was possible to construct transition diagrams and frequency and duration charts of the controller's gaze. Selected periods of eye movement, such as those preceding an 'error', could be examined on command.
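As an illustration of the kind of post-processing such fixation data support, the sketch below shows how a list of fixation records might be reduced to per-zone dwell times and zone-to-zone transition counts, from which duration charts and transition diagrams can be drawn. The record layout, zone labels and function names are illustrative assumptions, not the actual output format of the ASL software.

    from collections import Counter, defaultdict

    # Each fixation is represented here as (zone, duration_in_seconds).
    # The zone labels and this record layout are assumptions for illustration.
    fixations = [
        ("radar", 0.42), ("radar", 0.31), ("strips", 0.55),
        ("radar", 0.28), ("comms", 0.20), ("strips", 0.47),
    ]

    def dwell_times(fixations):
        # Total gaze duration accumulated per display zone.
        totals = defaultdict(float)
        for zone, duration in fixations:
            totals[zone] += duration
        return dict(totals)

    def transition_counts(fixations):
        # Count transitions between successive fixations in different zones.
        counts = Counter()
        for (prev, _), (curr, _) in zip(fixations, fixations[1:]):
            if prev != curr:
                counts[(prev, curr)] += 1
        return counts

    print(dwell_times(fixations))        # per-zone gaze totals (radar, strips, comms)
    print(transition_counts(fixations))  # e.g. ('radar', 'strips') -> 1

Frequency charts follow directly from the number of fixations falling in each zone, and the transition counts can be normalised by row to give the transition probabilities shown in a transition diagram.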
Similar data from the iView system could only be obtained by observation at reduced speed (20% of real time) and transcription using The Observer, i.e. roughly five hours of analysis per hour of recording. (Previous studies suggested that manual analysis of similar recordings, without the aid of The Observer, proceeded at about 5% of real time, or some twenty hours per hour of recording; see David, 1985.) The iView system, however, can accommodate a far greater range of movement by the controller, who can stand up, turn round and carry out all reasonable movements.
Neither system was seriously affected by the wearing of spectacles by the controller. (No controllers wore contact lenses.) The iView system experienced some difficulties in calibration when the controller was looking downwards, and the ASL system appeared to have difficulty where the controller wore progressive lenses, although it coped correctly with bifocal lenses.
The detailed analyses made possible by the ASL system software suggest that the simple hypothesis (that ATC errors occurred because controllers did not see the developing problem) is unlikely to be true. At least three distinct causes of error can be identified, each related to the nature of the error concerned.
This was an initial feasibility study, which should be repeated with larger numbers of subjects. The observed results can only be regarded as tentative, but are indicative.
Poster presented at Measuring Behavior 2000, 3rd International Conference on Methods and Techniques in Behavioral Research, 15-18 August 2000, Nijmegen, The Netherlands
© 2000 Noldus Information Technology b.v.