Using Eye-Tracking for Adaptive Human-Machine Interfaces for Pilots: A Literature Review and Sample Cases

This paper explores the potential of eye-tracking technology in adaptive human-machine interfaces for pilots in aviation. We argue that an interface able to adjust its layout and elements based on pilots’ real-time eye-tracking data can prevent errors and enhance performance. The study presents a literature review on the use of eye-tracking for various pilot cases, including flight simulator games, drone pilots, and cockpit pilots. In most cases, eye-tracking has been employed to improve interactions, enhance spatial awareness, guide pilots’ gaze to relevant areas, and provide insights into pilots’ information processing and task load. The paper then discusses two sample cases demonstrating the potential of eye-tracking in adaptive human-machine interfaces. In the first case, during challenging drone simulations, eye-tracking identified areas where an adaptive human-machine interface could aid navigation and reduce cognitive load. In the second, based on real drone flights, eye-tracking data from signal loss incidents showed that the interface should adapt to pilots’ needs by providing critical information that helps them improve situational awareness. The paper concludes that eye-tracking technology has significant potential in adaptive human-machine interfaces for aviation, emphasising the importance of refining these technologies to meet pilots’ specific needs and enhance flight safety.


Introduction
The need for an adaptive interface that changes its layout and elements according to the needs of the pilots is critical in such a demanding human-machine interface (HMI), where a great deal of information must be processed rapidly by the pilots. We argue that eye-tracking technologies can help HMIs adapt to prevent human errors such as failing to see important information, misinterpreting data, or failing to act. In this paper, we present a literature review of the use of eye-tracking in adaptive HMIs for various cases of pilots (flight simulator games, drone pilots, cockpit pilots). In the context of flight simulator games, eye-tracking has been used as an input mechanism enabling gaze-based interactions or enhancing basic interaction techniques, to improve spatial awareness, to guide players to relevant areas of the cockpit, and as a player behaviour analysis tool. Eye-tracking has also been applied to Unmanned Aerial Vehicle (UAV) pilots to improve the control, efficiency, and accuracy of drone operation, and to monitor pilot performance. Finally, in aviation, eye-tracking data can provide insights into pilots' information processing, task load, visual behaviour, and performance in the cockpit.
As a proof of concept, we discuss two sample cases demonstrating the potential of using eye-tracking technology in adaptive HMIs for pilots. During demanding UAV flight scenarios that took place on a UAV simulator, pilots were observed navigating through obstacles and maintaining low altitudes. Analysis of their gaze fixation points revealed areas where an adaptive HMI could alleviate the cognitive burden and enhance spatial orientation. In addition, the analysis of data from actual UAV flights in cases of signal loss highlighted the need for an adaptive HMI. The paper discusses the analysis of the pilots' gaze points and the improvements the pilots suggested after a retrospective analysis of the eye-tracking data.

Flight Simulator Game Pilots
Eye-tracking has found applications in the field of video games in various ways. It has been utilized as an input mechanism for controlling games, as a tool for examining player behaviour after gameplay, and as a supplementary method to enhance existing interaction techniques [1]. Specifically, eye-tracking technology has been applied in first-person shooter games to assist players in maintaining precise aim even when the target is in motion [2], as well as in the concept design of the first flight simulators to adjust the player's visual perspective [3]. Modern eye-tracking technology is incorporated in various games [4], including the MS Flight Simulator, where gaze-based interactions enable the pilot to fly the plane while keeping their hands on the controls, providing a comprehensive view of the in-game camera and improved spatial awareness. Additionally, using eye-tracking data, players could be guided to relevant areas of the cockpit [5].
In the realm of eye tracking in gaming, [6] adopted a unique perspective on gaze technology, devising a method that leverages eye-tracking to estimate cognitive load and promptly visualize its parameters. Their method can potentially enhance safety for pilots and passengers by promptly alerting ground controllers or even automatically activating the autopilot in demanding scenarios. Navarro and Sundstedt [7] used eye gaze to simplify game mechanics, replacing some of the controls of a space shooting video game with gaze. The results showed a positive experience: participants reported that gaze provided a mechanical simplification, making this mode of interaction their preferred choice over conventional controls. In contrast, the opposite findings were reported in another study [8] using the game Lunar Command, in which players had to shoot down missiles moving toward a city with a weapon whose projectiles took around 1 second to reach the target. The authors compared two versions of the game, one with mouse-based and one with eye-tracking aiming, and the results showed that the users "appeared to have a difficult time making themselves 'lead' the missiles by looking out into empty space in front of the moving target", indicating that there are cases where a simple eye-tracking integration as an input might have unfavourable results.
Finally, an example of an adaptive video game that uses eye tracking as an input is the space shooting game presented in [9], where the targets were gaze-sensitive: if the pilot had not detected them while they were within shooting range, the targets would visually adapt with a growing animation as a stimulus for the pilot to notice them.
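A minimal sketch of such a gaze-sensitive adaptation, assuming 2D screen coordinates and a simple fixation-distance test; the function name, radius, and growth factor are illustrative, not taken from [9]:

```python
import math

def update_target_scale(target_pos, target_scale, gaze_pos, in_range,
                        fixation_radius=50.0, growth=1.05, max_scale=2.0):
    """Grow a target that is in shooting range but has not been looked at.

    target_pos / gaze_pos are (x, y) screen coordinates in pixels.
    Returns the target's new visual scale for the current frame.
    """
    dx = target_pos[0] - gaze_pos[0]
    dy = target_pos[1] - gaze_pos[1]
    noticed = math.hypot(dx, dy) <= fixation_radius
    if in_range and not noticed:
        # Stimulus: enlarge the target each frame until the pilot fixates it.
        return min(target_scale * growth, max_scale)
    # Once fixated, the target snaps back to its normal size.
    return 1.0 if noticed else target_scale

# A target far from the current gaze point keeps growing while in range.
scale = 1.0
for _ in range(3):
    scale = update_target_scale((800, 100), scale, (100, 500), in_range=True)
```

Called once per rendered frame, this produces the growing animation only while the target goes unnoticed, so the stimulus disappears as soon as the pilot's gaze lands on it.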

Unmanned Aerial Vehicle (UAV/Drone) Pilots
A typical use of eye tracking for UAVs is to evaluate the ergonomics of UAV control stations [10]. Specifically, [11] developed a system to analyse content visibility, accessibility, and other characteristics by tracking the operator's actions, capturing eye positions, and collecting ergonomic data. The collected data would be used to evaluate the arrangement of seats, OOP design, explicit interface layout, operator workload, and fatigue levels. By considering eye movement as a variable, a thorough assessment of the human-machine interface could be conducted.
The use of gaze as an input method has been explored as a supporting mechanism for camera control in hands-busy teleoperation activities [12], with works for UAV pilots focusing on eye-tracking technology to provide better drone control for people in need [13] or for eye-gaze-guided cameras [14]. Gaze control of aspects of drone navigation other than the camera was evaluated by [15], who compared four different mappings of gaze control and found that, after a short period of practice, the participants were able to control the drone using their gaze regardless of the control mode, with a low error rate (only two crashes out of 40 trials). One control mode, in which rotation and speed were controlled with gaze while the keyboard handled translation and altitude, was deemed significantly more reliable than the others. Additionally, eye-tracking could be used to monitor the performance of drone pilots and detect abnormal statuses, even in the case of multi-UAV operators [16]. Following these works, eye-tracking data could initiate changes in the HMI for UAV pilots to help them overcome difficult situations, adapt to new challenges, and focus on data they have failed to monitor in time.
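One plausible form of such a gaze-to-command mapping can be sketched as follows; the screen size, dead zone, and command limits here are assumptions for illustration, not the parameters evaluated in [15]:

```python
def gaze_to_drone_command(gaze_x, gaze_y, screen_w=1920, screen_h=1080,
                          max_yaw_deg_s=45.0, max_speed_m_s=2.0, dead_zone=0.1):
    """Map a gaze point to a yaw rate and forward speed (illustrative mapping).

    Looking left/right of the screen centre yaws the drone; looking above the
    centre increases forward speed. A central dead zone keeps small gaze
    jitter from moving the drone. Translation and altitude would stay on the
    keyboard, as in the control mode described in the text.
    """
    # Normalise to [-1, 1] with (0, 0) at the screen centre.
    nx = (gaze_x / screen_w) * 2.0 - 1.0
    ny = 1.0 - (gaze_y / screen_h) * 2.0   # invert: up on screen is positive

    def apply_dead_zone(v):
        if abs(v) < dead_zone:
            return 0.0
        # Rescale so the command ramps smoothly from the dead-zone edge.
        return (abs(v) - dead_zone) / (1.0 - dead_zone) * (1 if v > 0 else -1)

    yaw = apply_dead_zone(nx) * max_yaw_deg_s
    speed = max(0.0, apply_dead_zone(ny)) * max_speed_m_s
    return yaw, speed
```

The dead zone is the design choice that makes gaze control tolerable in practice: without it, the fixation jitter inherent in eye tracking would translate directly into continuous small control inputs.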

Use of Eye Tracking in Aviation
Data from eye-tracking have the potential to provide direct measures of pilots' information processing in the cockpit, including the information sampled by the operator over a given period (e.g., the distribution of fixation locations) and the time it takes to process this information (e.g., fixation durations) [17]. The use of a standalone eye tracker in a light aircraft has been explored, and the results showed that, despite some limitations, the eye tracker provided valuable behavioural and physiological data, highlighting changes in pilots' attention and stress levels [18]. Eye-tracking equipment has also been used to evaluate different colour combinations for fighter jets' Head-Up Displays (HUDs) [19]. That study recorded eye movement data, analysed key indicators related to the discernibility, perceptibility, and accessibility of HUD information, and explored the optimal colour coding scheme for key HUD elements against different background brightness levels; the findings provide insights for improving HUD design and optimising pilots' cognitive performance in various flight environments. Additionally, the analysis of pilots' gaze distribution is used to measure how pilots' task load influences visual behaviour and performance [20,21] and, more generally, to understand how the pilot processes information in the cockpit while carrying out particular tasks [22]. A particular case is [23], which examines pilot eye movements during approach phases, with results indicating similarities in eye behaviour between the pilot flying and the pilot monitoring; however, the attentional allocation of the pilot monitoring may not be optimal, particularly during the short final phase. While eye-tracking in aviation has primarily focused on cases such as those mentioned above, its adoption demonstrates the immense potential for future implementations of adaptive HMIs.
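As an illustration of how fixation locations and durations can be derived from raw gaze samples, the following sketch implements a standard dispersion-threshold (I-DT) fixation filter; the thresholds are common defaults, not values from the cited studies:

```python
def detect_fixations(samples, max_dispersion=30.0, min_duration=0.1):
    """Dispersion-threshold (I-DT) fixation detection, in outline.

    `samples` is a list of (t, x, y) gaze samples (seconds, pixels). A
    fixation is a run of samples whose combined x/y spread stays within
    `max_dispersion` pixels for at least `min_duration` seconds. Returns
    (centroid_x, centroid_y, duration) tuples, from which fixation
    distributions and durations can be computed.
    """
    fixations = []
    start = 0
    while start < len(samples):
        end = start + 1
        # Grow the window while the bounding-box dispersion stays small.
        while end < len(samples):
            xs = [s[1] for s in samples[start:end + 1]]
            ys = [s[2] for s in samples[start:end + 1]]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                break
            end += 1
        duration = samples[end - 1][0] - samples[start][0]
        if duration >= min_duration:
            window = samples[start:end]
            cx = sum(s[1] for s in window) / len(window)
            cy = sum(s[2] for s in window) / len(window)
            fixations.append((cx, cy, duration))
            start = end
        else:
            start += 1  # too short to be a fixation; slide past this sample
    return fixations

# Two stable gaze clusters separated by a saccade yield two fixations.
samples = [(0.00, 100, 100), (0.05, 102, 101), (0.10, 99, 100),
           (0.15, 101, 102), (0.20, 100, 99), (0.25, 400, 400),
           (0.30, 402, 401), (0.35, 399, 400), (0.40, 401, 402)]
fixations = detect_fixations(samples)
```

Analysis software such as Tobii Pro Lab applies filters of this kind before reporting fixation metrics; the sketch only shows the principle behind those derived measures.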

Sample Cases
This section presents two sample cases demonstrating the need for eye-tracking technology in adaptive HMIs for pilots. The former took place on a UAV simulator (DJI Flight Simulator) using a screen-based eye tracker (Tobii), and the latter was based on a UAV (DJI Mavic 2 Pro) with Tobii Pro Glasses 3 used for raw data collection. In both cases, the Tobii Pro Lab software was used for data analysis.

Challenging UAV Flight Simulations
In this case, the pilots were tasked with flying under demanding conditions that required precise navigation of the UAV. For instance, the flight scenarios involved manoeuvres under rock formations, passing through wooden stakes beneath a wooden bridge, and maintaining proximity to the sea surface. The pilots utilized the actual UAV controller, and the interaction analysis focused on their ability to handle critical situations, such as flying at low altitudes while avoiding obstacles. By examining the pilots' fixation points, we were able to identify their reactions, discern discrepancies between experienced and novice pilots, and determine areas where the HMI could adapt.
An example highlighting the potential of an adaptive HMI to assist less experienced pilots during flight was observed when navigation instructions required the use of a compass (e.g., "head north for 300 meters"). In this case, pilots had to continuously shift their focus between the compass, located at the bottom left of Figure 1, and the UAV video feed. An adaptive HMI could detect this frequent gaze switching and enhance the UAV video feed by overlaying a straight-line path, providing a visual aid for more accurate navigation.
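Such a gaze-switching trigger could be sketched as follows, assuming gaze points classified into two areas of interest (AOIs); the window length and switch threshold are hypothetical tuning parameters, not values derived from our data:

```python
from collections import deque

def aoi_of(gaze, compass_box, feed_box):
    """Return which AOI a gaze point falls in, if any.
    Boxes are (x_min, y_min, x_max, y_max) in screen pixels (illustrative)."""
    x, y = gaze
    for name, (x0, y0, x1, y1) in (("compass", compass_box), ("feed", feed_box)):
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

class GazeSwitchMonitor:
    """Raise a flag to show the path overlay when the pilot's gaze keeps
    alternating between the compass and the video feed within a short window."""

    def __init__(self, window_s=5.0, switch_threshold=4):
        self.events = deque()          # (timestamp, aoi) of AOI entries
        self.window_s = window_s
        self.switch_threshold = switch_threshold
        self.last_aoi = None

    def update(self, t, aoi):
        """Feed one classified gaze sample; return True to show the overlay."""
        if aoi is not None and aoi != self.last_aoi:
            self.events.append((t, aoi))
            self.last_aoi = aoi
        # Drop AOI entries that have fallen out of the sliding window.
        while self.events and t - self.events[0][0] > self.window_s:
            self.events.popleft()
        # Count compass<->feed transitions remaining inside the window.
        switches = sum(1 for a, b in zip(self.events, list(self.events)[1:])
                       if a[1] != b[1])
        return switches >= self.switch_threshold
```

The overlay then stays active only while the switching pattern persists, so the adaptation disappears once the pilot no longer needs the compass.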

Figure 1. Heat map of an inexperienced pilot
Another scenario involved pilots using a map (depicted at the bottom right of Figure 1) to follow the instruction "return to home point and land". Comparing the gaze heat maps of an experienced pilot (Figure 1) and a less experienced pilot (Figure 2) made it evident that the experienced pilot primarily focused on the compass and the UAV's forward view. In contrast, the less experienced pilot had to attend to various metrics, such as distance and speed, check the map, and unintentionally neglected the compass. In such situations, an adaptive HMI could detect the pilot's struggle and provide visual cues in the forward view indicating proximity to the home point. By doing so, the adaptive HMI would help the pilot maintain a sense of spatial orientation and alleviate the cognitive burden of monitoring multiple information sources simultaneously.

UAV Flight During Signal Loss
In this case study, the pilots were tasked with executing a flight scenario that involved transitioning from visual line of sight (VLOS) to beyond visual line of sight (BVLOS) operations. However, during the BVLOS phase of the flight, they encountered signal interruptions and, in some instances, complete signal loss. Figure 3 shows an aerial photograph of the flight path for the given case scenario (captured by another UAV flying at a higher altitude than the participating UAV). As per the scenario, the pilots were required to take off from the initial point (indicated as "Pilot" in the photograph of Figure 3) and ascend precisely to an altitude of 10 meters above the take-off point. Subsequently, they were instructed to maintain a straight trajectory for 100 meters while preserving their altitude before returning near the take-off point. Furthermore, the pilots were directed to fly near a building (which exceeded the UAV's current altitude) and initiate a BVLOS flight until reaching a designated marker. To comply with European safety regulations, an observer (shown in the photograph of Figure 3) was positioned to monitor the UAV while the pilot was engaged in BVLOS flight. The observer held a red flag; raising it indicated that the pilot had encountered difficulties, prompting an experienced pilot to assume control of the UAV. This safety measure is referred to as "observer-initiated cancel". Additionally, the pilots had the option to terminate the flight themselves if they felt uncomfortable at any stage of the experiment or felt that continuing was risky for the UAV (referred to as "self-cancel").
Given the BVLOS segment of the scenario, signal interruptions or even complete signal loss were anticipated due to the interference caused by the nearby building, which affected the controller-UAV connection. The severity of signal loss incidents depended primarily on the proximity of the flight path to the building; however, such occurrences were expected during each flight. Consequently, the data captured by the Tobii Pro Glasses 3 revealed several instances where the UAV controller screen failed to provide the pilots with the information necessary to enhance their performance. Figure 4 illustrates a part of the case scenario in which the pilot becomes disoriented and attempts to concentrate on a specific screen region in an unsuccessful search for information while simultaneously directing their attention towards the controls. In such situations, the interface could provide supplementary information by adapting to the pilot's requirements and emphasizing the sought-after data, such as, in this instance, the distance covered by the UAV. The analysis of the collected data and the pilots' retrospective comments made it evident that an adaptive HMI responding to eye-tracker input could be beneficial. One user interface change the users suggested, based on the eye-tracker data, was to adapt the interface to the user's piloting experience. The gaze analysis showed that inexperienced users spent time focusing on the controls (Figure 4) instead of the screen, which they mentioned as contributing to a sense of accuracy during navigation but which could have negative consequences, such as missing important visual information on the screen. In such a case, an adaptation could be to visualize the controller's movements on the screen to minimize gaze excursions beyond the screen area.
The improvements most frequently suggested by the pilots, following the retrospective analysis of gaze data, were as follows: "…when signal loss occurs, instead of displaying a black screen, the interface could promptly switch to a full-screen view of the map, indicating the last known strong signal point. Once this point is reached, the interface could then revert to the camera view…", or "…rather than automatically triggering the return-to-home option, the interface could offer the alternative of returning to the location of the last strong signal…". Another suggestion was "…instead of displaying a black screen, the interface could present the last captured frame from the camera, enabling the pilot to determine their precise location at the time of signal loss…". Finally, many pilots mentioned that since the pilot typically searches for the UAV by gazing in its direction when the signal is lost, "…the interface could detect when the screen is not providing any relevant information and promptly switch to the map view, accompanied by an auditory cue to redirect the pilot's attention back to the screen…".
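Taken together, these suggestions amount to a small view-selection policy. A minimal sketch, assuming the HMI exposes a signal-status flag, an on-screen gaze flag from the eye tracker, and a buffered last camera frame (all names and states here are illustrative):

```python
def select_screen_view(signal_ok, gaze_on_screen, have_last_frame):
    """Decide what the controller screen shows after the pilots' suggestions.

    Returns (view, play_audio_cue). `view` is one of "camera",
    "map_last_strong_signal", or "last_camera_frame".
    """
    if signal_ok:
        return "camera", False
    # Signal lost: never leave the pilot with a black screen.
    if not gaze_on_screen:
        # The pilot is searching the sky for the UAV: show the map with the
        # last strong-signal point and play a cue to redirect attention.
        return "map_last_strong_signal", True
    if have_last_frame:
        # The last captured frame helps the pilot localise the UAV.
        return "last_camera_frame", False
    return "map_last_strong_signal", False
```

A real implementation would also need debouncing (brief signal dropouts should not flip the view every frame) and would integrate the eye tracker's on-screen classification, but the decision structure follows the pilots' comments directly.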

Conclusion and Future Work
The utilization of eye-tracking technology in aviation, as demonstrated in the case studies discussed in this paper, aligns with previous research findings in related domains. The literature review reveals various applications of eye-tracking technology, including its integration in flight simulator games and the control of UAVs. Furthermore, eye-tracking has been employed to monitor pilot performance, detect abnormalities, and enhance human-machine interaction in aviation settings. In the realm of flight simulator games, eye-tracking technology has been employed to improve players' aiming accuracy, adjust visual perspectives, and enhance spatial awareness. Eye-tracking technology has also found relevance in the field of UAV piloting, where it has been utilized to improve drone control for individuals in need and as an auxiliary tool for enhancing the efficiency and accuracy of UAV operations while reducing cognitive load. Moreover, eye-tracking technology offers direct insights into pilots' information processing within the cockpit. By analysing gaze distribution, researchers can measure operators' fixation locations, fixation durations, and task load, providing valuable insights into visual behaviour and performance. These findings contribute to a deeper understanding of how pilots process information and perform specific tasks in the cockpit.
The eye-tracking data from the two case studies we presented support these observations. In the challenging UAV flight simulations, an adaptive HMI that detected pilots' gaze switching between the compass and the UAV video feed could significantly aid navigation. Additionally, during real UAV flights with signal loss incidents, the pilots faced disorientation and difficulties in accessing critical information. The analysis of gaze data showed that an adaptive HMI that responds to eye-tracker input could be helpful. Pilots suggested improvements such as displaying a full-screen map view indicating the last known strong signal point, presenting the last captured frame from the camera when the signal is lost, and automatically switching to the map view when relevant information is not available on the screen.
In conclusion, these case studies underscore the value of incorporating eye-tracking technology and adaptive HMIs in aviation contexts.The findings highlight the potential for adaptive interfaces to assist pilots in critical decision-making, improve situational awareness, and enhance overall flight safety.Future research and development efforts should focus on refining these technologies to meet the specific needs and requirements of pilots in diverse flight scenarios.

Figure 2. Heat map of an experienced pilot

Figure 3. Part of a typical flight path

Figure 4. Gaze plot from a UAV flight