Evaluating Flight Performance and Eye Movement Patterns Using Virtual Reality Flight Simulator

Published: May 19, 2023
doi:

Summary

A new virtual reality flight simulator was built that enables efficient, low-cost evaluation of flight performance and eye-movement patterns. It also provides a promising research tool for ergonomics and related fields.

Abstract

Efficient and economical performance evaluation of pilots has become critical to the aviation industry. With the development of virtual reality (VR) and its combination with eye-tracking technology, solutions that meet these needs are becoming a reality. Previous studies have explored VR-based flight simulators, focusing mainly on technology validation and flight training. The current study developed a new VR flight simulator to evaluate pilots' flight performance based on eye-movement and flight indicators in a 3D immersive scene. For the experiment, 46 participants were recruited: 23 professional pilots and 23 college students without flight experience. The results showed significant differences in flight performance between participants with and without flight experience, with the former outperforming the latter. Moreover, those with flight experience showed more structured and efficient eye-movement patterns. These differences in flight performance demonstrate the validity of the current VR flight simulator as a flight performance assessment method, and the distinct eye-movement patterns associated with flight experience provide a basis for future pilot selection. Compared with traditional flight simulators, however, this VR-based simulator still has shortcomings, such as the lack of motion feedback. Beyond its low cost, the platform is also highly flexible: it can meet the diverse needs of researchers (e.g., measuring situation awareness, VR sickness, and workload by adding the relevant scales).

Introduction

The European Aviation Safety Agency (2012) categorizes flight simulation training devices into basic instrument training devices, flight and navigation procedures trainers, flight training devices, and full flight simulators1. To date, a range of flight simulators is available for training, from low-level tabletop systems to highly complicated motion-based full flight simulators2. A traditional simulator includes a flight dynamics model, a system simulation, a hardware cockpit, an external visualization, and an optional motion simulation3.

These traditional flight simulators have clear advantages as effective flight training equipment. However, they are costly and environmentally unfriendly: driving each system requires substantial electrical energy. This is especially true of a full flight simulator, whose motion system requires high-temperature, high-pressure fluid or air, consumes considerable power, and generates a great deal of noise4.

In contrast, a simple desktop simulator system is flexible and low-cost, but offers lower immersion and fewer interactions than a full flight simulator2. Therefore, it is essential to develop new flight simulators that combine the advantages of desktop systems and full flight simulators (in other words, the flexibility of a tabletop simulation with immersion and interaction close to those of a full flight simulator).

With the development of computer technology, especially virtual reality (VR) technology, a new type of flight simulator based on emerging VR technology is becoming a reality. The VR-based flight simulator is flexible, portable, and low-cost, and has fewer space requirements than conventional flight simulators5. Researchers have created flight simulators based on VR technology over the past 20 years6,7,8,9,10,11; however, these VR flight simulators are mainly for flight training, and few target pilot selection. Still, with cost reductions and technology enhancements, VR-based simulators are becoming feasible for personnel selection. Some studies have used VR-based simulators for personnel selection in different domains: Schijven et al.12 selected surgical trainees using a virtual reality simulator; Huang et al.13 developed a psychological selection instrument based on virtual reality technology for air force pilot recruitment; and Wojciechowski and Wojtowicz14 assessed candidates' capabilities as unmanned aerial vehicle (UAV) pilots based on VR technology. Given that pilot selection is critical for the aviation industry, it is pressing to develop a new VR-based flight simulator focused on pilot selection, as large-scale pilot selection is sensitive to simulator cost and demands a portable simulator system.

Eye movements provide cues to a pilot's performance. Different studies have found that eye-scanning mode distinguishes the performance of expert and novice pilots. By comparing the scanning patterns of experts and novices, experts' efficient and structured eye-movement behavior can be distinguished from the inadequate scanning methods of beginners. Several aviation studies have found that pilots' eye-scanning strategy is highly related to their level of expertise15,16,17,18,19,20,21,22,23,24. According to Bellenkes et al.25, the duration of experts' fixations is shorter, and the frequency of their fixations on instruments is higher, than that of novices. Almost the same conclusion was drawn by Kasarskis et al.26, who discovered that expert pilots made more fixations of shorter duration than novices, suggesting that expert pilots have a better visual scanning mode. In another study, Lorenz et al.27 found that experts spend more time looking outside the cockpit than novices. These results have great practical value in the selection of newcomers.

Flight performance assessment is another critical factor for pilot selection. However, pilot flight performance evaluation currently suffers from conflicting expert opinions, a lack of selection norms, and the absence of a unified selection theory. In the driving field, Horrey et al.28 compared the absolute value of lane deviation from the centerline across experimental conditions to assess driving performance. Returning to the aviation domain, the flight quick access recorder (QAR) records all sorts of pilot manipulation parameters, aircraft parameters, environmental data, and warning information during flight29. More specifically, among the QAR indicators, the pitch angle is the rotation angle around the lateral (left-right) axis of the aircraft30, and the reference line (or center reference line) lies in the middle of the red and green lines28; these two flight parameters were used to evaluate the flight performance of participants with and without experience in the current study. QAR data can be used to evaluate flight performance, yet to the best of our knowledge, they have seldom been used for personnel training and selection in scientific research31,32.

Measurements of eye movement patterns can be used to assess and predict flight performance and guide pilot training and selection. Gerathewohl33 stated that the eye is the most important sensory organ of the pilot, processing 80% of the flight information. Pilots must acquire visual information from instruments in the cockpit and integrate it into a coherent image to manage the flight22. Further, optimal scanning behavior is essential to accomplish better flight performance15. However, no affordable flight simulator currently integrates an eye tracker to facilitate quantitative studies of the relationship between eye movements and flight performance.

The current study developed a new VR flight simulator to assess whether participants with flight experience had better flight performance than those without flight experience. The VR flight simulator integrates eye tracking and a flight dynamics system, allowing eye-movement pattern analysis and flight performance evaluation. In particular, it is worth mentioning that the VR flight simulator uses a VR eye tracker34, not a glasses-type or desktop eye tracker, to analyze area of interest (AOI)-based eye movements without time-consuming frame counting.

Finally, the present work can lead to an omnibus measurement for pilot selection in the future, from eye-scanning paths to objective flight performance data. With the help of the virtual flight simulator, the cost of pilot selection will be significantly reduced, and pilot norms can be formed based on extensive data gathering. The work fills the gap between conventional and desktop simulators for flight selection needs.

Protocol

All methods described here have been approved by the Institutional Review Board (IRB) of Tsinghua University, and informed consent was obtained from all participants. After completion, all participants were paid $12 (or a gift of equal value).

1. Participant selection

  1. Recruit participants according to an a priori power analysis using G*Power software35 (see Table of Materials) to ensure the number of participants meets the expected sample size given by G*Power, which equals 21.
    NOTE: The expected sample size given by G*Power refers to the estimated number of participants needed in a study to achieve a desired level of statistical power based on the specified effect size, significance level, and statistical test used. The expected sample size is an important consideration in research planning. It helps researchers determine the feasibility and cost-effectiveness of their study design and ensures that the study has sufficient statistical power to detect meaningful effects.
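The calculation in the note above can be sketched in a few lines. The snippet below uses the standard normal approximation for a two-sided, two-sample t-test plus the common z²/4 correction that brings it close to exact (G*Power) values; the effect size d = 0.9 is an illustrative assumption, not a value reported in this protocol.

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_group(d: float, alpha: float = 0.05, power: float = 0.80) -> int:
    """Approximate per-group n for a two-sided, two-sample t-test.

    Uses n = 2 * ((z_(1-alpha/2) + z_(1-beta)) / d)^2 + z_(1-alpha/2)^2 / 4,
    i.e., the normal approximation with a small-sample correction term.
    """
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided critical value
    z_power = NormalDist().inv_cdf(power)           # quantile for desired power
    n = 2 * ((z_alpha + z_power) / d) ** 2 + z_alpha ** 2 / 4
    return ceil(n)

# With an assumed large effect (d = 0.9), alpha = 0.05, and power = 0.80,
# the approximation gives 21 participants per group.
print(sample_size_per_group(0.9))
```

Larger assumed effects require fewer participants; for example, the conventional medium effect d = 0.5 yields 64 per group under the same settings.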
  2. Ensure that none of the participants have a history of epilepsy, heart or brain disease, recent endocrine or psychiatric medication use, or severe skin allergies.
  3. Use the Snellen eye chart (metric notation, 6/6)36 to confirm that participants' vision is normal or corrected-to-normal and that they have no visual impairments such as color blindness or color weakness.
  4. Ensure that none of the participants have consumed alcohol or drugs within the previous 24 h that may affect their ability to fly.
  5. Ensure the participants have had no less than 6 h of sleep and are in good mental condition before the experiment.

2. Flight simulator hardware

  1. Check that all of the flight simulator's hardware is complete. According to its function, this hardware is organized into three modules (Table 1) (see Table of Materials).
    NOTE: Researchers need to touch a metal rod before touching the equipment to avoid the risk posed by electrostatic induction.
    1. Check the components of the VR, HMD (head-mounted display), and eye-tracking module with the help of Table 2.
    2. Ensure that all the flight simulator PC module components meet the following minimum requirements: 3.6 GHz processor, 4 GB of internal memory, a 64-bit operating system, and a dedicated graphics card. Such a system will support all the flight controllers.
    3. Check the components of the flight control module and ensure the device's parameter configuration is consistent with Table 3.
  2. Install the hardware of the flight simulator according to the layout in Figure 1. Figure 2 shows how the hardware is connected.
    1. Join the throttle and control panel physically and treat them as a unit.
    2. Connect the throttle, joystick, and pedal to the flight simulator PC module via USB.
    3. Connect the HMD to the flight simulator PC module via the link box.
    4. Connect the base stations and VR controllers to the HMD via the VR software on the PC.
Module Components
VR head-mounted display (HMD) and eye-tracking module 1. Base station
2. VR HMD
Flight simulator PC module 3. Flight simulator PC
Flight control module 4. Flight throttle
5. Flight joystick
6. Flight pedal

Table 1: Components of the three modules of flight simulator hardware.

Main component Accessories
VR HMD Headset cable (attached)
Face cushion (attached)
Cleaning cloth
Earphone hole cap × 2 
Link box  Power adapter
DisplayPort cable
USB 3.0 cable
Mounting pad
Controllers (2018) × 2 Power adapters × 2 
Lanyards × 2 
Micro-USB cables × 2 
Base Station 2.0 × 2 Power adapters × 2 
Mounting Kit (2 mounts, 4 screws, and 4 wall anchors)

Table 2: List of components of the VR HMD and eye-tracking module.

Device Parameter configuration
Flight Joystick Nineteen action buttons
One 8-way "point of view" hat
Several 3D magnetic sensors
One 5-coil spring system
16-bit resolution (65,536 × 65,536 values)
Flight Control Panel Fifteen action buttons
One TRIM wheel
Five programmable LEDs
Flight Throttle Seventeen action buttons
One mouse hat with a push button
One 8-way "point of view" hat
Several 3D magnetic sensors
Two 14-bit resolution axes
Flight Pedal Tension between 2.5 kg and 5 kg
Angle between 35° and 75°

Table 3: The parameter configuration of the devices of the flight control module.

Figure 1
Figure 1: The layout of the VR flight simulator hardware. Please click here to view a larger version of this figure.

Figure 2
Figure 2: The connection of flight simulator hardware. (A) Flight control module. The throttle and control panel are physically joined and treated as a unit. If the term "throttle" is used in this study, it refers to both the throttle and the control panel. (B) Flight simulator PC module. A computer that meets the requirements outlined in step 2.2. (C) HMD and eye-tracking module. The software development kits (SDKs) for eye tracking and the 3D engine are kept in sync when installed on the same computer. Therefore, the eye-tracking functions and the operating system interact and work together. Please click here to view a larger version of this figure.

3. Flight simulator software

  1. Ensure all the software (see Table of Materials) has been installed before the experiment begins. Information about all software used in the experiment is shown in Table 4.
Name Description
VR software A widely used tool for experiencing VR content on the hardware.
VR app store The app store for virtual reality where customers can explore, create, connect, and experience the content they love and need.
Eye-tracking software Eye-tracking software developed by the research team via Eye-tracking and 3D engine SDKs.
FlySimulator The main program of the flight simulator software, developed by the research team.
Screen recording software Free and open-source software for video recording and live streaming.

Table 4: Information about all software used in the experiment.

4. Preparation before launching the flight simulator

NOTE: If running the eye-tracking program for the first time, perform the additional steps according to Figure 3. After the initial run, the eye-tracking program will activate automatically.

Figure 3
Figure 3: The additional steps when running the eye-tracking program for the first time. Please click here to view a larger version of this figure.

  1. Ensure that the HMD is turned on and connected to the computer.
  2. Place the base station diagonally to ensure the HMD can always be inside the base station's monitoring range. Keep the base station fixed to keep a stable VR environment.
  3. Set up a standing-only play area according to the prompts given by the VR software on the computer.
    NOTE: The prompt allows the user to set up either a room-scale area or a standing-only play area. Standing-only mode is generally used to experience VR scenes that do not require walking and can be selected when the user has limited space to move around. Therefore, in this experiment, set up a standing-only play area.
  4. Set the eye-tracking calibration. The eye-tracking system must be recalibrated every time the participant is switched.
    1. Confirm that the participants are not wearing contact lenses, as contact lenses can cause eye tracking to malfunction.
    2. Open the eye-tracking calibration program using the VR controllers (see Table of Materials).
    3. Adjust the device height, interpupillary distance (IPD), and gaze point as the system directs.
    4. Ask the participant to light up each spot by gazing at it for 2 s, moving in a clockwise direction, to verify the effectiveness of the eye-tracking calibration.
  5. Connect the tuned flight control module and large screen display (i.e., at least a 27 inch monitor) to the same computer as the VR HMD. The screen allows the experimenter to view what is happening in the VR HMD simultaneously.

5. Experimental procedure

NOTE: The experiment is divided into four steps: "collect information," "introduce the task and operation," "practice before the experiment," and "conduct a formal experiment." The experimental process is summarized in Figure 4.

Figure 4
Figure 4: The flowchart of the experiment. Please click here to view a larger version of this figure.

  1. Collect information from participants (15 min).
    1. Fill out the payment details form with the payment information of the participants.
    2. Ask the participants to read the informed consent document and sign it.
  2. Introduce the task and simulator operation to the participants (5 min).
    1. Explain to the participants the traffic pattern and map (Figure 5).
    2. Instruct participants on how to use the VR flight simulator.
    3. Inform the participants that they can leave the experiment if they experience discomfort while simulating a flight.
  3. Ask the participants to practice using the VR flight simulator (15 min).
    1. Fit the VR HMD to the participants and calibrate eye movements. Adjust the device height, interpupillary distance (IPD), and gaze point as instructed by the system.
    2. Open the screen recording software.
    3. Launch the FlySimulator program to assist participants with flight training.
    4. Instruct the participants to use the flight pedal, joystick, and throttle in concert to steer the aircraft as closely as possible to the reference line of flight (Figure 5).
    5. Reset the flight throttle and engine switch buttons after exiting the FlySimulator program, and stop recording the screen after the subject has completed the practice.
  4. Conduct formal flight experiments (20 min).
    1. Put on the VR HMD for the participants, and run the screen recording software to start recording the screen.
    2. Start the FlySimulator program, choose a suitable perspective, start the engine, release the parking brake, and push the flight throttle to full.
    3. Exit the FlySimulator program, then reset the flight throttle and engine switch buttons. Stop recording the screen.

Figure 5
Figure 5: The traffic pattern for the VR flight simulator. The pitch angle is the rotation angle around the left and right axes of the aircraft, and the reference line (or the center reference line) is right in the middle of the red and green lines. Please click here to view a larger version of this figure.

6. Data analysis

  1. Analyze eye movement data.
    1. Divide the scene into five area of interest (AOI) zones based on the actual panel of the flight cockpit, corresponding to the flight cockpit instruments (Figure 7); for more information about the calculation of the AOIs, see Figure 6.
    2. Test for AOI differences using the percentage of dwell time in each AOI zone37, given the equipment features of the VR eye tracker used in this study.
      NOTE: The average percentage of dwell time is the cumulative percentage spent looking within an AOI divided by the total fixation time and averaged across participants.
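The note above can be sketched as follows. The per-fixation record layout (an AOI label plus a fixation duration per participant) is a hypothetical format for illustration, not the simulator's actual log schema.

```python
from collections import defaultdict

def percent_dwell_time(fixations):
    """fixations: list of (aoi_label, duration_s) for one participant.
    Returns {aoi: percentage of total fixation time spent in that AOI}."""
    total = sum(d for _, d in fixations)
    per_aoi = defaultdict(float)
    for aoi, d in fixations:
        per_aoi[aoi] += d
    return {aoi: 100.0 * t / total for aoi, t in per_aoi.items()}

def mean_percent_dwell(participants, aoi):
    """Average one AOI's dwell-time percentage across participants."""
    values = [percent_dwell_time(fx).get(aoi, 0.0) for fx in participants]
    return sum(values) / len(values)

# Two hypothetical participants: (AOI, fixation duration in seconds)
p1 = [("airspeed", 2.0), ("outside", 6.0), ("airspeed", 2.0)]
p2 = [("airspeed", 1.0), ("outside", 9.0)]
print(mean_percent_dwell([p1, p2], "airspeed"))  # (40% + 10%) / 2 = 25.0
```

Normalizing by each participant's total fixation time before averaging, as here, keeps participants with longer recordings from dominating the group mean.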
  2. Analyze flight performance data with custom-developed scripts written in Python 3.10. The core algorithm for the performance metrics was adapted from the QAR analysis method, inspired by QAR data from 35 BAE-146 aircraft provided by the National Aeronautics and Space Administration (NASA)38.
    NOTE: Flight performance indicators for this study: the total flight time (the total length of time from takeoff to landing for each participant, in seconds), pitch angle 1 s before landing (pitch angle of the aircraft 1 s before landing, obtained from raw data, in degrees), mean distance to reference line (mean error of the spatial distance between the aircraft and the reference line during flight, in meters), and standard deviation of distance to reference line (standard deviation of the spatial distance between the plane and the reference line during flight, in meters).
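The four indicators in the note above can be computed from a time-stamped flight log as in the sketch below. The tuple layout (time, pitch angle, distance to the reference line) and the 1 Hz sampling assumption are illustrative, not the simulator's actual output format.

```python
from statistics import mean, stdev

def flight_performance(log):
    """log: chronologically ordered (t_s, pitch_deg, dist_to_ref_m) samples
    from takeoff to landing, assumed here to be sampled at 1 Hz."""
    times = [t for t, _, _ in log]
    dists = [d for _, _, d in log]
    return {
        "total_flight_time_s": times[-1] - times[0],
        # with 1 Hz sampling, the second-to-last sample is ~1 s before landing
        "pitch_1s_before_landing_deg": log[-2][1],
        "mean_dist_to_ref_m": mean(dists),
        "sd_dist_to_ref_m": stdev(dists),
    }

# A hypothetical 5-sample log: (t, pitch, distance to reference line)
log = [(0, 0.0, 10.0), (1, 5.0, 20.0), (2, 4.0, 30.0),
       (3, 3.0, 20.0), (4, 2.0, 10.0)]
print(flight_performance(log))
```

A real log would be sampled far faster than 1 Hz, in which case "1 s before landing" would be looked up by timestamp rather than by index.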
  3. Conduct the statistical analyses.
    1. Use the Shapiro-Wilk test39 to confirm the data's normality.
    2. Use statistical software (see Table of Materials) for descriptive statistics, Mann-Whitney U test, and Student's t-test.
    3. Draw violin plots to better show the shape of the data distribution40.
      NOTE: The explanation for the effect size measured using Cohen's d is as follows: 0.1 is very small, 0.2 is small, 0.5 is medium, 0.8 is large, 1.2 is very large, and 2.0 is huge41,42. The significance level was set at p < 0.05.
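As a minimal pure-Python sketch of the statistics named above, the snippet below computes a Mann-Whitney U statistic (rank-sum form, with average ranks for ties) and Cohen's d from the pooled standard deviation. A real analysis would use a statistics package (e.g., scipy) that also returns p-values; this sketch computes only the statistics themselves, and the expert/novice samples are made-up numbers.

```python
from statistics import mean, stdev

def mann_whitney_u(a, b):
    """U statistic for sample a vs. b (rank-sum form, average ranks for ties)."""
    combined = sorted((v, src) for src, xs in ((0, a), (1, b)) for v in xs)
    ranks, i = {}, 0
    while i < len(combined):
        j = i
        while j < len(combined) and combined[j][0] == combined[i][0]:
            j += 1                      # j is one past the tie group
        avg_rank = (i + 1 + j) / 2      # mean of ranks i+1 .. j
        for k in range(i, j):
            ranks[k] = avg_rank
        i = j
    r_a = sum(ranks[k] for k, (_, src) in enumerate(combined) if src == 0)
    return r_a - len(a) * (len(a) + 1) / 2

def cohens_d(a, b):
    """Cohen's d using the pooled sample standard deviation."""
    na, nb = len(a), len(b)
    pooled = (((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2)
              / (na + nb - 2)) ** 0.5
    return (mean(a) - mean(b)) / pooled

# Hypothetical dwell-time percentages for two small groups
experts = [3, 4, 5, 6]
novices = [1, 1, 2, 3]
print(mann_whitney_u(experts, novices), cohens_d(experts, novices))
```

U near n1 * n2 (here 16) indicates the first group's values mostly rank above the second's; the resulting d can then be read against the Cohen's d benchmarks in the note above.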

Figure 6
Figure 6: AOI preprocessing and calculating flow process. Sections 1 to 4 describe how the present study processed the pilots' eye movement data up to the independent-sample t-test. Please click here to view a larger version of this figure.

Figure 7
Figure 7: Schematic diagram of AOI division of flight instruments. The function of the instruments: (A) The airspeed indicator indicates the aircraft's speed relative to the air. (B) The attitude indicator shows the aircraft's pitch and roll attitude. (C) The vertical speed indicator indicates the aircraft's ascent or descent speed. (D) The altitude indicator indicates the aircraft's barometric altitude. (E) The engine speed indicator indicates the speed of the aircraft engine. Please click here to view a larger version of this figure.

Representative Results

For the current experiment, 23 experts with flight experience and 23 novices without flight experience were chosen. The participants were between 25 and 58 years of age (experts: M = 32.52 years, SD = 7.28 years; novices: M = 29.57 years, SD = 5.74 years). All participants were male. All the novices were recruited from Tsinghua University (students or faculty), and all the experts were from China Eastern Airlines.

Eye movement
The eye movement data of the instrument AOIs (airspeed indicator, vertical speed indicator, altitude indicator, and engine speed indicator) are not normally distributed according to the Shapiro-Wilk test (all p-values < 0.05). Thus, the Mann-Whitney U test was used. The results showed that experts (M = 60.79%, SD = 7.72) spent more time looking at the flight instruments than novices (M = 53.31%, SD = 12.89; U = 141, p < 0.01). More precisely, experts looked more at the airspeed indicator (M = 8.56%, SD = 5.45 for experts; M = 5.32%, SD = 4.98 for novices; U = 145.5, p < 0.01), the vertical speed indicator (M = 13.11%, SD = 8.84 for experts; M = 5.33%, SD = 4.11 for novices; U = 127, p < 0.01), and the altitude indicator (M = 2.83%, SD = 2.29 for experts; M = 1.34%, SD = 2.22 for novices; U = 159.5, p < 0.05) than novices. The difference in gaze time at the engine speed indicator was not significant between experts (M = 0.31%, SD = 0.56) and novices (M = 0.47%, SD = 1.16, p > 0.05). An independent-sample t-test was performed on the attitude indicator, and the result was marginally significant [t(44) = 1.843, p = 0.072, Cohen's d = 0.541]; novices (M = 31.03%, SD = 12.2) spent more time looking at the attitude indicator than experts (M = 25.15%, SD = 9.23).

As shown in Figure 8, compared with the novices, the experts' eye movement data were more evenly distributed around the mean value for the airspeed indicator (Figure 8A), the attitude indicator (Figure 8B), the vertical speed indicator (Figure 8C), and the altitude indicator (Figure 8D).

Figure 8
Figure 8: Means and data distribution of percent dwell time. (A) Airspeed indicator. (B) Attitude indicator. (C) Vertical speed indicator. (D) Altitude indicator. *p < 0.05, **p < 0.01, and ***p < 0.001. Please click here to view a larger version of this figure.

Flight performance
The data for the pitch angle 1 s before landing, the mean distance to the reference line, and the standard deviation of the distance to the reference line are not normally distributed according to the Shapiro-Wilk test (all p-values < 0.05). Thus, the Mann-Whitney U test was used. The pitch angle 1 s before landing for novices (M = -12.54°, SD = 29.03) was significantly smaller than that of experts (M = 3.97°, SD = 24.43; U = 130, p < 0.01), which indicates a more stable approach and better landing posture control for experts. The mean distance to the reference line for experts (M = 176.67 m, SD = 205.52) was significantly smaller than that of novices (M = 873.89 m, SD = 818.43; U = 439, p < 0.001), which represents a smaller flying deviation. The standard deviation of the distance to the reference line for experts (M = 211.52 m, SD = 225.76) was also significantly smaller than that of novices (M = 675.78 m, SD = 589.07; U = 420, p < 0.01), which represents a more stable and reliable flight performance. These results show that experts had better flying performance than novices. An independent-sample t-test was performed on the total flight time, and the result showed a marginally significant difference [t(44) = 1.835, p = 0.076, Cohen's d = 0.541]; experts (M = 759.06 s, SD = 163.58) had a shorter total flight time than novices (M = 902.32 s, SD = 336.73).

As shown in Figure 9, compared with the novices, the experts' flight performance data was mainly concentrated around the mean value in the total flight time (Figure 9A), the mean distance to the reference line (Figure 9B), the pitch angle 1 s before landing (Figure 9C), and the standard deviation of the distance to the reference line (Figure 9D).

Figure 9
Figure 9: Means and data distribution. (A) The total flight time. (B) Mean distance to reference line. (C) Pitch angle 1 s before landing. (D) The standard deviation of the distance to the reference line. *p < 0.05, **p < 0.01. Please click here to view a larger version of this figure.

Supplementary File 1: The script file of airports' configuration. Please click here to download this File.

Discussion

The current study assessed whether participants with flight experience had better flight performance than those without flight experience in a VR-based flight simulator. More importantly, it evaluated whether a more optimized eye-movement pattern could be found in the participants with better flight performance. The results show significant differences between participants with and without flight experience in three key QAR flight indicators: the pitch angle 1 s before landing, the mean distance to the reference line, and the standard deviation of the distance to the reference line (Figure 9B-D). These results show that participants with flight experience have smaller flying deviations and more stable flight performance. At the same time, the results also show significant differences between participants with and without flight experience in three key instrument AOIs: the airspeed indicator, the vertical speed indicator, and the altitude indicator (Figure 8A,C,D). Further, participants with flight experience spend more time looking at critical flight instruments than those without flight experience.

In summary, flight expertise has an important impact on flight performance. Experts' flight performance data show that their flight tracks are more concentrated around the center reference line, which means they have better flight performance; in contrast, novices' flight tracks deviate significantly from the center reference line. Furthermore, different levels of flight expertise correspond to different eye-scanning patterns: experts depend more on key flight instruments (i.e., the airspeed indicator, the vertical speed indicator, etc.) and spend more gaze time on these instruments. In other words, experts show a more structured and efficient eye-scanning pattern than novices. These results are consistent with previous research showing that expert pilots have a better visual mode26.

As mentioned earlier, the VR flight simulator can measure flight performance. Still, some critical steps need attention. First, during the demonstration flight session, the mission should be introduced along with the flight map. Second, the propeller aircraft is biased to the right in flight, so the left flight pedal should be pressed when starting, and the flight joystick should be kept to the left during the flight to offset the right bias and keep the aircraft flying smoothly. Third, the flight throttle should be pushed to the maximum in the takeoff phase and then held at two-thirds throttle in the level flight phase. Fourth, participants should be reminded to control the aircraft by coordinating the flight pedal, joystick, and throttle to make it fly as close to the center reference line as possible. Fifth, whenever one exits the FlySimulator program, the experimenter must reset the flight launch switch and throttle.

Given researchers' diverse experimental needs, the protocol can be modified accordingly. First, the flight map can easily be changed to a practical scenario by configuring the script file (Supplementary File 1). Second, in addition to eye-movement data, electroencephalogram (EEG) data can be collected by integrating an EEG with dry electrodes into the HMD, and heart rate data can be collected with a heart rate belt. Third, subjective measures of the participants' psychological state can be applied before and after the experiment by adding scales and questionnaires, such as the proactive personality scale43, the measurement of situation awareness (SA)44,45, the VR sickness questionnaire (VRSQ)46, and the NASA task load index (TLX)47. In addition, when the flight simulation program starts computing the dynamics model data in real time, the aircraft body attitude is sometimes not in the correct position, and the experimenter needs to press the grey button in the upper right corner of the flight joystick to reset it. A situation can also occur in which the aircraft loses control; the experimenter then needs to exit for a reset by pressing Alt + F4 on the keyboard.

The method also has its limitations. A prominent issue is that participants may experience motion sickness when coping with a complex flight task in an immersive VR scenario and thus may withdraw from the experiment. In addition, compared with a traditional flight simulator with a motion base4, the VR flight simulator in the current study is not equipped with a motion device and thus cannot produce the actual sensation of movement. Finally, the flight simulator records various flight parameters that can be used to evaluate flight performance29; only three key indicators were selected in the current research.

In any case, the VR flight simulator has unique advantages over existing methods. Above all, considering that conventional flight simulation relies on bulky and expensive instruments to create a virtual world9, the current VR-based flight simulator has a lower cost. In addition, VR technology keeps a person in a fully immersive virtual environment1, so the VR flight simulator provides more realistic immersion than other methods. More importantly, this method builds a new VR-based platform for future pilot selection that integrates eye-tracking and flight dynamics technology and can thus provide more comprehensive screening metrics than current methods.

The VR flight simulator shows excellent potential as a pilot selection platform. It is well known that current pilot selection is expensive and overly dependent on expert opinion. The current VR flight simulator can be used in pilot selection, significantly reducing costs and providing more objective data for selection. In addition, the VR simulator can be used as a human factor engineering tool that enables ergonomic assessment of the flight cockpit in the early stages of the design process2. Finally, researchers can use this virtual simulator platform to study hand-eye coordination, fatigue, distraction, mind wandering37, etc., by designing different experiments.

Disclosures

The authors have nothing to disclose.

Acknowledgements

The authors are incredibly grateful to Mr. Li Yan for his help in recruiting pilot participants and acknowledge Ms. Bu Lingyun for her work on drawing pictures. The research was supported by the National Natural Science Foundation of China (grant numbers T2192931 and 72071185), the National Brain Project (grant number STI2030-Major Projects 2022ZD0208500), the National Key Laboratory Project of Human Factors Engineering (grant number SYFD062003), the National Key Laboratory Project of Human Factors Engineering (grant number 6142222210201), the 2022 Major Projects of Military Logistic Research Grant, and the Key Project of Air Force Equipment Comprehensive Research (grant number KJ2022A000415).

Materials

Name | Company | Catalog number / version | Comments
3D engine SDK | Epic Games | Unreal Engine 4; GameAnalytics Unreal SDK | This SDK is a powerful yet flexible free analytics tool designed for games.
CPU | Intel | Intel Core i9 | One of the most powerful CPUs on the mainstream market.
Eye tracking SDK | Tobii | Tobii XR SDK | This SDK provides device-agnostic access to eye-tracking data, allowing development for headsets from many different hardware vendors; it is not limited to devices using Tobii Eye Tracking hardware.
Eye tracking software | Developed by the research team | | A program that tracks the movement of a person's eyes while they are using a virtual reality HMD.
FlySimulator program | Developed by the research team | | Software that simulates flying experiences in a virtual environment, using a VR HMD and hand-held controllers.
Graphics card | NVIDIA | GeForce RTX 3090 (10496 NVIDIA CUDA cores, 1.70 GHz boost clock, 24 GB GDDR6X memory) | One of the most powerful graphics cards on the mainstream market.
Operating system (OS) | Microsoft | Windows XP | An operating system (OS) developed and exclusively distributed by Microsoft Corporation.
Replica control panel | THRUSTMASTER | 2960720, 2971004, 2962072, 2960748, 2960769 | U.S. Air Force A-10C attack aircraft HOTAS.
Replica joystick | THRUSTMASTER | 2960720 | U.S. Air Force A-10C attack aircraft HOTAS.
Replica pedal | THRUSTMASTER | TPR pendular rudder |
Replica throttle | THRUSTMASTER | | U.S. Air Force A-10C attack aircraft HOTAS.
Screen connected to PC | Redmi | RMMNT27NF, 27-inch, 1920 x 1080 resolution | Allows the experimenter to simultaneously view what is happening in the VR HMD.
Screen recording software | OBS Project | OBS Studio version 28.0 | Free and open-source software for video recording and live streaming.
Statistical power analysis software | Open-source | G*Power version 3.1.9.6 | A free and user-friendly tool for estimating statistical power and sample size.
Statistical software | IBM | SPSS version 24.0 | A powerful statistical software platform.
Versatile statistics tool | GraphPad Software | GraphPad Prism version 9.4.0 | A versatile statistics tool purpose-built for scientists, not statisticians.
VR app store | HTC Corporation | VIVE Software 2.0.17.6 / 2.1.17.6 | An app store for virtual reality where customers can explore, create, connect, and experience the content they love and need.
VR head-mounted display (HMD) | HTC Corporation | VIVE Pro Eye | A VR headset with precision eye tracking.
VR software | Steam | SteamVR version 1.23 | A tool for experiencing VR content on the hardware.

References

  1. Oberhauser, M., Dreyer, D., Braunstingl, R., Koglbauer, I. What’s real about virtual reality flight simulation. Aviation Psychology and Applied Human Factors. 8 (1), 22-34 (2018).
  2. Oberhauser, M., Dreyer, D. A virtual reality flight simulator for human factors engineering. Cognition, Technology & Work. 19 (2-3), 263-277 (2017).
  3. Rolfe, J. M., Staples, K. J. Flight Simulation. (1986).
  4. Robinson, A., Mania, K., Perey, P. Flight simulation: Research challenges and user assessments of fidelity. Proceedings of the 2004 ACM SIGGRAPH International Conference on Virtual Reality Continuum and its Applications in Industry. 261-268 (2004).
  5. Moroney, W. F., Moroney, B. W. Flight Simulation. Handbook of Aviation Human Factors. 261-268 (1999).
  6. McCarty, W. D., Sheasby, S., Amburn, P., Stytz, M. R., Switzer, C. A virtual cockpit for a distributed interactive simulation. IEEE Computer Graphics and Applications. 14 (1), 49-54 (1994).
  7. Dorr, K. U., Schiefel, J., Kubbat, I. Virtual cockpit simulation for pilot training. What is Essential for Virtual Reality Systems to Meet Military Human Performance Goals? RTO Human Factors and Medicine Panel (HFM) Workshop. The Hague, The Netherlands (2001).
  8. Bauer, M., Klingauf, U. Virtual-reality as a future training medium for civilian flight procedure training. AIAA Modeling and Simulation Technologies Conference and Exhibit. 18-21 (2008).
  9. Yavrucuk, I., Kubali, E., Tarimci, O. A low cost flight simulator using virtual reality tools. IEEE Aerospace and Electronic Systems Magazine. 26 (4), 10-14 (2011).
  10. Aslandere, T., Dreyer, D., Pankratz, F., Schubotz, R. A generic virtual reality flight simulator. Virtuelle und Erweiterte Realität, 11. Workshop der GI-Fachgruppe VR/AR. 1-13 (2014).
  11. Joyce, R. D., Robinson, S. K. The rapidly reconfigurable research cockpit. AIAA Modeling and Simulation Technologies Conference. 22-26 (2015).
  12. Schijven, M. P., Jakimowicz, J. J., Carter, F. J. How to select aspirant laparoscopic surgical trainees: Establishing concurrent validity comparing Xitact LS500 index performance scores with standardized psychomotor aptitude test battery scores. The Journal of Surgical Research. 121 (1), 112-119 (2004).
  13. Huang, P., Zhu, X., Liu, X., Xiao, W., Wu, S. Psychology selecting device for air force pilot recruitment based on virtual reality technology, has industrial personal computer connected with memory, where industrial control computer is connected with image display device. (2020).
  14. Wojciechowski, P., Wojtowicz, K. Simulator sickness and cybersickness as significant indicators in a primary selection of candidates for FPV drone piloting. 2022 IEEE 9th International Workshop on Metrology for AeroSpace (MetroAeroSpace). (2022).
  15. Ziv, G. Gaze behavior and visual attention: A review of eye tracking studies in aviation. The International Journal of Aviation Psychology. 26 (3-4), 75-104 (2016).
  16. Lai, M. L., et al. A review of using eye-tracking technology in exploring learning from 2000 to 2012. Educational Research Review. 10, 90-115 (2013).
  17. Robinski, M., Stein, M. Tracking visual scanning techniques in training simulation for helicopter landing. Journal of Eye Movement Research. 6 (2), 1-17 (2013).
  18. Yang, J. H., Kennedy, Q., Sullivan, J., Fricker, R. D. Pilot performance: Assessing how scan patterns & navigational assessments vary by flight expertise. Aviation, Space, and Environmental Medicine. 84 (2), 116-124 (2013).
  19. Yu, C. S., Wang, E. M. Y., Li, W. C., Braithwaite, G., Greaves, M. Pilots’ visual scan patterns and attention distribution during the pursuit of a dynamic target. Aerospace Medicine and Human Performance. 87 (1), 40-47 (2016).
  20. Haslbeck, A., Zhang, B. I spy with my little eye: Analysis of airline pilots’ gaze patterns in a manual instrument flight scenario. Applied Ergonomics. 63, 62-71 (2017).
  21. Brams, S., et al. Does effective gaze behavior lead to enhanced performance in a complex error-detection cockpit task. PLoS One. 13 (11), e0207439 (2018).
  22. Peißl, S., Wickens, C. D., Baruah, R. Eye-tracking measures in aviation: A selective literature review. The International Journal of Aerospace Psychology. 28 (3-4), 98-112 (2018).
  23. Jin, H., et al. Study on how expert and novice pilots can distribute their visual attention to improve flight performance. IEEE Access. 9, 44757-44769 (2021).
  24. Lounis, C., Peysakhovich, V., Causse, M. Visual scanning strategies in the cockpit are modulated by pilots’ expertise: A flight simulator study. PLoS One. 16 (2), e0247061 (2021).
  25. Bellenkes, A. H., Wickens, C. D., Kramer, A. F. Visual scanning and pilot expertise: The role of attentional flexibility and mental model development. Aviation, Space, and Environmental Medicine. 68 (7), 569-579 (1997).
  26. Kasarskis, P., Stehwien, J., Hickox, J., Aretz, A., Wickens, C. Comparison of expert and novice scan behaviors during VFR flight. Proceedings of the 11th International Symposium on Aviation Psychology. (2001).
  27. Lorenz, B., et al. Performance, situation awareness, and visual scanning of pilots receiving onboard taxi navigation support during simulated airport surface operation. Human Factors and Aerospace Safety. 6 (2), 135-154 (2006).
  28. Horrey, W. J., Alexander, A. L., Wickens, C. D. Does workload modulate the effects of in-vehicle display location on concurrent driving and side task performance. Driving Simulator Conference North America Proceedings. (2013).
  29. Wang, L., Ren, Y., Sun, H., Dong, C. A landing operation performance evaluation method and device based on flight data. Engineering Psychology and Cognitive Ergonomics: Cognition and Design. 297-305 (2017).
  30. Wang, L., Ren, Y., Wu, C. Effects of flare operation on landing safety: A study based on ANOVA of real flight data. Safety Science. 102, 14-25 (2018).
  31. Huang, R., Sun, H., Wu, C., Wang, C., Lu, B. Estimating eddy dissipation rate with QAR flight big data. Applied Sciences. 9 (23), 5192 (2019).
  32. Wang, L., Zhang, J., Dong, C., Sun, H., Ren, Y. A method of applying flight data to evaluate landing operation performance. Ergonomics. 62 (2), 171-180 (2019).
  33. Gerathewohl, S. J. Leitfaden der Militärischen Flugpsychologie. Verlag für Wehrwissenschaften (1987).
  34. Ugwitz, P., Kvarda, O., Juříková, Z., Šašinka, Č., Tamm, S. Eye-tracking in interactive virtual environments: Implementation and evaluation. Applied Sciences. 12 (3), 1027 (2022).
  35. Faul, F., Erdfelder, E., Lang, A.-G., Buchner, A. G*Power 3: A flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behavior Research Methods. 39 (2), 175-191 (2007).
  36. Boslaugh, S. E. Snellen Chart. (2018).
  37. He, J., Becic, E., Lee, Y.-C., McCarley, J. S. Mind wandering behind the wheel. Human Factors: The Journal of the Human Factors and Ergonomics Society. 53 (1), 13-21 (2011).
  38. Alam, T. GitHub – tanvcodes/qar_analytics: Scripts for working with publicly available Quick Access Recorder (QAR) data from a fleet of 35 BAE-146 aircraft. GitHub (2022).
  39. Shapiro, S. S., Wilk, M. B. An analysis of variance test for normality (complete samples). Biometrika. 52 (3-4), 591-611 (1965).
  40. Hintze, J. L., Nelson, R. D. Violin plots: A box plot-density trace synergism. The American Statistician. 52 (2), 181-184 (1998).
  41. Cohen, J. Statistical Power Analysis for the Behavioral Sciences (2nd ed.). (1988).
  42. Sawilowsky, S. S. New effect size rules of thumb. Journal of Modern Applied Statistical Methods. 8 (2), 26 (2009).
  43. Bateman, T. S., Crant, J. M. The proactive component of organizational behavior: A measure and correlates. Journal of Organizational Behavior. 14 (2), 103-118 (1993).
  44. Endsley, M. R. Measurement of situation awareness in dynamic systems. Human Factors. 37 (1), 65-84 (1995).
  45. Hunter, D. R. Measuring general aviation pilot judgment using a situational judgment technique. The International Journal of Aviation Psychology. 13 (4), 373-386 (2003).
  46. Kim, H. K., Park, J., Choi, Y., Choe, M. Virtual reality sickness questionnaire (VRSQ): Motion sickness measurement index in a virtual reality environment. Applied Ergonomics. 69, 66-73 (2018).
  47. Hart, S. G. NASA Task Load Index (TLX). (1986).

Cite This Article
Ke, L., Zhang, Z., Ma, Y., Xiao, Y., Wu, S., Wang, X., Liu, X., He, J. Evaluating Flight Performance and Eye Movement Patterns Using Virtual Reality Flight Simulator. J. Vis. Exp. (195), e65170, doi:10.3791/65170 (2023).
