SOURCE: NEW SCIENTIST
Pilots in India are testing aircraft display systems that work by tracking and responding to eye movements and could let military pilots keep their hands on the plane’s controls more often while flying.
Modern aircraft have electronic display systems that show information such as the plane’s fuel level, imaging system or geographical position. Pilots can switch the screen to the relevant page of information as needed, but this requires taking one hand off either the plane’s throttle or control stick.
Seeing Machines’ eye- and face-tracking sensor technology, focused on alertness and attention, produces data and metrics for understanding and improving human performance. The system is non-wearable and unobtrusive, and can be adapted and integrated into a range of real-world environments to provide new data for optimising training and operational monitoring. It offers a new perspective on current practice, with data centred on the human element: alertness and performance.
Insufficient knowledge of automation behaviour, mode confusion, loss of awareness, poor scanning techniques, and over-confidence in and over-reliance on automation are all recognised as prevalent among today’s onshore and offshore helicopter crews. Poor visual scanning signals an emerging split between the pilot and the automated system, which can lead to adverse safety outcomes. Inappropriate use of autopilot modes was cited in the fatal ditching of an Airbus AS332 L2 Super Puma near Sumburgh Airport, UK, in 2013. Addressing and rectifying these deficiencies is, however, difficult.
Seeing Machines has pioneered the development and commercialisation of proprietary algorithms and hardware that help machines interpret the human face and eyes in order to understand a person’s state.
The non-intrusive, fully automatic, camera-based technology does not require the user to wear any hardware or sensors. It detects and locates a human face, then tracks it in real time without pre-calibration or prior knowledge of the subject. The system immediately provides a range of accurate head- and eye-related data, measures and metrics, including gaze tracking, microsleep detection and highly accurate measurement of pupil diameter. These core signals can be integrated into jointly developed visualisations, or interpreted further to derive higher-level signals that support the carrier’s or operator’s use case.
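To illustrate what a signal such as microsleep detection might look like downstream, here is a minimal sketch in Python. It assumes only a stream of timestamped eye-openness samples and a common working definition of a microsleep as a sustained eye closure of at least half a second; the class and function names, the sample format and the thresholds are all illustrative assumptions, not the Seeing Machines API.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class GazeSample:
    t: float              # timestamp in seconds (hypothetical sample format)
    eye_openness: float   # 0.0 = fully closed, 1.0 = fully open

def detect_microsleeps(samples: List[GazeSample],
                       closed_below: float = 0.2,
                       min_duration: float = 0.5) -> List[Tuple[float, float]]:
    """Return (start, end) intervals where the eyes stayed effectively
    closed for at least min_duration seconds. Thresholds are illustrative."""
    events = []
    start = None
    for s in samples:
        if s.eye_openness < closed_below:
            if start is None:
                start = s.t           # closure begins
        else:
            if start is not None and s.t - start >= min_duration:
                events.append((start, s.t))
            start = None              # eyes reopened
    # handle a closure still in progress at the end of the recording
    if start is not None and samples and samples[-1].t - start >= min_duration:
        events.append((start, samples[-1].t))
    return events
```

Running this over a simulated 60 Hz stream with one 0.8-second closure flags a single microsleep event; a real pipeline would feed such events into the visualisations or higher-level signals described above.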
The Seeing Machines data output, which can be produced in real time or saved for post-run review and debrief, is relevant to several major aviation initiatives, including the following:
- IATA (International Air Transport Association) initiative of Evidence-Based Training (EBT), launched in 2013 and building on over 20 years of flight operations.
- A new evidence-based paradigm for competency-based training and assessment.
- EBT Aim – identify, develop and evaluate the core competencies required by pilots to operate safely, effectively, and efficiently in a commercial air transport environment by managing the most relevant threats and errors, based on evidence collected in operations and training.
- Pilot gaze tracking – a key enabler and contributor to enhanced EBT:
  - New training data – specific to where a trainee is looking.
  - Provides improved instructional awareness.
  - Supports early evidence-based identification, evaluation, development/correction and refinement of pilot competencies related to operation of the specific aircraft type.
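One way gaze data like this could feed EBT metrics is by accumulating dwell time per instrument, giving an instructor an objective view of a trainee’s scan. The sketch below is illustrative only: the area-of-interest names, screen coordinates and fixation format are invented for the example and are not part of any Seeing Machines interface.

```python
from typing import Dict, List, Tuple

# An AOI (area of interest) is a named screen rectangle: (x0, y0, x1, y1).
# A fixation is (x, y, duration_seconds). Both formats are assumptions.
def dwell_times(fixations: List[Tuple[float, float, float]],
                aois: Dict[str, Tuple[float, float, float, float]]) -> Dict[str, float]:
    """Sum fixation durations per AOI; AOIs are assumed non-overlapping,
    and fixations outside every AOI are ignored."""
    totals = {name: 0.0 for name in aois}
    for x, y, dur in fixations:
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                totals[name] += dur
                break  # first match wins, since AOIs do not overlap
    return totals
```

From such totals an instructor could see, for example, that a trainee spent far longer on the primary flight display than on the navigation display during a given manoeuvre, which is the kind of evidence-based competency signal the list above describes.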