Gaze-Point Projection by Marker-Based Localization of Eye-Tracking Glasses for Examining Trust in Automated Vehicles
Conference: AmE 2022 – Automotive meets Electronics – 13th GMM-Symposium
09/29/2022 - 09/30/2022 at Dortmund, Germany
Proceedings: GMM-Fb. 104: AmE 2022
Pages: 2
Language: English
Type: PDF
Authors:
Albers, Franz; Bertram, Torsten (TU Dortmund University, Institute of Control Theory and Systems Engineering, Dortmund, Germany)
Abstract:
Automated driving promises improved traffic safety and comfort for users. While fully driverless vehicles according to SAE Level 5 are envisioned for the future, the necessary technologies and legislation are still under research and development. Recently, the first conditionally automated vehicles according to SAE Level 3 were legally permitted and are scheduled for release in Germany. Conditionally automated vehicles allow the user, for the first time, to engage in non-driving-related tasks (NDRTs) while driving. However, whenever a system limit is reached, the user is expected to take over the driving task within a reasonable time. The sudden change of role from distracted passenger to attentive driver is a challenging task for many users. Thus, driver monitoring systems are gaining importance and are becoming obligatory for new vehicles in the EU in 2022. Being driven by an automated system can feel unfamiliar to many people, which makes trust in and acceptance of automated vehicles major research topics. Eye-tracking-based approaches are among the most promising methods, since they provide good results for forward-looking drivers and non-intrusive external eye-tracking systems exist. However, external systems tend to fail whenever the driver turns their head (e.g., during a shoulder check), whereas eye-tracking glasses usually provide fewer missing samples and better accuracy for pupil-related measurements such as the pupil diameter, which can be used, for example, to estimate the cognitive workload of a user with the Low/High Index of Pupillary Activity (LHIPA). On the downside, portable eye-tracking systems have no fixed coordinate frame, which makes image processing a requirement for utilizing the gaze-tracking data in online applications.
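The marker-based localization named in the title can be illustrated with a short sketch. The following Python example is not the authors' implementation but a minimal illustration of the general technique: ArUco markers placed in the vehicle interior are detected in the scene-camera image of the eye-tracking glasses, the camera pose in the vehicle frame is recovered via PnP, and the back-projected gaze ray is intersected with a known interior plane to obtain the gaze point in the vehicle's fixed coordinate frame. OpenCV ≥ 4.7 is assumed, and the camera intrinsics, marker layout, and dashboard plane are purely illustrative values.

```python
import numpy as np
import cv2

# --- Assumed calibration data (illustrative, not from the paper) -------------
K = np.array([[900.0,   0.0, 640.0],      # scene-camera intrinsics
              [  0.0, 900.0, 360.0],
              [  0.0,   0.0,   1.0]])
dist = np.zeros(5)                         # lens distortion coefficients

# 3D corner positions of each marker in the vehicle frame [m], measured once.
# Here: a single marker with id 0 on the dashboard plane (illustrative).
MARKER_CORNERS_VEHICLE = {
    0: np.array([[0.40, 0.10, 0.80],
                 [0.50, 0.10, 0.80],
                 [0.50, 0.10, 0.70],
                 [0.40, 0.10, 0.70]], dtype=np.float64),
}
# Dashboard plane in the vehicle frame: a point on the plane and its normal.
PLANE_POINT = np.array([0.0, 0.10, 0.0])
PLANE_NORMAL = np.array([0.0, 1.0, 0.0])

aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(aruco_dict, cv2.aruco.DetectorParameters())


def project_gaze_to_vehicle(scene_image, gaze_px):
    """Map a 2D gaze point [px] in the scene camera to 3D vehicle coordinates."""
    corners, ids, _ = detector.detectMarkers(scene_image)
    if ids is None:
        return None  # no markers visible -> camera pose (and gaze mapping) unavailable

    # Collect 2D-3D correspondences for all markers with known vehicle-frame positions.
    obj_pts, img_pts = [], []
    for marker_corners, marker_id in zip(corners, ids.flatten()):
        if int(marker_id) in MARKER_CORNERS_VEHICLE:
            obj_pts.append(MARKER_CORNERS_VEHICLE[int(marker_id)])
            img_pts.append(marker_corners.reshape(4, 2))
    if not obj_pts:
        return None
    obj_pts = np.concatenate(obj_pts).astype(np.float64)
    img_pts = np.concatenate(img_pts).astype(np.float64)

    # Camera pose: transformation from vehicle frame to camera frame.
    ok, rvec, tvec = cv2.solvePnP(obj_pts, img_pts, K, dist)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)
    cam_origin_vehicle = (-R.T @ tvec).flatten()   # camera center in vehicle frame

    # Back-project the gaze pixel into a viewing ray in the camera frame ...
    gaze_norm = cv2.undistortPoints(np.array([[gaze_px]], dtype=np.float64), K, dist)
    ray_cam = np.array([gaze_norm[0, 0, 0], gaze_norm[0, 0, 1], 1.0])
    # ... and rotate the ray into the vehicle frame.
    ray_vehicle = R.T @ ray_cam

    # Intersect the gaze ray with the dashboard plane.
    denom = PLANE_NORMAL @ ray_vehicle
    if abs(denom) < 1e-9:
        return None  # ray parallel to plane
    t = PLANE_NORMAL @ (PLANE_POINT - cam_origin_vehicle) / denom
    return cam_origin_vehicle + t * ray_vehicle
```

Under these assumptions, a call such as project_gaze_to_vehicle(frame, (812.0, 405.0)) would return the 3D gaze point in vehicle coordinates, or None whenever no known marker is visible, which is precisely the situation in which a head-worn system loses its reference to the fixed vehicle frame.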