Gaze Biometrics
Robustness of Eye Movement Biometrics Against Varying Stimuli and Varying Trajectory Length
Recent results suggest that biometric identification based on human eye movement characteristics can be used for authentication. In this paper, we present three new methods and benchmark them against the state of the art. The best of our new methods improves state-of-the-art performance by 5.9 percentage points. Furthermore, we investigate factors that affect the robustness of the recognition rate of different classifiers on gaze trajectories, such as the type of stimulus and the tracked trajectory length. We find that the state-of-the-art method only works well when the same stimulus is used for testing as for training. By contrast, our novel method more than doubles the identification accuracy in these transfer cases. Finally, we find that with only 90 seconds of eye tracking data, an accuracy of 86.7% can be achieved.
Main Contributions
- We present two extensions of the method by George and Routray [10], which is, to our knowledge, the best classifier, at least for weakly task-independent scenarios. One extension uses more features, the other a different classifier. In total, we evaluate and compare four different methods in this paper.
- We analyze the effect of different tracking durations on classification performance, which is important for potential real-world applications of eye tracking biometrics.
- To the best of our knowledge, we are the first to compare the stimulus-agnostic performance of gaze biometrics methods.
- We make an exact re-implementation of the method by George and Routray, as well as all our methods, publicly available as a Python module ( https://Blank.for.anonymous.review ).
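To illustrate the general shape of such a feature-based pipeline (not the paper's actual method or module API), the following minimal sketch enrolls users from training trajectories and identifies a probe by nearest-centroid matching. The feature set here (mean and variance of inter-sample step lengths) is a deliberately simplified stand-in for the fixation- and saccade-based features used in the literature; all names are illustrative.

```python
# Hypothetical sketch of gaze-trajectory identification.
# extract_features/identify are illustrative names, not the paper's API.
import math

def extract_features(trajectory):
    """Summarize a gaze trajectory (list of (x, y) samples) with two
    simple statistics: mean step length and step-length variance."""
    steps = [math.dist(a, b) for a, b in zip(trajectory, trajectory[1:])]
    mean = sum(steps) / len(steps)
    var = sum((s - mean) ** 2 for s in steps) / len(steps)
    return (mean, var)

def identify(templates, trajectory):
    """Return the enrolled user whose feature template is closest
    (Euclidean distance) to the probe trajectory's features."""
    probe = extract_features(trajectory)
    return min(templates, key=lambda user: math.dist(templates[user], probe))

# Enrollment: one feature template per user from a training trajectory.
train = {
    "alice": [(0, 0), (1, 0), (2, 0), (3, 0)],   # small, even steps
    "bob":   [(0, 0), (5, 0), (5, 5), (12, 5)],  # large, uneven steps
}
templates = {user: extract_features(t) for user, t in train.items()}

# Probe: small, even diagonal steps match "alice"'s template.
print(identify(templates, [(0, 0), (1, 1), (2, 2), (3, 3)]))  # → alice
```

A real system would replace the toy features with fixation/saccade statistics and the nearest-centroid rule with a trained classifier, and would evaluate across stimuli and trajectory lengths as described above.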
Publications
- Robustness of Eye Movement Biometrics Against Varying Stimuli and Varying Trajectory Length, CHI 2020, Honolulu, Hawaiʻi, April 25 - 30, 2020.
- German HCI (Booklet-CHI2020.pdf)
Files
License
This original work is copyright of the University of Bremen.
Any software of this work is covered by the European Union Public Licence v1.2.
To view a copy of this license, visit eur-lex.europa.eu.
The thesis provided above (as a PDF file) is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International license.
Any other assets (3D models, movies, documents, etc.) are covered by the
Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
To view a copy of this license, visit creativecommons.org.
If you use any of the assets or software to produce a publication, you must give credit and cite this work in your publication.
If you would like to use our software in proprietary software, you can obtain an exception from the above license (i.e., dual licensing).
Please contact zach at cs.uni-bremen dot de.