A Method for Assessing the Pupil Center Coordinates in Eyetracking with a Free Head Position

Gennadiy I. Gromilin
Institute of Automation and Electrometry SB RAS, Novosibirsk, Russia

Nikolay S. Yakovenko
Institute of Automation and Electrometry SB RAS, Novosibirsk, Russia


Paper #3431 received 25 May 2021; revised manuscript received 6 Sep 2021; accepted for publication 6 Sep 2021; published online 30 Sep 2021

DOI: 10.18287/JBPE21.07.030302

Abstract

IR-illuminated eyetracking systems detect the corneal reflection and the pupil center coordinates to calculate the operator's gaze fixation point. When the gaze turns through a large angle, some of the frames are blurred and the extracted coordinates become unreliable. The article describes a method for determining the pupil center in a gaze fixation system operating at an increased camera frame rate. A comparison with known algorithms is given. The average execution time of the algorithm is about 1.2 ms on a typical office computer when processing image fragments of about 340×240 pixels.

Keywords

Eyetracking system; cornea reflection; pupil center; aiming point; view direction






