Motion Estimation of Handheld Optical Coherence Tomography System using Real-Time Eye Tracking System

Abira Bright B.
SRM Institute of Science and Technology, Kattankulathur, Tamil Nadu, India

Lakshmi Parvathi M.
SRM Institute of Science and Technology, Kattankulathur, Tamil Nadu, India

Vani Damodaran
SRM Institute of Science and Technology, Kattankulathur, Tamil Nadu, India

Paper #8306 received 28 Feb 2023; revised manuscript received 11 Aug 2023; accepted for publication 16 Aug 2023; published online 15 Sep 2023.


Optical coherence tomography (OCT) is the clinical gold standard for cross-sectional imaging of the eye. Most clinical ophthalmic OCT systems are table-top devices that require the patient to be aligned on a chinrest to capture a motion-free image. Portable OCT devices are used to perform retinal imaging on infants or bedridden patients. Eye movements and relative motion between the patient and the imaging probe make interpretation and registration challenging and are a barrier to high-resolution ocular imaging. An OCT scanner with an automated real-time eye-tracking system and a motion-mapping correction mechanism is therefore required to overcome such motion. The aim of this work is to develop an algorithm that tracks pupil motion and enables motion-corrected retinal imaging without the need for a chinrest, fixation target, or seating chair, while reducing both the operator skill required and residual motion artifacts. Two algorithms, one landmark-based and one threshold-based, were developed to identify and track eye movements. The output of each algorithm was compared with the manually determined actual pupil center. The average deviation from the actual location was found to be 0.2–0.6 for the landmark-based and 0.4–0.9 for the threshold-based algorithm. This study shows that iris localization and gaze-direction estimation are more accurate in the landmark-based system than in the threshold-based eye-tracking system.
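The threshold-based approach described in the abstract can be sketched in a few lines: the pupil is the darkest region of a grayscale eye image, so its center can be estimated as the centroid of pixels below an intensity threshold. The function name, threshold value, and synthetic test image below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def pupil_center_threshold(gray, thresh=40):
    """Estimate the pupil center as the centroid of pixels darker than
    `thresh` in a grayscale eye image (a minimal sketch of a
    threshold-based tracker; `thresh` would be tuned per camera).
    Returns (row, col) or None if no pixel falls below the threshold."""
    mask = gray < thresh
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

# Synthetic example: bright background with a dark "pupil" disk at (60, 40).
img = np.full((100, 100), 200, dtype=np.uint8)
rr, cc = np.ogrid[:100, :100]
img[(rr - 60) ** 2 + (cc - 40) ** 2 <= 10 ** 2] = 10

print(pupil_center_threshold(img))  # centroid near (60.0, 40.0)
```

A landmark-based tracker would instead fit a trained shape model (e.g., the dlib facial-landmark predictor cited in the paper) and take the eye-corner and iris landmarks directly, which is why it is less sensitive to illumination than a fixed threshold.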


optical coherence tomography (OCT); depth camera; eye tracking; gaze tracking





© 2014-2023 Samara National Research University. All Rights Reserved.