An Accurate and Robust Gaze Estimation Method Based on Maximum Correntropy Criterion

Research output: Contribution to journal › Article › peer-review

3 Scopus citations

Abstract

Accurately estimating the user's gaze is important in many applications, such as human-computer interaction. Because of their convenience, appearance-based methods for gaze estimation have been a popular research subject for many years. However, the greatest challenges for appearance-based gaze estimation in a desktop environment are simplifying the calibration process and dealing with issues such as image noise and low resolution. To address these problems, we adopt a mapping between the high-dimensional eye-image feature space and the low-dimensional gaze positions and propose a robust and accurate method for gaze estimation with a webcam. First, we use Kullback-Leibler divergence to reduce the feature dimension while preserving the similarity between the feature space and the gaze space. Then, we construct the objective function using the maximum correntropy criterion instead of the mean squared error, which enhances robustness to noise, especially outliers and pixel corruption. A regularization term adaptively selects sparse training samples for gaze estimation. Extensive experiments in a desktop environment verified that the proposed method is robust and efficient in handling sparse training samples, pixel corruption, and low resolution.
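The abstract's key idea is replacing the mean-squared-error objective with the maximum correntropy criterion (MCC), which maximizes a Gaussian kernel of the residuals and therefore downweights outliers. The paper's full formulation (KL-based feature reduction, sparse-sample regularization) is not reproduced here; the sketch below only illustrates the MCC principle on a plain linear regression, solved by the standard half-quadratic (iteratively reweighted least squares) scheme. All function and variable names are illustrative, not from the paper.

```python
import numpy as np

def mcc_regression(X, y, sigma=1.0, n_iter=50, tol=1e-8):
    """Linear regression under the maximum correntropy criterion.

    Maximizes sum_i exp(-e_i^2 / (2 sigma^2)) over residuals
    e_i = y_i - x_i^T w via half-quadratic optimization: alternate
    Gaussian re-weighting and weighted least squares. Samples with
    large residuals (outliers) receive weights near zero.
    """
    n, d = X.shape
    w = np.linalg.lstsq(X, y, rcond=None)[0]  # ordinary LS init
    for _ in range(n_iter):
        e = y - X @ w
        a = np.exp(-e**2 / (2.0 * sigma**2))   # per-sample weights
        WX = X * a[:, None]                    # A @ X, A = diag(a)
        # Weighted normal equations: (X^T A X) w = X^T A y
        w_new = np.linalg.solve(X.T @ WX + 1e-12 * np.eye(d), WX.T @ y)
        if np.linalg.norm(w_new - w) < tol:
            return w_new
        w = w_new
    return w
```

On data corrupted with a few gross outliers, this estimator stays close to the clean least-squares solution, whereas ordinary least squares is pulled toward the corrupted points; the kernel bandwidth `sigma` controls how aggressively large residuals are suppressed.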

Original language: English
Article number: 8629993
Pages (from-to): 23291-23302
Number of pages: 12
Journal: IEEE Access
Volume: 7
DOIs
State: Published - 2019

Keywords

  • Appearance-based method
  • gaze estimation
  • human-computer interaction
  • maximum correntropy criterion

