The 7th International Workshop on Eye and Gaze in Computer Vision (GAZE 2026) at CVPR 2026 aims to encourage and highlight novel strategies for eye gaze estimation and prediction. The workshop topics include (but are not limited to):
- Foundation models and large-scale training for eye and gaze tasks.
- Gaze in egocentric vision, physical AI learning, and human–robot interaction.
- Understanding gaze in social interactions, human activities, and telepresence scenarios involving real or virtual agents and entities.
- Gaze estimation algorithms, including 3D gaze estimation, point-of-regard estimation, gaze following, gaze zone classification, etc.
- Detection and segmentation of the eye region, such as eye detection, pupil detection, eye-region landmark localization, etc.
- Human eye modeling and generation, including synthesis and animation from images or videos, etc.
- Eye gaze data collection, generation, and analysis, such as scanpath generation, etc.
- Applications of gaze tracking and analysis in real-world scenarios, including VR/AR, mobile devices, PCs, etc.
Call for Contributions
Full Workshop Papers
Submission: We invite authors to submit unpublished papers (8 pages, CVPR format) to our workshop, to be presented at a poster session upon acceptance. All submissions will undergo a double-blind review process. All contributions, along with any supplementary materials, must be submitted on OpenReview (link to be provided soon).
Accepted papers will be published in the official CVPR Workshops proceedings and the Computer Vision Foundation (CVF) Open Access archive.
Note: Authors of previously rejected main conference submissions are also welcome to submit their work to our workshop. When doing so, you must include the previous reviewers' comments (named previous_reviews.pdf) and a letter of changes (named letter_of_changes.pdf) in your supplementary materials, clearly demonstrating how the previous reviewers' comments have been addressed.
Organizers' affiliations: University of Birmingham, NVIDIA Research, Delft University of Technology, ETH Zürich, EPFL & Idiap Research Institute, and Microsoft.