Sunday Morning, 18th June 2023 (half-day)


Introduction

The 5th International Workshop on Gaze Estimation and Prediction in the Wild (GAZE 2023) at CVPR 2023 aims to encourage and highlight novel strategies for eye gaze estimation and prediction with a focus on robustness and accuracy in extended parameter spaces, both spatially and temporally. This is expected to be achieved by applying novel neural network architectures, incorporating anatomical insights and constraints, introducing new and challenging datasets, and exploiting multi-modal training. Specifically, the workshop topics include (but are not limited to):

  • Reformulating eye detection, gaze estimation, and gaze prediction pipelines with deep networks.
  • Incorporating geometric and anatomical constraints into the training of (sparse or dense) deep networks.
  • Leveraging additional cues such as context from the face region and head pose information.
  • Developing adversarial methods to deal with conditions where current methods fail (illumination, appearance, etc.).
  • Exploring attention mechanisms to predict the point of regard.
  • Designing new, accurate measures to account for rapid eye gaze movements.
  • Novel methods for temporal gaze estimation and prediction, including Bayesian methods.
  • Integrating differentiable components into 3D gaze estimation frameworks.
  • Robust estimation from different data modalities such as RGB, depth, head pose, and eye region landmarks.
  • Generic gaze estimation methods for handling extreme head poses and gaze directions.
  • Using temporal information in eye tracking to provide consistent on-screen gaze estimation.
  • Personalization of gaze estimators with few-shot learning.
  • Semi-/weakly-/un-/self-supervised learning methods, domain adaptation methods, and other novel approaches to improved representation learning from eye/face region images or gaze target region images.

We will host two invited speakers on the topic of gaze estimation. As in previous editions of the workshop, we will also accept submissions of full unpublished papers. These papers will be peer-reviewed via a double-blind process, published in the official workshop proceedings, and presented at the workshop itself. More information will be provided as soon as possible.


Call for Contributions


Full Workshop Papers

Submission: We invite authors to submit unpublished papers (8-page CVPR format) to our workshop, to be presented at a poster session upon acceptance. All submissions will go through a double-blind review process. All contributions must be submitted (along with supplementary materials, if any) at this CMT link.

Accepted papers will be published in the official CVPR Workshops proceedings and the Computer Vision Foundation (CVF) Open Access archive.

Note: Authors of previously rejected main conference submissions are also welcome to submit their work to our workshop. When doing so, you must include the previous reviewers' comments (named previous_reviews.pdf) and a letter of changes (named letter_of_changes.pdf) in your supplementary materials to clearly demonstrate how the previous reviewers' comments have been addressed.



Important Dates


Paper Submission Deadline    March 10, 2023 (12:00 Pacific Time)
Notification to Authors      March 31, 2023
Camera-Ready Deadline        April 8, 2023


Workshop Schedule


TBD


Invited Keynote Speakers


TBD

Awards

TBD


Program Committee

TBD


Organizers



Hyung Jin Chang
University of Birmingham
Xucong Zhang
Delft University of Technology
Shalini De Mello
NVIDIA Research
Seonwook Park
Lunit Inc.

Otmar Hilliges
ETH Zürich
Aleš Leonardis
University of Birmingham

Website Chair



Hengfei Wang
University of Birmingham


Please contact me if you have any questions about this website.
Email: hxw080@student.bham.ac.uk

Workshop sponsored by: