About Me

Bio

I am a research scientist at FX Palo Alto Laboratory (FXPAL). My current research focuses on human behavior sensing and analysis. Prior to joining FXPAL, I was a postdoctoral researcher in the Pattern Recognition & Bioinformatics Group at TU Delft. Before that, I was a research fellow in the Computer Vision and Active Perception Lab (CVAP) at the Royal Institute of Technology (KTH) in Sweden, where I investigated natural human-robot interaction and social robotics.

I received my PhD in Computer Science from Lancaster University in 2015, where I was also a Marie Curie research fellow. At Lancaster, I investigated novel video-based gaze estimation techniques, gaze-based interaction with large displays, and the use of eye tracking for dementia detection and health monitoring. I obtained my Master’s degree in Artificial Intelligence from the University of Amsterdam, where I was awarded the Huygens Scholarship.

Research Interests

Human Behavior Analysis
Ubiquitous Computing
Human-Computer Interaction
Computer Vision
Machine Learning and Pattern Recognition

Past Projects

Resourceful Ageing: exploring data-driven and machine learning approaches to design novel IoT services for ageing populations, TU Delft, 2016 – 2018
MODEM (Monitoring Of Dementia using Eye Movements), Lancaster University, 2015
Combining Gaze and Hand Gestures for Remote Interaction, Microsoft Research Cambridge, June 2014 – August 2014
iCareNet (Intelligent Context-Aware Systems for Healthcare, Wellness, and Assisted Living), Lancaster University, 2011 – 2014
EAR (Eye-based Activity Recognition), Lancaster University, 2011
Multi-people Tracking using Graph Representation with Ceiling Mounted Video Cameras, Philips Research Eindhoven, January 2010 – October 2010

Research Services

I served as the Paper & Proceedings Chair for the 5th ACM Symposium on Spatial User Interaction (SUI 2017). Previously, I co-organised the 5th International Workshop on Pervasive Eye Tracking and Mobile Eye-Based Interaction in 2015, sponsored by three industry partners, together with colleagues from ETH Zurich and the Max Planck Institute for Informatics. I also served on the program committee of the 3rd and 6th editions of that workshop, and on the program committee of the IEEE International Conference on Automatic Face and Gesture Recognition (FG) from 2017 to 2019. I am frequently invited to review submissions for international conferences (e.g., CHI, UIST, UbiComp, HRI, MobileHCI), journals (e.g., Sensors, PMC, IJHCS, IMWUT, TMC, PLOS ONE), and magazines.

Contact

yzhang [at] fxpal [dot] com

Publications

Journal Papers

TeamSense: Assessing Personal Affect and Group Cohesion in Small Teams Through Dyadic Interaction and Behavior Analysis with Wearable Sensors pdf
Yanxia Zhang, Jeffrey Olenick, Chu-Hsiang Chang, Steve W. J. Kozlowski, and Hayley Hung
Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT/UbiComp’18), 2, 3, Article 150 (September 2018), 22 pages.

Look Together: Using Gaze for Assisting Co-located Collaborative Search pdf
Yanxia Zhang, Ken Pfeuffer, Ming Ki Chong, Jason Alexander, Andreas Bulling and Hans Gellersen
In Personal and Ubiquitous Computing (PUC), 21(1):173–186, 2017. Impact factor: 2.395

Eye Tracking for Public Displays in the Wild pdf
Yanxia Zhang, Ming Ki Chong, Jörg Müller, Andreas Bulling and Hans Gellersen
In Personal and Ubiquitous Computing (PUC), 19(5–6):967–981, 2015. Impact factor: 2.395

Conference Papers

Using Topic Models to Mine Everyday Object Usage Routines through Connected IoT Sensors
Yanxia Zhang and Hayley Hung
Proceedings of the 8th International Conference on the Internet of Things (IOT’18), pages 27:1-27:4, 2018.

The I in Team: Mining Personal Social Interaction Routine with Topic Models from Long-Term Team Data pdf
Yanxia Zhang, Jeffrey Olenick, Chu-Hsiang Chang, Steve W. J. Kozlowski, and Hayley Hung
Proceedings of the 23rd International Conference on Intelligent User Interfaces (IUI’18), pages 421-426, 2018. Acceptance rate: 23%

Look but don’t stare: Mutual Gaze Interaction in Social Robots pdf
Yanxia Zhang, Jonas Beskow, and Hedvig Kjellström
Proceedings of the 9th International Conference on Social Robotics (ICSR’17), pages 556–566, 2017.

Estimating Verbal Expressions of Task and Social Cohesion in Meetings by Quantifying Prosodic Mimicry pdf
Marjolein Nanninga, Yanxia Zhang, Nale Lehmann-Willenbrock, Zoltán Szlávik, and Hayley Hung
Proceedings of the 19th ACM International Conference on Multimodal Interaction (ICMI’17), pages 206–215, 2017. Acceptance rate: 17% (oral)

Monitoring Dementia with Automatic Eye Movements Analysis pdf
Yanxia Zhang, Thomas Wilcockson, Kwang In Kim, Trevor Crawford, Hans Gellersen, and Pete Sawyer.
Proceedings of the 8th KES International Conference on Intelligent Decision Technologies (KES-IDT’16) – Part II, pages 299–309. Springer International Publishing, 2016.

Gaussian Processes as an Alternative to Polynomial Gaze Estimation Functions pdf
Laura Sesma-Sanchez, Yanxia Zhang, Andreas Bulling and Hans Gellersen
Proceedings of the 9th International Symposium on Eye Tracking Research and Applications (ETRA’16), pages 229-232, 2016.

Gaze-Shifting: Direct-Indirect Input with Pen and Touch Modulated by Gaze pdf
Ken Pfeuffer, Jason Alexander, Ming Ki Chong, Yanxia Zhang and Hans Gellersen
Proceedings of the 28th Annual ACM Symposium on User Interface Software and Technology (UIST’15), Charlotte, NC, USA, pages 373-383, 2015. (Honorable Mention/Best Paper Nominee) Acceptance rate: 24%

The Costs and Benefits of Combining Gaze and Hand Gestures for Remote Interaction pdf
Yanxia Zhang, Sophie Stellmach, Abigail Sellen, and Andrew Blake
Proceedings of the 15th IFIP TC.13 International Conference on Human-Computer Interaction (INTERACT’15), pages 570-577, 2015. Acceptance rate: 27%


GazeHorizon: Enabling Passers-by to Interact with Public Displays by Gaze pdf
Yanxia Zhang, Jörg Müller, Ming Ki Chong, Andreas Bulling and Hans Gellersen
Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp’14), Seattle, Washington, USA, pages 559-563, 2014. Acceptance rate: 20.7%


Pupil-Canthi-Ratio: A Calibration-Free Method for Tracking Horizontal Gaze Direction pdf
Yanxia Zhang, Andreas Bulling and Hans Gellersen
Proceedings of the 12th International Working Conference on Advanced Visual Interfaces (AVI’14), Como, Italy, pages 129-132, 2014. Acceptance rate: 29%

SideWays: A Gaze Interface for Spontaneous Interaction with Situated Displays pdf
Yanxia Zhang, Andreas Bulling and Hans Gellersen
Proceedings of the 31st SIGCHI International Conference on Human Factors in Computing Systems (CHI’13), Paris, France, pages 851-860, 2013. Acceptance rate: 20%
Press Coverage: BBC News, NewScientist, Forbes, Phys.org


Towards Pervasive Gaze Tracking with Low-level Image Features pdf
Yanxia Zhang, Andreas Bulling and Hans Gellersen
Proceedings of the 7th International Symposium on Eye Tracking Research and Applications (ETRA’12), Santa Barbara, United States, pages 261-264, March 2012.
Source code: DikablisSocket

PhD Thesis

Eye Tracking and Gaze Interface Design for Pervasive Displays
Yanxia Zhang, Lancaster University, 2015

Workshop Papers

The 5th International Workshop on Pervasive Eye Tracking and Mobile Eye-based Interaction
Peter Kiefer, Yanxia Zhang and Andreas Bulling
Adjunct Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp/ISWC’15 Adjunct), Osaka, Japan, pages 825-828, 2015.

Exploring Concept Drift using Interactive Simulations
Jeremiah Smith, Naranker Dulay, Mate Atilla Toth, Oliver Amft and Yanxia Zhang
Proceedings of the 2013 IEEE International Conference on Pervasive Computing and Communications Workshops (PERCOM Workshops), San Diego, CA, pages 49-54, 2013.


Discrimination of Gaze Directions Using Low-Level Eye Image Features
Yanxia Zhang, Andreas Bulling and Hans Gellersen
Proceedings of the 1st International Workshop on Pervasive Eye Tracking and Mobile Eye-Based Interaction (PETMEI’11), Beijing, China, pages 9-13, September 2011.

Demos

A Collaborative Gaze Aware Information Display
Ken Pfeuffer, Yanxia Zhang, and Hans Gellersen
The 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp 2015), Osaka, Japan, September 2015.

SideWays: A Gaze Interface for Spontaneous Interaction with Situated Displays
Yanxia Zhang, Andreas Bulling and Hans Gellersen
The 31st SIGCHI International Conference on Human Factors in Computing Systems (CHI 2013), Paris, France, 2013.

Pervasive Gaze-Based Interaction with Public Display using a Webcam
Yanxia Zhang, Andreas Bulling and Hans Gellersen
The 10th International Conference on Pervasive Computing (Pervasive 2012), Newcastle, United Kingdom, June 2012.

Software

My GitHub

DikablisSocket: A tool for sending eye gaze data collected from a Dikablis eye tracker (from Ergoneers GmbH) over a local or wide-area network via TCP or UDP. The streamed gaze data can be fused with measurements from other sensors, for example using the Context Recognition Network (CRN) Toolbox.
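For illustration, below is a minimal Python sketch of the underlying idea, sending gaze samples as JSON-encoded UDP datagrams. This is not the DikablisSocket implementation; the receiver address and the sample fields (timestamp, x, y) are placeholders.

import json
import socket
import time

HOST, PORT = "127.0.0.1", 5555  # placeholder receiver address

def stream_gaze(samples, host=HOST, port=PORT):
    # Send each gaze sample as a JSON-encoded UDP datagram to host:port.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        for sample in samples:
            sock.sendto(json.dumps(sample).encode("utf-8"), (host, port))
    finally:
        sock.close()

if __name__ == "__main__":
    # Dummy samples standing in for tracker output.
    stream_gaze([{"timestamp": time.time(), "x": 0.42, "y": 0.17},
                 {"timestamp": time.time(), "x": 0.44, "y": 0.19}])

A receiver listening on the same port can decode each datagram and merge the gaze stream with other sensor data.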

CV

Education

Ph.D. in Computer Science, Lancaster University, UK (2010 – 2015)
Master of Science in Artificial Intelligence, University of Amsterdam, Netherlands (2008 – 2010)
Bachelor of Science in Electronics and Information Engineering, Huazhong University of Science and Technology, China (2004 – 2008)

Teaching

Co-instructor, Algorithms and Data Structures, Delft University of Technology (2017)
Co-instructor, Multimedia Analysis, Delft University of Technology (2017)
Teaching Assistant, Media Coding & Processing, Lancaster University (2015)

Selected Awards

Marie Curie Fellowship, European Commission (2011 – 2014)
Runner-up of Best PhD Talks at the SciTech Christmas Conference, Lancaster University, UK (2014)
Best Presentation at the Symposium on Natural User Interfaces, Augmented Reality and Beyond, The Rank Prize Funds, UK (2013)
Dean’s Award for Excellence in Postgraduate, Lancaster University, UK (2013)
Huygens Scholarship Program, Nuffic, Netherlands (2008 – 2010)
Excellent B.Sc. Dissertation of Hubei Province, China (2008)

Skills

Python, C/C++
OpenCV, Qt, Kinect SDK, Tobii SDK
Matlab, R, Visualization Toolkit (VTK), scikit-learn, Keras