Webcam Eye Tracking by Labvanced
Labvanced’s webcam eye-tracking technology delivers scalable, hardware-free gaze measurement with unparalleled accuracy, making it ideal for both in-lab and remote studies. It enables researchers to collect consistent, high-quality gaze data across diverse participant populations, a practical and powerful choice for a wide range of research designs.
Labvanced Webcam Eye Tracking: The New Gold Standard for Remote Eye Tracking
Labvanced makes research-grade eye tracking available to anyone with a standard webcam. With increased accessibility and full integration into the experimental workflow, researchers conducting in-lab and remote studies can easily utilize highly accurate, peer-reviewed webcam eye tracking.
| Property | Specifications |
|---|---|
| Architecture | Pre-trained CNN for gaze estimation |
| Accuracy | 1.3 visual degrees (1.4 cm) |
| Precision | 1.1 visual degrees (1.2 cm) |
| Sampling Rate | 30 Hz to 60 Hz (depending on webcam's frame rate) |
| Fixations | Built-in fixation detection algorithm |
| Calibration | Adjustable length (30 sec. - 5 min.+) and properties (infant-friendly mode, virtual chinrest constraints, etc.) |
| Minimum Resolution | 1280x720px (HD-ready webcam) |
| Device Requirements | Device agnostic: Works on desktops, tablets, and phones; no further specific requirements |
| Data Output | Gaze, Fixations, Dwell Time, Time to First Fixation, Fixation Duration, Number of Fixations, Custom Metrics |
| Privacy | Client-side processing (directly on the participant’s device), meaning no facial data is ever transmitted |
4 Key Functionalities for Successful Webcam Eye Tracking
To run successful webcam eye-tracking studies, four principles are crucial: Accuracy ensures reliable data despite hardware variability; Adaptability allows researchers to tailor protocols to the specific compliance levels of diverse participant populations; Control lets them define performance thresholds and administrative rules; and uncompromising Privacy is non-negotiable for ethical remote data capture and participant trust.
Accuracy
Utilizing proprietary, deeply trained neural networks, Labvanced achieves robust gaze estimation reliability, demonstrating 1.3 visual degrees of accuracy.
Privacy
Through automatic client-side processing, all sensitive visual data is calculated locally, guaranteeing that no sensitive participant information leaves their device.
Adaptability
The system features adjustable settings and flexible configurations specifically designed to accommodate the unique needs and compliance challenges of populations ranging from infants to older adults.
Control
Gain full authority over the experimental process, like defining minimum gaze performance requirements, specifying calibration processes, and setting precise administrative rules across every task and trial.
Data Collected
Labvanced offers a wide range of webcam-based eye-tracking outputs, making it easy to capture exactly the level of detail a study requires. From precise gaze coordinates to rich fixation metrics and trial-level diagnostics, the platform provides comprehensive data for understanding how participants engage with visual stimuli.
Gaze Position
Gaze position coordinates (X and Y) can be captured throughout the task, and recording can be restricted to specific areas of interest (AOIs). Each sample is paired with additional data points: a timestamp (T) and a confidence value (C), which is a measure of how open the eye was during the measurement.
X: X-coordinate of gaze; Y: Y-coordinate of gaze; T: Timestamp; C: Confidence value
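As an illustration, a gaze export with these four columns could be parsed as sketched below. The column names, file layout, and sample values are hypothetical and depend on your export settings; this is not Labvanced's own API.

```python
import csv
import io

# Hypothetical export snippet; real column names depend on export settings.
sample_csv = """X,Y,T,C
512,300,0,0.97
518,296,33,0.95
702,410,66,0.31
"""

def load_gaze(fileobj, min_confidence=0.5):
    """Read gaze samples, dropping rows with a low confidence value C
    (i.e., where the eye was barely open during the measurement)."""
    samples = []
    for row in csv.DictReader(fileobj):
        sample = {
            "x": float(row["X"]),  # horizontal gaze coordinate (px)
            "y": float(row["Y"]),  # vertical gaze coordinate (px)
            "t": float(row["T"]),  # timestamp (ms)
            "c": float(row["C"]),  # eye-openness confidence
        }
        if sample["c"] >= min_confidence:
            samples.append(sample)
    return samples

samples = load_gaze(io.StringIO(sample_csv))
print(len(samples))  # the low-confidence row is filtered out
```

Filtering on the confidence value before analysis is a common first cleaning step, since samples recorded during blinks carry little gaze information.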
Fixations
For studies that want to collect data beyond gaze coordinates, fixation-based data collection is also an option where key metrics include:
- Number of fixations
- Time to first fixation
- Fixation duration
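As a conceptual illustration, these metrics could be computed from a list of fixation events as in the sketch below. The `Fixation` data structure and all values are hypothetical, not part of Labvanced's output format.

```python
from dataclasses import dataclass

@dataclass
class Fixation:
    """A hypothetical fixation event with start/end times and an AOI label."""
    start_ms: float
    end_ms: float
    aoi: str  # area of interest the fixation landed in

    @property
    def duration_ms(self) -> float:
        return self.end_ms - self.start_ms

def aoi_metrics(fixations, aoi, trial_onset_ms=0.0):
    """Number of fixations, time to first fixation, and mean fixation
    duration for one AOI."""
    hits = [f for f in fixations if f.aoi == aoi]
    if not hits:
        return {"n_fixations": 0, "time_to_first_ms": None, "mean_duration_ms": None}
    return {
        "n_fixations": len(hits),
        "time_to_first_ms": min(f.start_ms for f in hits) - trial_onset_ms,
        "mean_duration_ms": sum(f.duration_ms for f in hits) / len(hits),
    }

fixations = [
    Fixation(120, 370, "left_image"),
    Fixation(400, 650, "right_image"),
    Fixation(700, 1000, "left_image"),
]
print(aoi_metrics(fixations, "left_image"))
# → {'n_fixations': 2, 'time_to_first_ms': 120.0, 'mean_duration_ms': 275.0}
```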
More Trial-Specific Gaze Data
Labvanced’s webcam-based eye tracking provides rich, trial-specific data that goes far beyond gaze coordinates. For each trial, researchers can access data like the last cached gaze position or last cached fixation, as well as diagnostic metrics such as calibration errors. This added layer of granularity makes it easier to evaluate data quality and interpret participants’ viewing behavior with greater confidence, and this information can also be used to further control the behavior of the experiment.
The data collected can then be exported in a ready-to-analyze format according to your specifications, including CSV, XLSX, and TSV.
Fine-Grained Calibration Options
The calibration procedure is closely intertwined with data quality and output. Labvanced gives you the option to specify exactly when and how calibration should run.
- Eye tracking calibration is the only part where participants are required to actively engage with the eye tracking system.
- As requirements differ between studies, many settings of the calibration procedure are adjustable.
- There is a main tradeoff between length of calibration and accuracy, where longer calibration procedures will lead to more accurate data outputs.
- Other adjustments such as the strength of the Virtual Chinrest or Re-calibration settings are also useful to consider.
Benchmarking Accuracy
Labvanced’s webcam eye tracking is 2-3 times more accurate than its closest competitor, making over 80% of all eye-tracking research feasible with our system. Labvanced offers the only webcam eye-tracking solution backed by a research-grade, peer-reviewed scientific publication.
Validation via Comparison to Hardware-based Eye Tracking
- Our webcam-based eye-tracking system has a groundbreaking accuracy of about 1.2 to 1.4 visual degrees.
- Our comparison study with an EyeLink 1000 hardware eye-tracking system shows a Pearson correlation greater than 0.9.
- Visual analysis of natural scenes visually confirms how close Labvanced’s prediction is to the ground truth as compared to EyeLink.
- Labvanced’s webcam-based eye tracking was only about 0.5° less accurate than the EyeLink 1000 system.
- Accuracy increased even further for stimuli presented in the center of the screen.
- Accuracy remained consistent throughout recording, with no drop-off over time.
- The only other popular webcam-based system (WebGazer; Papoutsaki et al., 2016) has a reported accuracy of about 4 visual degrees, illustrating that our system is about 3 times more accurate than any other webcam-based solution.
Comparison between Labvanced Webcam Eye Tracking and WebGazer
Webcam eye tracking has become almost synonymous with WebGazer. Labvanced’s eye tracking belongs in its own category, using custom neural networks to estimate gaze coordinates. A quick look at the table below demonstrates just how differently these two eye-tracking tools perform:
| Metric | WebGazer | Labvanced |
|---|---|---|
| Accuracy | 4.17 visual degrees | 1.3 visual degrees |
| Main Technology | Ridge Regression | Pre-Trained CNN |
| Release Year | 2016 | 2022 |
| Publication | IJCAI (Conference paper) | BRM (Journal paper) |
| Fixation Detector | No / too noisy | Yes, it’s built-in |
Scientific Impact: Labvanced in Peer-reviewed Literature
How are others using Labvanced’s webcam eye tracking in their research? Here are a few examples:
In this randomized controlled trial study, the researchers leveraged Labvanced’s webcam-based eye tracking to determine how pharmacists’ visual attention shifts when interacting with AI decision support. By remotely capturing gaze during a medication verification task, the researchers revealed how AI advice—even when uncertain—reshapes cognitive processing.
Tsai, C. C., et al. (2025). Effect of artificial intelligence helpfulness and uncertainty on cognitive interactions with pharmacists: Randomized controlled trial. Journal of Medical Internet Research, 27, e59946.
In this study, the authors used Labvanced’s webcam‑based eye‑tracking technology to collect gaze data as participants viewed images of urban environments. By tracking participants’ looking behaviors at different urban scene features, the researchers could relate visual attention to emotional appraisal of built versus natural elements. The use of Labvanced allowed this to be done online and at scale, capturing fine-grained eye movement metrics without requiring specialized in‑lab hardware.
Sander, I., et al. (2024). Beyond built density: From coarse to fine-grained analyses of emotional experiences in urban environments. Journal of Environmental Psychology, 96, 102337.
In this multimethod study, the authors used Labvanced’s webcam-based eye‑tracking to measure how participants visually engage with influencer content. By tracking gaze behavior while participants viewed Instagram profiles with different influencer follower counts (as the region of interest), they were able to assess how attention patterns relate to perceived connection and engagement.
Wies, S., Bleier, A., & Edeling, A. (2023). Finding goldilocks influencers: How follower count drives social media engagement. Journal of Marketing, 87(3), 383-405.
Interactive Demos
Live Metrics
In this demo of a simple looking task, several eye-tracking functions are shown for two pictures serving as areas of interest (AOIs): dwell time, time to first fixation, number of fixations, and average fixation duration. Note: the study begins with a 5-minute calibration (medium-loose chinrest constraint) which you can further edit upon importing.
Preferential Looking Paradigm
This study investigates visual attention in infants using a preferential looking paradigm and can be further used as a template for such research. Participants view either images or videos side-by-side, while gaze patterns, fixation durations and head orientation are recorded, as well as a video recording of the participant. Note: uses infant-friendly calibration.
AOI Detection
Using polygon shapes to mask over certain facial features, fixations are counted and gaze coordinates recorded. See how events are set up to record eye tracking data as well as how objects are used to define unique AOIs.
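Conceptually, matching a gaze sample against a polygonal AOI comes down to a point-in-polygon test. The sketch below shows a standard ray-casting implementation with illustrative coordinates; it is not Labvanced's code, since the platform handles AOI detection for you.

```python
def point_in_polygon(x, y, polygon):
    """Return True if (x, y) lies inside the polygon given as [(x, y), ...],
    using the ray-casting method: cast a horizontal ray to the right and
    count how many polygon edges it crosses (odd count = inside)."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Only edges that straddle the ray's y-level can be crossed.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Rough polygon around a facial-feature AOI in a 1280x720 stimulus.
eye_aoi = [(500, 200), (780, 200), (780, 320), (500, 320)]
print(point_in_polygon(640, 260, eye_aoi))  # gaze sample inside → True
print(point_in_polygon(100, 100, eye_aoi))  # gaze sample outside → False
```

Running this test per gaze sample (or per fixation centroid) is how AOI hit counts and dwell times are typically accumulated in eye-tracking analyses.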
Head Tracking
In this demo, you can try different head positions to move the sliders on the screen, which serve as visualizations of numeric values. Head tracking is typically used as either a complement or an alternative to eye tracking.
Data Privacy and Webcam Eye Tracking 🔐
At Labvanced, we take data privacy and security extremely seriously. This can best be shown by the following points:
- All processing of face and eye data happens on the user’s device, ensuring true privacy by design for eye data.
- Only gaze positions and fixation durations (fully anonymous numeric data) are saved to our server.
- An external security audit (VAPT) has been conducted and passed successfully. Documents can be provided on request.
- We own our servers, which are located solely within the European Union and under European jurisdiction.
- We provide you with our Technical and Organizational Measures (ToM), and data processing agreements (DPAs) on request.
- All data can be end-to-end encrypted (PGP), such that the data will only be stored in an encrypted format on our server.
Documentation
Integrating Webcam Eye Tracking into Your Labvanced Task
A step-by-step overview of enabling eye tracking and designing tasks that reliably capture gaze data in Labvanced.
Participant Experience in Webcam Eye Tracking Studies
A detailed overview of what participants see during calibration and recording, helping you design smoother remote study experiences.
Best Practices for Webcam Eye Tracking Research with Infants & Toddlers
Implementation tips to keep in mind when using webcam eye tracking for research with infants and toddlers.
Feasibility Check
If you need more support or are curious about a particular webcam eye tracking feature, please reach out to us via chat or email. We are more than happy to hear about your research and walk you through any questions about the platform and discuss project feasibility.