Creating an Eyetracking Task

Once eyetracking is enabled in the Study Settings, you can add a new task to the experiment and begin setting up the experimental content.

The overall background color of the frames in your study influences eyetracking accuracy. Dark frames illuminate the subject's eyes less well than white or lightly colored ones, and this extra screen brightness makes subtle eye movements much easier to track.

Activating Eyetracking for a Task

In the top left corner of the editor, click “Edit” next to Phys. Signals to open the dialog box. Check the box next to “Enable eye tracking in this task” and more options will appear.

Note that eyetracking can be enabled for multiple tasks in the experiment. However, the main calibration (specified in the Study Settings) occurs only once, before the start of the first task in the study where eyetracking is enabled.

The head pose/Virtual Chinrest that was specified during the main calibration is checked briefly before every trial where eyetracking is enabled. You can specify how many calibration points to use when recalibrating the system between trials; setting this to zero disables the recalibration.

Checking the box next to “drift correction” counteracts a subject’s natural tendency to drift in one direction over time and reduces tracking error. In the box below this option, you can specify how many points/head poses should be used for drift correction. The default is 6; the correction is computed as a moving average that factors in previous recalibrations.
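Labvanced does not expose the internals of its drift correction, but the moving-average idea described above can be sketched conceptually. Everything below is illustrative (not Labvanced code): offsets measured at the last N recalibration checks are averaged and subtracted from each raw gaze sample.

```python
from collections import deque

class DriftCorrector:
    """Conceptual sketch of moving-average drift correction.

    Keeps the offsets measured at the last `n_points` recalibration
    checks and subtracts their mean from each raw gaze sample.
    Illustrative only -- not Labvanced's actual implementation.
    """

    def __init__(self, n_points=6):  # 6 mirrors the default in the dialog
        self.offsets = deque(maxlen=n_points)

    def add_measurement(self, shown_xy, measured_xy):
        # Offset = reported gaze position minus the known target position.
        dx = measured_xy[0] - shown_xy[0]
        dy = measured_xy[1] - shown_xy[1]
        self.offsets.append((dx, dy))

    def correct(self, raw_xy):
        # Subtract the mean offset over the retained measurements.
        if not self.offsets:
            return raw_xy
        mean_dx = sum(o[0] for o in self.offsets) / len(self.offsets)
        mean_dy = sum(o[1] for o in self.offsets) / len(self.offsets)
        return (raw_xy[0] - mean_dx, raw_xy[1] - mean_dy)
```

Because the deque has a fixed maximum length, older offsets automatically drop out as new recalibrations arrive, which is what makes the correction a moving average.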

Click “Ok” to save your settings.

Display Settings

The default display mode is Zoom/Adaptive, which scales your experimental content to fit every screen. For eyetracking experiments, it can be useful to change this mode to Fixed in Millimeters or Fixed in Visual Degrees: the system will then record how much the subject moved their eyes in millimeters or visual degrees instead of frame units, making the measurements more precise.

If you change the display mode to Millimeters or Visual Degrees, go back to the Study Settings tab: in the middle column (Browsers & Devices), the Allowed Screen Size & Resolution section now shows new options, letting you specify more precisely the allowable size of participants’ screens in millimeters or visual degrees.

Because eyetracking has been activated, the option “show screen calibration screen to infer the physical size” is already selected and mandatory. Participants are asked to hold a standard plastic card (such as an ID or metro card) up to their screen so the physical size of the screen can be determined. This is accurate because such cards follow a standardized format (ISO/IEC 7810 ID-1, 85.60 × 53.98 mm) throughout the world.
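Once the participant has matched an on-screen rectangle to their physical card, the pixels-per-millimeter scale of their display follows directly from the standardized card width. A sketch of that arithmetic (the function names are our own):

```python
# Standard ID-1 card width per ISO/IEC 7810: credit cards, IDs, metro cards.
CARD_WIDTH_MM = 85.60

def screen_scale(card_width_px):
    """Pixels per millimeter, given the on-screen width (in pixels)
    that the participant matched to the physical width of their card."""
    return card_width_px / CARD_WIDTH_MM

def screen_width_mm(screen_width_px, card_width_px):
    """Physical screen width in mm inferred from the card calibration."""
    return screen_width_px / screen_scale(card_width_px)
```

For instance, if the participant matched the card at 342.4 px on a 1920 px wide display, the scale is 4 px/mm and the screen is 480 mm wide.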

Event Setup

Recording Eyetracking

  • Add a new Event from the Events editor.
  • For Trigger, select Physiological Signals → Eyetracking. There is an option on this screen to trigger the action only when the subject looks at certain elements, which can be added as targets.
  • The Action should be a Variable Action → Set/Record Variable.
    • Click the green Select box and add a new variable.
      • Choose Array as the format (recommended).
      • Choose Numeric as the data type.
      • Make sure the boxes for “reset at trial start,” “record variable,” and “view in global list” are all checked/enabled.
      • Under “Record Type,” click the option for All changes/time series. This is important: it records multiple values per trial instead of only the trial’s final value.
    • Click the pen icon and select Trigger (Eyetracking). A dropdown menu of options will appear. We recommend selecting “Coord.+Time+Confidence[X,Y,T,C]Array” to record the X/Y coordinates of the eye gaze, the precise time stamp, and a confidence value of the measurement. This option should give you all of the necessary data.
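With the recommended settings, each trial yields an array of [X, Y, T, C] samples (gaze coordinates, timestamp, confidence). After exporting your data, you can summarize such a series per trial; the helper below is an illustrative analysis sketch with made-up sample values, not Labvanced output:

```python
def summarize_gaze(samples, min_confidence=0.5):
    """Summarize one trial's [X, Y, T, C] gaze time series.

    `samples` is a list of [x, y, timestamp, confidence] entries as
    recorded by the eyetracking trigger; low-confidence samples are
    dropped before summarizing.
    """
    kept = [s for s in samples if s[3] >= min_confidence]
    if not kept:
        return None
    xs = [s[0] for s in kept]
    ys = [s[1] for s in kept]
    ts = [s[2] for s in kept]
    return {
        "n_samples": len(kept),
        "mean_x": sum(xs) / len(xs),
        "mean_y": sum(ys) / len(ys),
        "duration_ms": ts[-1] - ts[0],
    }

# Illustrative trial data: [x, y, timestamp_ms, confidence]
trial = [[400, 300, 0, 0.9], [410, 305, 16, 0.8],
         [50, 900, 33, 0.2], [415, 310, 50, 0.85]]
```

Filtering on the confidence value before averaging is a common first cleaning step, since low-confidence samples often reflect blinks or tracking loss.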

Recording Error

  • Add another Event if you wish to record the size of the error on a frame in frame units.
  • Select “On Frame Start” as the Trigger.
  • Select Variable Action → Set/Record Variable as the Action.
    • Click the green Select box and add your variable.
    • Click the pen icon and scroll down to “Frame/Task/Object” → Eyetracking → Error Trial.
  • Repeat this process but instead select Error Calibration in the final step to record the error of the main calibration. This will tell you how well the calibration worked for each subject.
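Once the per-trial errors and the main calibration error are exported, a quick quality check can flag subjects whose tracking was too imprecise to analyze. The cutoff below is arbitrary and study-specific, and the function is our own sketch, not a Labvanced feature:

```python
def flag_high_error(calibration_error, trial_errors, max_error=100.0):
    """Quality check on exported eyetracking error values (frame units).

    Returns the calibration error, the mean per-trial error, and whether
    the subject exceeds the (arbitrary, study-specific) cutoff.
    """
    mean_trial = sum(trial_errors) / len(trial_errors)
    return {
        "calibration_error": calibration_error,
        "mean_trial_error": mean_trial,
        "exclude": calibration_error > max_error or mean_trial > max_error,
    }
```

Running this per subject over the exported data gives a simple, reproducible exclusion criterion that you can report alongside your results.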