Preferential Looking Task preview for online and in-lab use.

Preferential Looking Test

The Preferential Looking Test measures visual preference by tracking where participants direct their gaze when presented with competing stimuli. Participants view multiple stimuli simultaneously, and their looking behavior is used to infer attention and choice. It is widely used in developmental and cognitive research to study perception, learning, and attention.

Table of Contents

  • Task Format
  • Data Collected
  • Technology
  • Customization
  • Recommended Use

Task Format | Preferential Looking Test Online & In-Lab

In the Preferential Looking Test, participants are presented with two stimuli (social and non-social) displayed side by side while gaze behavior is recorded to infer visual preference. The task integrates eye tracking and head tracking for improved accuracy and data quality, along with optional webcam video recording for post-hoc analysis.

The template supports both image-based and video-based stimulus formats. The image version presents static stimuli, while the video version presents dynamic clips. Both follow the same viewing procedure and data collection pipeline.

Each session begins with a short calibration process to ensure accurate gaze tracking. An optional pre-task video call step can be included to review instructions with participants. During trials, participants view the stimuli naturally without manual responses while eye movements and webcam video are recorded. After completing the trials, a results summary is generated showing key gaze metrics.
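
The variables table further below shows that the template logs which side the non-social stimulus appears on (image_positions_NS). For readers who want to see the left/right counterbalancing idea in code, here is a minimal offline sketch; the trial count, the even split, and the function name are illustrative assumptions rather than the template's actual settings, since the real assignment is configured in the Labvanced editor.

```python
import random

def counterbalanced_sides(n_trials, seed=None):
    """Return a shuffled list of 'left'/'right' positions for the
    non-social stimulus, balanced across trials (n_trials should be even)."""
    rng = random.Random(seed)
    sides = ["left", "right"] * (n_trials // 2)
    rng.shuffle(sides)
    return sides

# Example: 12 trials, half with the non-social stimulus on each side.
print(counterbalanced_sides(12, seed=1))
```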


Test it out!

Preferential Looking Task Metrics and Data Collected

The Preferential Looking Task captures a range of behavioral and multimodal measurements that reveal how visual attention and natural viewing behavior are distributed between social and non-social stimuli. Using predefined Areas of Interest (AOIs), researchers can quantify how long participants attend to specific regions of the screen.

The variables recorded enable researchers to assess gaze-based metrics such as fixation duration, fixation counts, and time to first fixation, alongside head tracking measures and synchronized video recordings. These measures help quantify attentional bias, visual engagement, and orientation behavior during passive viewing. All variables can be viewed and customized within the task’s Variables Tab.

Below are examples of variables collected in the Labvanced version of the Preferential Looking Task:

Variable Name | Description
image_positions_NS | Position of the non-social stimulus (left / right)
non_social_fix_duration_trial_mean | Total fixation duration on non-social AOI
non_social_fixations_count | Number of fixations on non-social stimulus
pitch | Vertical head movement (degrees)
roll | Head tilt (degrees)
yaw | Horizontal head movement (degrees)
social_fix_duration_trial_mean | Total fixation duration on social AOI
social_fixations_count | Number of fixations on social stimulus


Data table showing an excerpt of individual trial-level outputs from the Preferential Looking Task, including fixation duration, fixation count, stimulus position, and head orientation metrics.
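
As an illustration of how the trial-level export above can be summarized, the sketch below computes a per-participant social preference score, i.e. the proportion of total looking time spent on the social AOI. The column names follow the variables table; the file name and the subject_id column are assumptions about the export layout and may need adjusting for your own download.

```python
import pandas as pd

# Trial-level export from the task (file name and layout are assumptions).
trials = pd.read_csv("preferential_looking_trials.csv")

# Proportion of total looking time directed at the social AOI, per trial.
# Trials with no looking time on either AOI yield NaN and drop out of the mean.
total_fix = (trials["social_fix_duration_trial_mean"]
             + trials["non_social_fix_duration_trial_mean"])
trials["social_preference"] = trials["social_fix_duration_trial_mean"] / total_fix

# One preference score per participant ("subject_id" is an assumed column name).
summary = trials.groupby("subject_id")["social_preference"].mean().reset_index()
print(summary.head())
```

A score of 0.5 indicates no preference; values above 0.5 indicate longer looking at the social stimulus.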


Preferential Looking Test

This study measures visual preference and attentional bias using gaze tracking. Participants view social and non-social stimuli while fixation behavior, head movement, and video recordings are captured for analysis.



Technology Driving the Preferential Looking Task for Online & In-Lab Research

Labvanced includes several technologies that make the Preferential Looking Task highly suitable for developmental, remote, and multimodal research:

  • Webcam Eye Tracking: Measure gaze direction, fixation duration, and attention shifts without specialized hardware.

  • Head Tracking Integration: Monitor participant positioning and head movement (pitch, yaw, roll) during viewing; a short screening sketch follows this list.

  • Attention Getter and Calibration Support: Include fixation cues and calibration steps to improve gaze accuracy, especially for infant-friendly designs.

  • Video Recording Object: Record participant webcam video during the task for post-hoc coding and behavioral verification.

  • Video Conference Object: Enable live interaction with participants to guide setup, calibration, and instructions.

  • Precise Stimulus Layout Control: Present side-by-side stimuli with controlled spacing, alignment, and size.

  • Web-Based and Desktop Deployment: Run studies online or in controlled lab environments.

  • Remote and Longitudinal Study Support: Collect data across multiple sessions and locations using integrated tools.
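
As referenced in the Head Tracking Integration bullet above, the recorded pitch, yaw, and roll values (in degrees) can support a simple post-hoc data-quality screen. The sketch below flags trials where the head was rotated far from the screen; the 30-degree cutoff and the file name are example assumptions rather than Labvanced defaults.

```python
import pandas as pd

# Trial-level export with head orientation in degrees
# (column names as listed in the variables table; file name is an assumption).
trials = pd.read_csv("preferential_looking_trials.csv")

# Flag trials where the head was rotated more than 30 degrees on any axis.
# The cutoff is an example value; tune it for your sample (e.g., infants).
MAX_ROTATION_DEG = 30
trials["head_pose_ok"] = (
    trials[["pitch", "yaw", "roll"]].abs().le(MAX_ROTATION_DEG).all(axis=1)
)

kept = trials[trials["head_pose_ok"]]
print(f"Kept {len(kept)} of {len(trials)} trials after head-pose screening.")
```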


Webcam Eye Tracking

Capture gaze patterns and visual attention with built-in, code-free and peer-reviewed webcam eye-tracking.


Timing Precision

Capture reaction times, task performance, and more with millisecond accuracy for time-sensitive tasks.


Desktop App

Run in-lab studies using the Desktop App, compatible with EEG and other LSL-connected lab hardware.

Customization of the Preferential Looking Task

There are many ways to adapt this Preferential Looking Task template to address specific research questions. Below are several customization themes researchers commonly explore when modifying this task.


Stimulus Version Selection

Participants can be assigned to image-based or video-based versions. Researchers can modify which version appears and adjust instructions in the task editor.


Stimulus Content and Presentation

Side-by-side stimuli can be replaced directly using Image Object or Video Object. Properties such as size, position, and appearance can be adjusted in the Object Properties panel.


Areas of Interest and Gaze Metrics

Viewing regions are defined as Areas of Interest (AOIs). Eye tracking Events reference these regions to calculate fixation counts and durations.
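
If you export raw gaze samples, the AOI logic can also be reproduced offline for checking or custom metrics. The sketch below is a minimal illustration rather than Labvanced's internal Event implementation: it assumes normalized screen coordinates, a roughly 30 Hz webcam sampling rate, and an AOI geometry chosen only for the example.

```python
from dataclasses import dataclass

@dataclass
class AOI:
    name: str
    x_min: float
    x_max: float
    y_min: float
    y_max: float

    def contains(self, x, y):
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

# Left/right AOIs in normalized screen coordinates (assumed geometry).
AOIS = [AOI("left", 0.05, 0.45, 0.25, 0.75),
        AOI("right", 0.55, 0.95, 0.25, 0.75)]

def aoi_metrics(samples, sample_interval_ms=33.3):
    """Dwell time and time to first hit per AOI.

    `samples` is a list of (timestamp_ms, x, y) gaze samples; the ~30 Hz
    sampling interval is an assumption about the webcam eye-tracking rate."""
    metrics = {a.name: {"dwell_ms": 0.0, "first_hit_ms": None} for a in AOIS}
    for t, x, y in samples:
        for a in AOIS:
            if a.contains(x, y):
                m = metrics[a.name]
                m["dwell_ms"] += sample_interval_ms
                if m["first_hit_ms"] is None:
                    m["first_hit_ms"] = t
    return metrics

# Example: three samples landing in the left AOI, one in the right.
gaze = [(0, 0.20, 0.50), (33, 0.22, 0.52), (66, 0.70, 0.50), (99, 0.21, 0.48)]
print(aoi_metrics(gaze))
```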


Video Recording and Conference Flow

The Video Recording Object and Video Conference Object can be enabled, disabled, or adjusted to control recording and interaction stages.


Calibration and Trial Timing

Calibration procedures, viewing durations, and transitions can be modified through frame timing and Events.


If you need help customizing this task, feel free to write to us:


Contact Us

Recommended Use and Applications of the Preferential Looking Task

The Preferential Looking Task is widely used across research domains investigating perception, learning, and attention.

  • Developmental Research: Used to study perception and learning in infants and young children through gaze behavior.

  • Language and Category Learning Studies: Examines how participants associate visual stimuli with sounds or categories.

  • Attention and Visual Preference Research: Investigates how visual features guide attention and preference.

  • Clinical and Neurodevelopmental Research: Used to study atypical gaze patterns in developmental conditions.

  • Social and Emotional Processing Research: Examines preference for faces, expressions, and social cues.


Sign Up


References

  • Teller, D. Y. (1979). The forced choice preferential looking technique for use with human infants. Infant Behavior and Development, 2(2), 135–153.

  • Dubey, I., Brett, S., Ruta, L., Bishain, R., Chandran, S., Bhavnani, S., Belmonte, M. K., Estrin, G. L., Johnson, M., Gliga, T., Chakrabarti, B., & START consortium (2022). Quantifying preference for social stimuli in young children using two tasks on a mobile platform. PLoS One, 17(6), e0265587.