Head Tracking

Introduction

Head tracking is another feature popular amongst Labvanced users who study attention. Tracking head movements supports a wide range of applications; for some ideas, check out this post on the 13 Use Cases of Head Tracking.

Head Tracking or Eye Tracking: Which Should You Use?

Head tracking can be used on its own as the main physiological measurement in an experiment, or in conjunction with eye tracking as an additional stream of data input. Whether a study should employ head tracking, eye tracking, or both largely depends on the research question and the study population. Researchers very often use both; in cases where participants cannot remain still for the prolonged calibration phase of eye tracking, only head tracking is used.

Creating a Task

Enabling head tracking in your online study is easy and straightforward: simply activate the option for physiological signals in the Task Editor and select the head tracking option. For more information about the parameters available for recording head tracking data, visit the page called Head Tracking in a Task.

Running the Study

When the experimental study begins, the participant is shown a webcam preview of themselves. A mesh created by the neural network is overlaid on the video, and the participant is simply asked to confirm that it looks correct. Once they do so, the experiment proceeds as planned.

Data Output

When data is captured using head tracking, it is important to note that all recorded values are relative to the camera. The camera's position defines the origin and a line of symmetry, and any head movement relative to the camera is recorded as a numeric value. A recorded value of 0 therefore indicates that the participant is positioned straight on and directly in front of the camera; any movement away from this position is recorded as a positive or negative value, depending on the direction.
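The sign convention above can be sketched in a few lines of Python. The sample structure and threshold below are hypothetical stand-ins for whichever variables you record, not Labvanced's actual export schema:

```python
# Hypothetical head-tracking samples: (timestamp_ms, x, y, z),
# where 0 on an axis means the head is centered relative to the camera.
samples = [
    (0,    0.00, 0.01, 0.50),   # centered in front of the camera
    (100, -0.12, 0.02, 0.49),   # negative x: movement to one side of the camera
    (200,  0.15, 0.00, 0.48),   # positive x: movement to the other side
]

def horizontal_offset(sample, threshold=0.05):
    """Classify the horizontal (x) position relative to the camera origin."""
    _, x, _, _ = sample
    if abs(x) < threshold:
        return "centered"
    return "offset_positive" if x > 0 else "offset_negative"

print([horizontal_offset(s) for s in samples])
# → ['centered', 'offset_negative', 'offset_positive']
```

The same logic applies to the other axes: only the sign and magnitude relative to the camera carry meaning, not absolute room coordinates.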

The numeric values that are captured are determined by the variables selected during experimental design. The possibilities include capturing time stamps, coordinates (X,Y,Z arrays), and vectors. These options and parameters are explained in greater detail on the page dedicated to explaining how to set up Head Tracking in a Task.

  • Note: The faster a participant’s computer is (i.e., higher CPU/GPU performance), the higher the sampling rate, and thus the more data points are recorded. This principle is also relevant for physiological measurements captured by our webcam-based eye tracking.
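Because the sampling rate varies with device performance, it can be useful to estimate the effective rate from the recorded timestamps after the fact. A minimal sketch, assuming timestamps in milliseconds (the function name is illustrative, not part of any Labvanced API):

```python
def effective_sampling_rate(timestamps_ms):
    """Estimate samples per second from a list of recording timestamps (ms)."""
    if len(timestamps_ms) < 2:
        raise ValueError("need at least two samples")
    duration_s = (timestamps_ms[-1] - timestamps_ms[0]) / 1000.0
    return (len(timestamps_ms) - 1) / duration_s

# A fast machine might record a sample every ~33 ms, a slower one every ~100 ms:
print(effective_sampling_rate([0, 33, 66, 99]))     # ≈ 30.3 Hz
print(effective_sampling_rate([0, 100, 200, 300]))  # → 10.0 Hz
```

Comparing this estimate across participants can help you decide whether to resample or exclude recordings before analysis.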

Sample Demo

In this head tracking sample demo, you can see what a head tracking study looks like when it loads and how participants are asked to confirm that the mesh fits well before the study initializes. The sliders that appear represent the numeric values captured from head movements and positioning relative to the camera.
