Mental Rotation Task 3D preview for online and in-lab use.

The Mental Rotation Task 3D

The Mental Rotation Task 3D is a spatial cognition paradigm used to measure how individuals mentally transform objects in three-dimensional space. Participants evaluate rotated stimuli and decide whether they match a reference configuration despite changes in viewpoint.

Table of Contents

  • Task Format
  • Data Collected
  • Technology
  • Customization
  • Recommended Use

Task Format | Mental Rotation Task 3D Online & In-Lab

In the Mental Rotation Task (3D Body Rotation), participants view abstract depictions of a human figure shown from different viewpoints and rotation angles. The figure may appear either from the front view (face visible) or from the back view (no face visible), and in each picture one limb (either an arm or a leg) is raised. Participants must decide whether the raised limb is the figure's left or right limb. Each session begins with a short practice block to familiarize participants with the decision rules before the main experimental trials; during practice, figures appear in upright positions so participants can learn the left–right response rule. In the main task, the same figures are presented at different rotation angles and viewpoints, requiring mental transformation across depth and perspective changes.



Test it out!

Two versions of the Mental Rotation Task (3D Body Rotation) are available, each optimized for the type of device and input method being used:

Desktop Version

In the desktop version, a body figure appears at the center of the screen on each trial. Participants press D if the raised limb is on the left side and K if it is on the right side. Practice trials use upright figures, while rotated figures are introduced only in the main task.


Mobile Version

In the mobile version, the same task structure is used. Participants respond by tapping Left or Right buttons displayed on the screen. Practice trials use upright figures, while rotated figures are introduced only in the main task.
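The left–right judgment can be scored directly from the logged variables. A minimal sketch in Python, assuming (as an interpretation of the description above) that the recorded `side` codes the figure's own raised limb and that D maps to "left" and K to "right"; the function names are illustrative and not part of the template:

```python
def correct_key(side: str) -> str:
    # 'side' is the figure's own raised limb ("left" or "right"),
    # as logged in the `side` variable of the task export.
    # Desktop mapping described above: D = left, K = right.
    # Note: in the front view the figure is mirrored relative to the
    # viewer, which is what makes the judgment require mental rotation.
    return "D" if side.lower() == "left" else "K"


def is_correct(choice: str, side: str) -> int:
    # Trial-level accuracy as in the `accuracy` variable: 1 = correct, 0 = incorrect.
    return int(choice.upper() == correct_key(side))
```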

Mental Rotation Task 3D Metrics and Data Collected

The Mental Rotation Task 3D captures a range of behavioral measurements that reveal how individuals mentally manipulate and compare objects across three-dimensional space. The variables recorded ultimately enable researchers to assess reaction times, response accuracy, error patterns, and performance differences across rotation angles, viewing perspectives, and stimulus conditions (e.g., normal vs. mirrored objects). These measures help quantify visuospatial transformation ability, mental rotation efficiency, and decision-making under increased spatial complexity. All variables can be viewed and customized within the task’s Variables Tab.

Below are several of the most informative indicators that researchers frequently analyze in this version of the task:

Variable Name | Description
accuracy | Trial-level accuracy (1 = correct, 0 = incorrect)
accuracy_total | Total number of correct responses across trials
choice | Response given by the participant (keypress D/K or button click "Left"/"Right")
error | Trial-level error (0 = correct, 1 = incorrect)
raised_limb | Which limb (arm or leg) is raised in the stimulus
reaction_time | Time (in milliseconds) from stimulus onset to the participant's response
rotation_angle | Degree of rotation applied to the figure in the specific trial
side | Side of the raised limb for the presented stimulus (Left/Right)
stimulus_presented | File name of the image presented as the stimulus in each trial
view | Perspective of the figure (front/back)

Data table showing an excerpt of individual trial-level outputs from the Mental Rotation Task (3D Body Rotation), including accuracy, participant choice, reaction time, rotation angle, stimulus side, raised limb, and presented image.
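For analysis, the trial-level export can be summarized per rotation angle. A minimal sketch in Python using pandas, assuming the column names listed above; the inline data frame holds illustrative values, not real task output:

```python
import pandas as pd

# Illustrative trial-level rows shaped like the export described above.
trials = pd.DataFrame({
    "rotation_angle": [0, 0, 90, 90, 180, 180],
    "accuracy":       [1, 1, 1, 0, 1, 0],
    "reaction_time":  [620, 655, 810, 905, 1120, 1340],
})

# Mean accuracy per rotation angle.
acc = trials.groupby("rotation_angle")["accuracy"].mean()

# Mean RT per angle, computed on correct trials only (a common convention).
rt = (trials[trials["accuracy"] == 1]
      .groupby("rotation_angle")["reaction_time"].mean())

print(acc)
print(rt)
```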

Mental Rotation Task (3D Body Rotation)

This study investigates how people mentally rotate images of human bodies. Participants view figures from different angles and must decide whether the raised limb is the left or the right limb.



Technology Driving the Mental Rotation Task 3D for Online & In-Lab Research

The Mental Rotation Task 3D requires precise control over stimulus presentation, timing, and response handling. Labvanced provides flexible tools that support complex spatial task designs across different study environments.

  • Flexible Presentation of Three-Dimensional Stimuli: Researchers can present stimuli as images or visual objects that depict different viewpoints. Orientation changes can be defined through condition logic, allowing systematic manipulation of angular disparity across trials.

  • Cross-Device Input Support: Responses can be collected via keyboard input on desktop devices or through on-screen buttons on touch-enabled devices. The same task logic can be reused while adapting the interface to different hardware.

  • Desktop App Mode for Controlled Experiments: For studies requiring stronger control over timing or hardware integration, the task can be deployed using the Labvanced desktop app. This supports offline testing and compatibility with EEG or other LSL-based systems.

  • Remote and Longitudinal Study Deployment: The task can be administered remotely and repeated across multiple sessions, making it suitable for studies examining spatial skill development or training effects over time.

  • Optional Webcam Eye Tracking Integration: Researchers can integrate webcam-based eye tracking to analyze gaze behavior and perspective processing during three-dimensional mental rotation trials.

A person looking at a webcam eye tracking task created with Labvanced

Webcam Eye Tracking

Capture gaze patterns and visual attention with built-in, code-free and peer-reviewed webcam eye-tracking.

An icon of a clock symbolizing accurate reaction times from this task template

Timing Precision

Capture reaction times, task performance, and more with millisecond accuracy for time-sensitive tasks.

An image of the Labvanced desktop app used to modify the task template

Desktop App

Run in-lab studies using the Desktop App, compatible with EEG and other LSL-connected lab hardware.

Customization of the Mental Rotation Task 3D

There are many ways to adapt this Mental Rotation Task 3D template to specific research questions. Below are a few areas researchers commonly ask about when modifying this task.


Viewpoint and Rotation Conditions

Three-dimensional rotation tasks often involve different viewing angles or perspective shifts. Researchers can define multiple viewpoint conditions in the Factors & Randomization and Trials & Conditions panels to control how stimulus orientation changes across trials.
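As a rough sketch of what such a factorial design produces, the condition levels below are hypothetical examples; the actual levels are configured in the Factors & Randomization panel, not in code:

```python
from itertools import product

# Hypothetical factor levels for illustration only.
views = ["front", "back"]
angles = [0, 45, 90, 135, 180]
sides = ["left", "right"]

# Fully crossed design: every view x angle x side combination.
trial_list = [
    {"view": v, "rotation_angle": a, "side": s}
    for v, a, s in product(views, angles, sides)
]

print(len(trial_list))  # 2 x 5 x 2 = 20 unique conditions
```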


Stimulus Presentation

Different types of 3D-style stimuli can be used by replacing the visual object directly in the editor. Researchers can add images using an Image Object, text using a Text Object, or other visual elements from the Objects panel, then adjust size, position, and appearance through the Object Properties panel.


Response Mapping

Keyboard keys or touchscreen buttons can be reassigned to match different judgment types. This could be achieved through simple modifications to the relevant events. Counterbalancing of response positions can be implemented through trial conditions to reduce motor bias.
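One simple way to think about between-subject counterbalancing of the key mapping is sketched below; the parity rule and function name are a hypothetical illustration, not how the template itself implements it:

```python
def response_mapping(participant_id: int) -> dict:
    # Hypothetical counterbalancing rule: even-numbered participants
    # get the default D=left / K=right mapping, odd-numbered get the
    # reverse, so motor bias averages out across the sample.
    if participant_id % 2 == 0:
        return {"left": "D", "right": "K"}
    return {"left": "K", "right": "D"}
```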


Timing and Task Flow

Frame duration, stimulus timing, and response windows can be adjusted by editing frame durations or related Events. Practice blocks and experimental blocks can also be modified separately to accommodate increased difficulty often associated with 3D mental rotation.


If there is something else you’d like to know, please feel welcome to write to us and ask:


Contact Us

Recommended Use and Applications of the Mental Rotation Task 3D

The Mental Rotation Task 3D is widely used to investigate perspective-taking and spatial transformation processes that involve depth and viewpoint changes.

  • Embodied Spatial Cognition Research: Researchers use this task to study how individuals mentally transform objects in three-dimensional space and shift imagined viewpoints. Reaction times often increase with greater angular disparity, reflecting the cognitive cost of embodied rotation.

  • Developmental and Individual Differences Research: The task is used to examine variability in spatial visualization ability across individuals and age groups. Three dimensional rotation tasks are particularly sensitive to differences in strategy use and spatial experience.

  • Applied and Training Research: Mental Rotation Task 3D paradigms are frequently used in studies exploring spatial training, simulation-based learning, and skill acquisition in technical domains. Because the task requires depth processing, it is often used to examine advanced spatial reasoning.

  • Neurocognitive Research: The task is included in studies investigating neural mechanisms underlying perspective-taking and visual imagery. Researchers often combine behavioral measures with eye tracking or neurophysiological data to examine cognitive processing stages.
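The angular-disparity effect noted above is typically summarized as a linear slope of reaction time over rotation angle. A minimal sketch of that fit in Python; the per-angle mean RTs are illustrative numbers, not real data:

```python
import numpy as np

# Illustrative per-angle mean RTs (ms); real values come from the export.
angles = np.array([0, 45, 90, 135, 180])
mean_rt = np.array([640, 720, 805, 880, 960])

# Linear fit: the slope (ms per degree of rotation) indexes the
# cognitive cost of mentally rotating the figure.
slope, intercept = np.polyfit(angles, mean_rt, 1)
print(round(slope, 2), round(intercept, 1))
```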


Sign Up

