Getting Started with Labvanced Basics

These questions cover basic information about Labvanced and how studies can be realized on the platform.

What is Labvanced?

Labvanced is a platform for creating and recording professional behavioral experiments. Studies typically run online but can also be downloaded and executed offline. As a leading platform for online experimentation, Labvanced offers many advanced tools and methods for creating and running psychology studies online, such as eye tracking, precise timing, and multi-user studies.

Who is already using Labvanced?

Since our founding in 2017, we have worked with over 3,000 researchers from all over the world and across many disciplines, from cognitive psychology to linguistics to clinical psychology and more. Hundreds of studies of varying complexity run on Labvanced across these fields. Check out our Publications page for a list of peer-reviewed research papers, reviews, and even dissertations using and citing Labvanced.

How can I cite Labvanced?

Finger, H., Goeke, C., Diekamp, D., Standvoß, K., & König, P. (2017). LabVanced: a unified JavaScript framework for online studies. In International Conference on Computational Social Science (Cologne).
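
If you manage references with BibTeX, the citation above could be entered roughly as follows; the entry key and field layout are our own choice rather than an official entry:

    @inproceedings{finger2017labvanced,
      author    = {Finger, H. and Goeke, C. and Diekamp, D. and Standvo{\ss}, K. and K{\"o}nig, P.},
      title     = {LabVanced: a unified JavaScript framework for online studies},
      booktitle = {International Conference on Computational Social Science},
      address   = {Cologne},
      year      = {2017}
    }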

We are currently working on two further publications and hope to publish them soon.

What are the advantages of Labvanced compared to other experiment platforms?

Labvanced is a powerful and convenient platform. Our graphical interface is user-friendly and quick to learn, and its programmable options, built from events and objects, allow even very complicated experiments to be realized. We also offer many license options, but simply building an experiment is always free. Please visit our feature page to find out more, or contact us for friendly assistance!

Are all experiments which run on Labvanced available in the Public Experiment Library?

No, there are many more studies running on Labvanced than appear in the Public Experiment Library. The Public Experiment Library mostly contains experiments from the Labvanced team, which you can use as templates, but it also includes studies from researchers who chose to make them publicly available as open science becomes more commonplace and supported. Although most researchers prefer not to make their studies publicly available, we highly encourage publishing and sharing studies through the free experiment library.

What is the best way to get started with my experiment on Labvanced?

We suggest that you start by choosing a study from the Public Experiment Library that is most similar to the study you want to build, importing it to your account, and dissecting its inner workings. At the same time, you can open the video tutorial page, watch some of the tutorial videos, and start editing the experiment according to your needs.

Do I need to understand programming to be able to use Labvanced?

No. Programming is possible, but you can create complex experiments solely by using our graphical interface. However, understanding of algorithmic structures helps when using the event system to create complex logic.
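
If you do think in code, the following minimal TypeScript sketch illustrates the kind of trigger, condition, and action logic that the graphical event system expresses; all names here are illustrative only and are not part of Labvanced's interface or API.

    // Illustrative only: the trigger -> condition -> action pattern that the
    // graphical event system lets you build without writing any code.
    type Trial = { stimulus: string; keyPressed: string; reactionTimeMs: number };

    // Hypothetical stand-ins for the variables and actions you would define
    // in the Labvanced editor.
    const expectedKeys: Record<string, string> = { RED: "r", GREEN: "g" };
    const results: boolean[] = [];

    function onKeyPress(trial: Trial): void {
      // Condition: was the expected key pressed quickly enough?
      const correct =
        trial.keyPressed === expectedKeys[trial.stimulus] &&
        trial.reactionTimeMs < 2000;
      // Actions: store the outcome and (conceptually) branch the trial flow.
      results.push(correct);
      console.log(correct ? "correct response" : "show feedback frame");
    }

    // Trigger: in the editor this would correspond to a key-press event on a frame.
    onKeyPress({ stimulus: "RED", keyPressed: "r", reactionTimeMs: 640 });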

Which kind of experiments can be created on Labvanced?

Combining visual expressiveness with programmatic power, we are confident that Labvanced supports more than 99% of all behavioral experiments. Labvanced can support experiments that test language use, perceptual learning, cooperation, and many other behavioral concepts. If you are looking for a specific feature for your study, please visit our feature overview page.

The experiment I plan to do is very complex. I am not sure whether this can be realized with Labvanced. What should I do?

If you have just started with Labvanced, it can be hard to imagine how all the details of a bigger project can be realized. After spending some time with the platform, you will find that things go more and more quickly. We are quite certain that almost every behavioral study can be realized, but if you have doubts, please contact us via the live chat or through the contact page.

Which stimulus elements can be shown with Labvanced?

Currently, Labvanced provides 20 different stimulus elements, including Images, Videos, Audio, Form-Elements, Sortable-Lists, File-Uploads, an Editable Text element, and more. Please see our feature page for a complete list and explanations.

Can I control who/which devices etc. can take part in my experiment?

Yes. There are various options to control for the demographics of the participants (gender, age, location, first language) and the properties of the participants’ device (mobile or computer, browser, resolution, etc.).

Can I download and run a study offline if the selected device does not have internet?

Currently, the offline version is only available to users who own a Labvanced license (Premium or Group). More updates to this feature are on the way! For details about using Labvanced offline, please see the "Using Labvanced Offline" section of this FAQ.

Does Labvanced support 'Real Time Multi User Studies', to investigate game theoretical questions or cooperative/joint decision making?

Yes. During study creation, you can customize what each participant should do and see using the System Variable “Role_Id”. When the experiment runs, subjects wait in a virtual lobby until the required number of participants is ready to perform the study together. Learn more about Multi-User Experiments.
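
As a rough illustration, the per-role customization described above corresponds to branching logic like the TypeScript sketch below; “Role_Id” is the system variable named in this answer, while the scenario and function names are hypothetical and only meant to show the idea.

    // Illustrative only: showing different content to each participant
    // depending on the value of the Role_Id system variable.
    type RoleId = 1 | 2;

    function instructionFor(roleId: RoleId): string {
      // In the editor this branching is built with events and conditions
      // on Role_Id rather than written as code.
      if (roleId === 1) {
        return "You are the proposer: decide how to split the 10 points.";
      }
      return "You are the responder: accept or reject the proposed split.";
    }

    // Each participant leaves the virtual lobby with their own Role_Id.
    console.log(instructionFor(1));
    console.log(instructionFor(2));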

Does Labvanced support eye tracking?

Yes! Webcam-based eye tracking is one of Labvanced’s most requested features. Because of its accuracy, more and more researchers are using it as a powerful way to gather physiological data in their remote studies. To learn more about the technology behind our webcam-based eye tracking, visit the Eye Tracking Technology page.

Does Labvanced support head tracking?

Yes! In addition to eye tracking, our head tracking feature is also used to study head orientation, position, and attentional processes. Because head tracking requires less calibration and is easier to implement in sensitive populations like children, it is a useful tool for remote studies that want to measure physiological data quickly. Visit this blog to find out more about head tracking research use cases.

Does Labvanced support Audio Recordings via Microphone?

Yes. Find a demo study here.

Does Labvanced support Webcam-based Eyetracking?

Yes. Find a demo study here.

Also, check out our YouTube playlist about eyetracking here.

Does Labvanced support Multilingual Studies (Study translated into several languages)?

Yes. You can create your study in your preferred (native) language first. Using the “Translation Menu”, you can automatically translate all texts into many different languages at any point, and of course you can adjust or improve the automatic translations as well. All language versions of the experiment are accessible via the same URL, and participants choose their preferred language right at the beginning. Learn more about the Translate Tab.
