Template Project: Data Collection Module
Target Audience: Developers, Researchers, Data Analysts
The Data Collection Module is a standalone, reusable Unity package extracted directly from the Magic Xroom. It provides a standardized architecture for capturing, synchronizing, and exporting multimodal data within a SteamVR-based application.
By utilizing this template, developers can bypass the overhead of writing custom hardware integrations and focus directly on how physiological, kinematic, and biometric data influence their specific XR training environments.
To ensure the module handles external data streams correctly, the host Unity project must meet the following configuration requirements:
- Operating System: Windows 10 or 11 (x64).
- SteamVR: Required as the core tracking and input framework.
- Libraries:
  - Google Protobuf: Required for structured data serialization (can be imported via the GameWorkstore Unity port).
  - System.IO.Ports: Crucial for the Shimmer COM port connection. You must either set the Unity API Compatibility Level to .NET Framework (instead of .NET Standard 2.1), manually drop the `System.IO.Ports.dll` into your Plugins folder, or install it via NuGetForUnity.
  - SRanipal Runtime: Required specifically for the eye and face tracking functionalities (must be the Steam version, and the only version installed, to avoid interference).
The module captures data across three distinct hardware tiers and separates the output into dedicated CSV files to prevent data entanglement. The files follow a strict naming convention:
`data_collection_<session ID>_<sensor type>.csv`
Understanding the Session ID: The `<session ID>` is a high-precision identifier generated from the number of ticks (100-nanosecond intervals) elapsed since midnight, January 1, 0001. This guarantees a globally unique identifier for every recording session, and it can be parsed directly back into a standard C# `DateTime` object for chronological sorting during analysis.
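For analysis-side tooling, the same tick arithmetic can be reproduced outside Unity. A minimal Python sketch (the function name and sample value below are illustrative, not part of the module):

```python
from datetime import datetime, timedelta

def session_id_to_datetime(session_id: int) -> datetime:
    """Convert a tick-based session ID (100 ns intervals elapsed since
    0001-01-01 00:00:00) back into a datetime for chronological sorting."""
    # One tick = 100 ns; datetime only resolves microseconds, so
    # integer-divide by 10 and drop the sub-microsecond remainder.
    return datetime(1, 1, 1) + timedelta(microseconds=session_id // 10)

# Hypothetical session ID: exactly one day's worth of ticks.
dt = session_id_to_datetime(864_000_000_000)
```

Sorting session files by this value reproduces the recording order without relying on filesystem timestamps.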
- Source: SteamVR-compatible Head-Mounted Displays (HMDs) and 6DoF hand controllers.
- Data Purpose: Tracking user physical movement, posture, and interaction speed within the spatial environment.
- Key Columns: Captures absolute coordinates in world space.
  - `timestamp`: The application context time.
  - `head_pos_[x,y,z]`: HMD absolute position coordinates.
  - `head_rot_[x,y,z,w]`: HMD absolute rotation, stored as quaternions to prevent gimbal lock.
  - `lcontroller_pos_[x,y,z]` & `lcontroller_rot_[x,y,z,w]`: Left controller kinematics.
  - `rcontroller_pos_[x,y,z]` & `rcontroller_rot_[x,y,z,w]`: Right controller kinematics.
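As an illustration of how these columns can be consumed for the "interaction speed" analyses mentioned above, the sketch below estimates head movement speed from two consecutive CSV rows. The helper function and the sample values are hypothetical; units are Unity world units per second:

```python
import math

def head_speed(prev_row: dict, row: dict) -> float:
    """Instantaneous head speed between two consecutive kinematic CSV
    rows, using the head_pos_[x,y,z] columns and the timestamp column."""
    dt = float(row["timestamp"]) - float(prev_row["timestamp"])
    if dt <= 0:
        raise ValueError("rows must be in increasing timestamp order")
    # Euclidean distance travelled by the HMD between the two samples.
    dist = math.sqrt(sum(
        (float(row[f"head_pos_{a}"]) - float(prev_row[f"head_pos_{a}"])) ** 2
        for a in "xyz"))
    return dist / dt

# Hypothetical sample rows, as produced by csv.DictReader.
r0 = {"timestamp": "1.0", "head_pos_x": "0.0", "head_pos_y": "1.6", "head_pos_z": "0.0"}
r1 = {"timestamp": "1.5", "head_pos_x": "0.3", "head_pos_y": "1.6", "head_pos_z": "0.4"}
speed = head_speed(r0, r1)
```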
- Source: Shimmer3 GSR+ device (must be paired via Bluetooth and running the LogAndStream firmware).
- Data Purpose: Estimating user stress, emotional arousal, and cognitive load during specific scenarios.
- Key Columns:
  - `timestamp`: The application context time.
  - `int_timestamp`: The Shimmer unit's internal hardware clock.
  - `accel_[x,y,z]`: Tri-axial accelerometer data from the unit.
  - `gsr`: Galvanic Skin Response (EDA), used to measure sympathetic nervous system arousal.
  - `ppg`: Photoplethysmograph sensor data (blood volume pulse).
  - `hr`: Real-time heart rate calculated directly from the PPG stream.
- Source: Headsets with integrated biometric trackers (e.g., Vive Focus 3, Vive Pro Eye) utilizing the SRanipal runtime.
- Data Purpose: Analyzing visual attention, reading patterns, focus depth, and emotional expressions.
- Key Columns: Vector data is formatted using a right-handed coordinate system.
  - `timestamp`: The application context time.
  - `int_timestamp`: The eye tracker's internal hardware clock.
  - `left_gaze_origin_[x,y,z]`: Left eye cornea center relative to the lens center (measured in mm).
  - `left_gaze_dir_norm_[x,y,z]`: Left eye normalized directional vector of the gaze (range 0 to 1).
  - Note: The structure is identical for the `right_gaze_...` columns. If lip/face tracking modules are active, the data streams will dynamically include facial blendshape weights.
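Since each gaze sample pairs an origin with a normalized direction, a point of regard at an assumed viewing depth can be reconstructed from these columns. A minimal sketch (the function name and the 500 mm depth are illustrative assumptions, not values from the module):

```python
def gaze_point(origin_mm, dir_norm, depth_mm):
    """Point at depth_mm along the gaze ray, expressed in the same
    right-handed, lens-centred frame as the left_gaze_origin_[x,y,z]
    and left_gaze_dir_norm_[x,y,z] columns."""
    # Ray equation: point = origin + depth * direction.
    return tuple(o + depth_mm * d for o, d in zip(origin_mm, dir_norm))

# Hypothetical sample: gaze straight ahead from the lens centre,
# evaluated at an assumed 500 mm viewing depth.
point = gaze_point((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), 500.0)
```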
A critical component of this module is how it aligns asynchronous hardware data streams. While the primary Unity application runs on a single main thread, external sensors provide data with varying delays.
To solve this, the logger records two distinct timestamps where applicable:
- `timestamp`: The application's synchronized timestamp, taken from the Unity context at the moment the data is written to the file.
- `int_timestamp`: The internal hardware timestamp generated by the peripheral device (available for Shimmer and Eye Tracking).
Researchers analyzing the data should compare `timestamp` and `int_timestamp` to determine whether polling delays from a specific sensor require data interpolation, or whether the latency is negligible for the scope of the study.
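A hedged sketch of such a comparison: the snippet below computes the per-row offset between the two clocks from a hypothetical CSV excerpt (the column names follow the convention above; the values and the assumption that both clocks are in seconds are invented for illustration):

```python
import csv
import io
from statistics import mean, pstdev

# Hypothetical excerpt of a data_collection_<session ID>_shimmer.csv;
# column names follow the module's convention, values are invented.
SAMPLE = """timestamp,int_timestamp,gsr
10.000,9.950,0.41
10.050,9.998,0.42
10.100,10.051,0.40
"""

def clock_offsets(csv_text):
    """Per-row difference between the application timestamp and the
    device's internal timestamp (both assumed to be in seconds)."""
    rows = csv.DictReader(io.StringIO(csv_text))
    return [float(r["timestamp"]) - float(r["int_timestamp"]) for r in rows]

offsets = clock_offsets(SAMPLE)
# A stable offset suggests negligible latency; high jitter or steady
# drift suggests the stream should be interpolated before analysis.
print(f"mean offset: {mean(offsets):.3f} s, jitter: {pstdev(offsets):.3f} s")
```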
Alongside the raw continuous data streams, the module captures discrete session events. This allows researchers to overlay timeline markers onto the telemetry data.
Customization Note: The categories and string labels listed below represent the module's default configuration, which was originally built around a "scenario" and "level" architecture. These are strictly examples. The logging framework is highly flexible, allowing developers to define and inject custom event strings that better reflect their specific application logic and research requirements.
By default, the monitored events fall into four categories:
- Scenario: Logs when a main scenario is started manually (`SCENARIO_STARTED`) or concluded (`SCENARIO_ENDED`).
- Level: Tracks sub-tasks within a scenario (`LEVEL_STARTED`, `LEVEL_COMPLETED`, `LEVEL_FAILED`).
- Teleport: Records movement between spatial boundaries (`TELEPORT_IN`, `TELEPORT_OUT`).
- Feedback: Captures explicit user emotional states entered at the end of a sequence (`ENGAGED`, `BORED`, `FRUSTRATED`, or `SKIP`).
Whenever a major event (like `SCENARIO_STARTED`) is triggered, the logger automatically injects a Shimmer `int_timestamp` marker to force a synchronization anchor between the application events and the Shimmer hardware data stream.
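One way to exploit these anchors during analysis is to snap each event marker to the nearest Shimmer sample by internal timestamp. A sketch, assuming the `int_timestamp` column has been loaded in ascending order (the function name and sample values are invented):

```python
import bisect

def nearest_sample_index(int_timestamps, marker):
    """Index of the Shimmer row whose int_timestamp is closest to an
    event's injected marker. Assumes int_timestamps is sorted ascending."""
    i = bisect.bisect_left(int_timestamps, marker)
    # The closest sample is either just before or just after the
    # insertion point; pick whichever is nearer in time.
    candidates = [j for j in (i - 1, i) if 0 <= j < len(int_timestamps)]
    return min(candidates, key=lambda j: abs(int_timestamps[j] - marker))

# Hypothetical internal-clock values and a SCENARIO_STARTED marker.
ts = [100, 132, 164, 196, 228]
idx = nearest_sample_index(ts, 170)
```

The same lookup can then anchor every `LEVEL_*` or `Feedback` marker onto the physiological timeline.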
For source code, hardware manuals, and further framework documentation, please refer to the following resources:
Project Repositories:
Related Wiki Pages:
- Wiki - Immerse yourself in the world of XR2Learn
- XR2Learn Platform Overview
- Tutorial: Authenticating with XR2Learn
- Marketplace
- Content Catalogue
- Community Forum
- Tutorial: Learning Path
- INTERACT Plugin
- Tutorial: Quick start with INTERACT
- Tutorial: Personalization Enablers - Getting Started
- Personalization Enablers Overview
- Tutorial: Which personalization enablers do I need?
- Personalization Enablers
* work in progress