HoloLens 2 Sensor Streaming
HoloLens 2 server software and Python client library for streaming sensor data via TCP. Created to stream HoloLens data in real time over WiFi to a Linux machine for research purposes, but it also works on Windows and macOS. The server is offered as a standalone application (appxbundle) or as a plugin (dll) compatible with Unity, Unreal, and native UWP applications.
Supported interfaces
- Research Mode Visible Light Cameras (640x480 @ 30 FPS, Grayscale, H264 or HEVC encoded)
- Left Front
- Left Left
- Right Front
- Right Right
- Research Mode Depth
- AHAT (512x512 @ 45 FPS, 16-bit Depth + 16-bit AB, H264 or HEVC encoded or Lossless* Zdepth for Depth)
- Long Throw (320x288 @ 5 FPS, 16-bit Depth + 16-bit AB, PNG encoded)
- Research Mode IMU
- Accelerometer (m/s^2)
- Gyroscope (deg/s)
- Magnetometer
- Front Camera (1920x1080 @ 30 FPS, RGB, H264 or HEVC encoded)
- Microphone (2 channels @ 48000 Hz, 16-bit PCM, AAC encoded or 5 channels @ 48000 Hz, 32-bit Float)
- Spatial Input (30 Hz)
- Head Tracking
- Eye Tracking
- Hand Tracking
- Spatial Mapping (3D Meshes)
- Scene Understanding (3D Meshes + Semantic labels for planar surfaces)
- Voice Input
- Extended Eye Tracking (30, 60, or 90 FPS)
- Extended Audio (Microphone + Application audio, 2 channels @ 48000 Hz, 16-bit PCM, AAC encoded)
- Internal Microphone mirror
- External USB-C Microphone
- Extended Video
- Internal Front Camera mirror
- External USB-C Camera
Additional features
- Download calibration data (e.g., camera intrinsics, extrinsics, undistort maps) for the Front Camera and Research Mode sensors (except RM IMU Magnetometer).
- Optional per-frame pose for the Front Camera and Research Mode sensors.
- Support for Mixed Reality Capture (Holograms in Front Camera video).
- Support for Shared capture for Front Camera and Extended Video.
- Client can configure the bitrate and properties of the H264, HEVC, and AAC encoded streams.
- Client can configure the resolution and framerate of the Front Camera. See here for a list of supported configurations.
- Client can configure the focus, white balance, and exposure of the Front Camera [example].
- Frame timestamps can be converted to Windows FILETIME (UTC) for external synchronization [example] (see the conversion sketch after this list).
- Client can exchange messages with a Unity, Unreal, or native UWP application using the plugin [example].
- Server application can run in background (alongside other applications) when running in flat mode [example].
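For reference, a Windows FILETIME value counts 100-nanosecond ticks since January 1, 1601 (UTC). The helper below is a minimal sketch (not part of the library) of the final step of that conversion, turning a FILETIME value into a Python datetime; how the device-to-UTC offset is obtained from the server is shown in the referenced example script.

```python
from datetime import datetime, timezone

# Number of 100 ns ticks between the FILETIME epoch (1601-01-01) and the Unix epoch (1970-01-01).
FILETIME_TO_UNIX_EPOCH = 116444736000000000
TICKS_PER_SECOND = 10_000_000  # FILETIME resolution is 100 ns

def filetime_to_datetime(filetime_utc):
    """Convert a Windows FILETIME value (UTC) to a timezone-aware datetime."""
    unix_seconds = (filetime_utc - FILETIME_TO_UNIX_EPOCH) / TICKS_PER_SECOND
    return datetime.fromtimestamp(unix_seconds, tz=timezone.utc)

# Hypothetical usage: frame_timestamp comes from a stream packet and utc_offset from the
# Remote Configuration interface (see the referenced example for the exact calls).
# print(filetime_to_datetime(frame_timestamp + utc_offset))
```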
Integrations
- C++ client library.
- MATLAB and Simulink client library.
- Unity client plugin.
- LabVIEW client library.
- Android client library. (Experimental)
- hl2da plugin: access sensor data from Unity, Unreal, and native UWP applications running on the HoloLens.
Technical Report
Our paper provides an overview of the code, features, and examples for the first released version of the application (1.0.11.0). For newer versions, please refer to the examples in the viewer directory. If hl2ss is useful for your research, please cite our report:
@article{dibene2022hololens,
  title={HoloLens 2 Sensor Streaming},
  author={Dibene, Juan C and Dunn, Enrique},
  journal={arXiv preprint arXiv:2211.02648},
  year={2022}
}
Preparation
Before using the server software, configure your HoloLens as follows:
- Update your HoloLens: Settings -> Update & Security -> Windows Update.
- Enable developer mode: Settings -> Update & Security -> For developers -> Use developer features.
- Enable device portal: Settings -> Update & Security -> For developers -> Device Portal.
- Enable research mode: Refer to the Enabling Research Mode section in HoloLens Research Mode.
Please note that enabling Research Mode on the HoloLens increases battery usage.
Installation (sideloading)
The server application is distributed as a single appxbundle file and can be installed using one of the following two methods.
Method 1 (local)
- On your HoloLens, open Microsoft Edge and navigate to this repository.
- Download the latest appxbundle.
- Open the appxbundle and tap Install.
Method 2 (remote)
- Download the latest appxbundle.
- Go to the Device Portal and navigate to Views -> Apps. Under Deploy apps, select Local Storage, click Browse, and select the appxbundle.
- Click Install, wait for the installation to complete, then click Done.
You can find the server application (hl2ss) in the All apps list.
Permissions
The first time the server runs, it will ask for the permissions it needs to access sensor data. If there are any issues, verify that the server application (hl2ss.exe) has access to:
- Camera (Settings -> Privacy -> Camera).
- Eye tracker (Settings -> Privacy -> Eye tracker).
- Microphone (Settings -> Privacy -> Microphone).
- User movements (Settings -> Privacy -> User movements).
Python client
The Python scripts in the viewer directory demonstrate how to connect to the server, receive the data, unpack it, and decode it in real time. Additional samples show how to associate data from multiple streams. Run the server on your HoloLens and set the host variable of the Python scripts to your HoloLens IP address.
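As a rough sketch of the pattern those scripts follow, the loop below receives Front Camera frames. The module and function names (hl2ss, hl2ss_lnm, rx_pv, get_next_packet, start_subsystem_pv) are taken from the viewer scripts and may differ between versions, so treat this as an outline and check viewer/client_stream_pv.py for the exact API.

```python
import hl2ss
import hl2ss_lnm

host = '192.168.1.7'  # your HoloLens IP address

# The Front Camera subsystem must be started before streaming and stopped afterwards.
hl2ss_lnm.start_subsystem_pv(host, hl2ss.StreamPort.PERSONAL_VIDEO)

# Open the stream with a client-selected resolution and framerate (see the supported configurations).
client = hl2ss_lnm.rx_pv(host, hl2ss.StreamPort.PERSONAL_VIDEO, width=1920, height=1080, framerate=30)
client.open()

for _ in range(300):
    packet = client.get_next_packet()  # blocking receive of the next frame
    print(packet.timestamp)            # device timestamp; packet.payload holds the decoded frame data

client.close()
hl2ss_lnm.stop_subsystem_pv(host, hl2ss.StreamPort.PERSONAL_VIDEO)
```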
Interfaces
- RM VLC: viewer/client_stream_rm_vlc.py
- RM Depth AHAT: viewer/client_stream_rm_depth_ahat.py
- RM Depth Long Throw: viewer/client_stream_rm_depth_longthrow.py
- RM IMU: viewer/client_stream_rm_imu.py
- Front Camera: viewer/client_stream_pv.py
- Microphone (2 channels): viewer/client_stream_microphone.py
- Microphone Array (5 channels): viewer/client_stream_microphone_array.py
- Spatial Input: viewer/client_stream_si.py
- Remote Configuration: viewer/client_ipc_rc.py
- Spatial Mapping: viewer/client_ipc_sm.py
- Scene Understanding: viewer/client_ipc_su.py
- Voice Input: viewer/client_ipc_vi.py
- Unity Message Queue: viewer/client_ipc_umq.py (Plugin Only)
- Extended Eye Tracking: viewer/client_stream_eet.py
- Extended Audio: viewer/client_stream_extended_audio.py
- Extended Video: viewer/client_stream_extended_video.py
- Guest Message Queue: viewer/client_ipc_gmq.py (Plugin Only)
Required packages
Optional packages
TouchDesigner client
Provides a set of TouchDesigner components for receiving and working with hl2ss data in real time. The main scene with all components and samples can be found in the hl2ss_td directory. The core component (hl2ss_core) is essentially a wrapper around the existing Python client, enabling its use within TouchDesigner. Users can build a customized setup by combining multiple hl2ss_core components, each of which provides a single data stream.
Before using the component, create a venv in the hl2ss_td directory and install the packages from requirements.txt, using the same Python version as your TouchDesigner installation.
Requires TouchDesigner 2023.11880+ (tested on Windows).
Unity plugin
For streaming sensor data from a Unity application. A sample Unity project (2020.3.42f1) can be found in the hl2ss_unity directory.
Build and run the sample project
- Open the project in Unity. If the MRTK Project Configurator window pops up just close it.
- Go to Build Settings (File -> Build Settings).
- Switch to Universal Windows Platform.
- Set Target Device to HoloLens.
- Set Architecture to ARM64.
- Set Build and Run on Remote Device (via Device Portal).
- Set Device Portal Address to your HoloLens IP address (e.g., https://192.168.1.7) and set your Device Portal Username and Password.
- Click Build and Run. Unity may ask for a Build folder. You can create a new one named Build.
Adding the plugin to an existing project
- Download the latest plugin zip file and extract the Assets folder into your Unity project folder.
- In the Unity Editor configure the hl2ss, Eye Tracking, and Scene Understanding DLLs as UWP ARM64.
- In the Project window navigate to Assets/Plugins/WSA, select the DLL, and then go to the Inspector window.
- Set SDK to UWP.
- Set CPU to ARM64.
- Click Apply.
- Add the Hololens2SensorStreaming.cs script to the Main Camera.
- Enable the following capabilities (Edit -> Project Settings -> Player -> Publishing Settings):
- InternetClientServer
- InternetClient
- PrivateNetworkClientServer
- Webcam
- Microphone
- Spatial Perception
- Gaze Input
- The plugin also requires the perceptionSensorsExperimental and backgroundSpatialPerception capabilities, which are not available in the Publishing Settings capabilities list. The Editor folder in the plugin zip file contains a script (BuildPostProcessor.cs) that adds the capabilities automatically after building the project. Just extract the Editor folder into the Assets folder of your Unity project. Alternatively, you can manually edit the Package.appxmanifest after building. See here for an example.
Remote Unity Scene
The plugin has basic support for creating and controlling 3D primitives and text objects via TCP for the purpose of sending feedback to the HoloLens user. See the unity_sample Python scripts in the viewer directory for some examples. Some of the supported features include:
- Create primitive: sphere, capsule, cylinder, cube, plane, and quad.
- Set active: enable or disable game object.
- Set world transform: position, rotation, and scale.
- Set local transform: position, rotation, and scale w.r.t. Main Camera.
- Set color: rgba with support for semi-transparency.
- Set texture: upload png or jpg file.
- Create text: creates a TextMeshPro object.
- Set text: sets the text, font size and color of a TextMeshPro object.
- Text to speech: upload text.
- Remove: destroy game object.
- Remove all: destroy all game objects created by the plugin.
To enable this functionality add the RemoteUnityScene.cs script to the Main Camera and set the Material field to BasicMaterial. Alternatively, the mrtk_remote_ui project enables creating simple User Interfaces (windows with controls) remotely to interact with the HoloLens user.
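The unity_sample scripts batch commands into a display list and push it to the plugin over the Unity Message Queue. The sketch below outlines that pattern; the hl2ss_rus module, its command_buffer class, and the method names are assumptions based on those scripts, so verify them against the actual samples before use.

```python
import hl2ss
import hl2ss_lnm
import hl2ss_rus  # helper module used by the unity_sample scripts (assumed name)

host = '192.168.1.7'  # your HoloLens IP address

ipc = hl2ss_lnm.ipc_umq(host, hl2ss.IPCPort.UNITY_MESSAGE_QUEUE)
ipc.open()

# Batch commands into a display list, push it to the plugin, then pull the results
# (e.g., the key of the created object) back from the HoloLens.
buffer = hl2ss_rus.command_buffer()
buffer.begin_display_list()
buffer.create_primitive(hl2ss_rus.PrimitiveType.Sphere)
buffer.set_target_mode(hl2ss_rus.TargetMode.UseLast)  # address the object just created
buffer.set_world_transform(0, [0, 0, 1], [0, 0, 0, 1], [0.1, 0.1, 0.1])  # position, rotation, scale
buffer.set_color(0, [1, 0, 0, 1])  # opaque red
buffer.set_active(0, hl2ss_rus.ActiveState.Active)
buffer.end_display_list()

ipc.push(buffer)
results = ipc.pull(buffer)  # one result per command

ipc.close()
```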
Unreal plugin
For streaming sensor data from an Unreal application. A sample Unreal project (4.27.2) can be found in the hl2ss_unreal directory.
Build and run the sample project
- Open the project in Unreal and rebuild all missing modules. Ignore the Level_BuiltData error.
- Open Project Settings (Edit -> Project Settings). Navigate to Platforms -> HoloLens. Under Packaging -> Signing Certificate, click Generate New. In the Create Private Key Password window that appears click None. Close Project Settings.
- Package the project for HoloLens (File -> Package Project -> HoloLens). Unreal may ask for a destination folder. You can create a new one named Package.
- Install the hl2ss_unreal.appxbundle (generated in the Package/HoloLens folder) on your HoloLens.
- Run the hl2ss unreal app (located in the All apps list).
Adding the plugin to an existing project
- Download the latest plugin zip file and extract the Plugins folder into your Unreal project folder.
- Enable the hl2ss plugin (Edit -> Plugins). Restart the Editor if prompted.
- Add "hl2ss" to PublicDependencyModuleNames in the project .Build.cs.
- Enable the following capabilities (Edit -> Project Settings -> Platforms -> HoloLens):
- Internet Client
- Internet Client Server
- Private Network Client Server
- Microphone
- Webcam
- Gaze Input
- Spatial Perception
- Add +DeviceCapabilityList=backgroundSpatialPerception to Config/HoloLens/HoloLensEngine.ini. See here for an example.
Build from source and deploy
Building the server application and the plugin requires a Windows 10 machine.
- Install the tools.
- Open the Visual Studio solution (sln file in the hl2ss folder) in Visual Studio 2022.
- Set build configuration to Release ARM64. Building for x86, x64 (HoloLens emulator), and ARM is not supported.
- Disable winrt::hresult_error in Debug -> Windows -> Exception Settings -> C++ Exceptions.
- Right click the hl2ss project and select Properties. Navigate to Configuration Properties -> Debugging and set Machine Name to your HoloLens IP address.
- Build (Build -> Build Solution). If you get an error saying that hl2ss.winmd does not exist, copy the hl2ss.winmd file from etc into the hl2ss\ARM64\Release\hl2ss folder.
- Run (Remote Machine). You may need to pair your HoloLens first.
The server application will remain installed on the HoloLens even after power off. The plugin is in the hl2ss\ARM64\Release\plugin folder. If you wish to create the server application appxbundle, right click the hl2ss project and select Publish -> Create App Packages.
Known issues and limitations
- Multiple streams can be active at the same time but only one client per stream is allowed.
- Spatial Input is not supported in flat mode.
References
This project uses the HoloLens 2 Research Mode API and the Cannon library, both available at the HoloLens2ForCV repository. Lossless* depth compression is enabled by the Zdepth library.