HoloKinect (2021)
Next-level telepresence in AR
I used Unity and WebRTC to stream point clouds in real time from an Azure Kinect to a HoloLens 2. The app gives users a teleportation-like experience: from anywhere in the world, they can be transported to wherever the Kinect is.
Much of the rendering speed comes from encoding depth as RGB-D video and from an HLSL shader. Encoding depth into a video stream let depth information travel over the network much faster than streaming a raw array of depth floats. The shader then converts the received RGB-D images into point clouds on the GPU, a task that would otherwise have been a significant overhead on the CPU.
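At its core, the RGB-D-to-point-cloud conversion is a standard pinhole-camera unprojection, which the project runs per pixel in an HLSL shader. A minimal CPU sketch of the same math in Python follows; the intrinsics (`fx`, `fy`, `cx`, `cy`) are hypothetical values for illustration, not the Azure Kinect's actual calibration:

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Unproject a depth image (meters) into camera-space 3D points.

    Pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy, with
    (u, v) the pixel coordinates and Z the depth at that pixel.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.stack([x, y, depth], axis=-1)  # shape (h, w, 3)

# Toy example: a flat wall 2 m in front of the camera,
# with made-up intrinsics centered on pixel (2, 2).
depth = np.full((4, 4), 2.0)
points = depth_to_point_cloud(depth, fx=500.0, fy=500.0, cx=2.0, cy=2.0)
```

The shader version does the same arithmetic per fragment, so the CPU only has to upload the decoded RGB-D frame as a texture.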
What I’ve done
Implemented the streaming pipeline with Unity and WebRTC; developed the AR client software for HoloLens 2 in Unity.