Hand Tracking

The basic input method of the HoloKit Unity SDK

For guidance on the latest version, refer to the GitHub page.

In a HoloKit application, there are several ways for the user to provide input. For example, in the MOFA series we use an Apple Watch as the input device. Hand tracking is the fundamental input method provided by the HoloKit SDK, and it requires no extra device. The hand tracking algorithm tracks the positions of 21 landmarks on each hand. With hand tracking, you can implement some basic AR interactions.
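
For example, a simple interaction such as pinch detection can be built directly on top of the tracked landmark positions. The sketch below assumes the thumb-tip and index-tip landmark transforms are wired up in the Inspector (for instance, from the objects the hand tracker spawns); the SDK's own API for reading joints may differ, so treat this as an illustration of the idea rather than the SDK's actual interface.

```csharp
using UnityEngine;

// A minimal sketch of a basic interaction built on the tracked landmarks:
// detecting a "pinch" when the thumb tip and index finger tip come close together.
// The landmark transforms here are assumptions wired up in the Inspector.
public class PinchDetector : MonoBehaviour
{
    [SerializeField] Transform thumbTip;            // landmark at the tip of the thumb
    [SerializeField] Transform indexTip;            // landmark at the tip of the index finger
    [SerializeField] float pinchThreshold = 0.03f;  // distance in metres that counts as a pinch

    void Update()
    {
        if (thumbTip == null || indexTip == null)
            return;

        float distance = Vector3.Distance(thumbTip.position, indexTip.position);
        if (distance < pinchThreshold)
        {
            Debug.Log("Pinch detected");
        }
    }
}
```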

Please note that the hand tracking algorithm requires an iPhone with a LiDAR sensor. The iPhone models with a LiDAR sensor are the iPhone 12 Pro, iPhone 12 Pro Max, iPhone 13 Pro, iPhone 13 Pro Max, iPhone 14 Pro, and iPhone 14 Pro Max. Without a LiDAR-enabled iPhone, you can still use the other functionalities of the HoloKit SDK; only hand tracking is disabled.
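
If you want your app to degrade gracefully on devices without LiDAR, you can check for environment depth support at runtime. The sketch below uses AR Foundation's occlusion subsystem descriptor; the exact property name depends on your AR Foundation version, and the `handTracker` reference is only a placeholder for whatever object hosts your hand tracking setup.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// A rough runtime check for environment depth (LiDAR) support, so the app can
// fall back gracefully on devices without a LiDAR sensor. Assumes an AR Foundation
// version where environmentDepthImageSupported returns a Supported enum.
public class DepthSupportCheck : MonoBehaviour
{
    [SerializeField] AROcclusionManager occlusionManager;
    [SerializeField] GameObject handTracker; // placeholder: the object hosting hand tracking

    void Start()
    {
        // The descriptor is only available once the occlusion subsystem has been created.
        var descriptor = occlusionManager != null ? occlusionManager.descriptor : null;
        bool depthSupported = descriptor != null &&
            descriptor.environmentDepthImageSupported == Supported.Supported;

        if (!depthSupported && handTracker != null)
        {
            // No LiDAR: keep the rest of the app running but disable hand tracking.
            handTracker.SetActive(false);
        }
    }
}
```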

This section is a tutorial on how to implement the hand tracking sample project. You can import this sample directly by clicking the import button, as shown below.

The rest of this section will demonstrate how to implement the hand tracking sample step by step.

In the last section, we implemented the stereoscopic rendering sample, which is in fact the basic setup shared by all HoloKit projects. We will therefore continue from that sample scene.

Set Up the HoloKit Hand Tracker

To enable hand tracking, we only need to drag the HoloKitHandTracker prefab into the scene. You can find the prefab at Packages->HoloKit SDK->Assets->Prefabs.

Add AROcclusionManager

In order to run the hand tracking algorithm, we need to turn on the LiDAR sensor. In Unity, AROcclusionManager is the component that controls the LiDAR sensor, so we add an AROcclusionManager component to the HoloKitCamera object.

You might notice some settings in the image above. First, we set the Environment Depth Mode field to Fastest. Running the LiDAR sensor is computationally expensive and can quickly overheat the iPhone, so Fastest is chosen to reduce energy consumption. Second, we disabled the Temporal Smoothing field. Temporal smoothing smooths the occlusion edges, but it has no impact on the hand tracking algorithm. Third, we chose No Occlusion in the Occlusion Preference Mode field. This is optional; we chose No Occlusion to prevent the rendered landmarks from being occluded by the user's hands.
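
The same settings can also be applied from a script instead of the Inspector. The sketch below uses AR Foundation's AROcclusionManager API and assumes it is attached to the same object as the occlusion manager (e.g. the HoloKitCamera object).

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// A minimal sketch of the same occlusion configuration done in script.
// Attach this next to the AROcclusionManager on the HoloKitCamera object.
public class OcclusionSettings : MonoBehaviour
{
    void Start()
    {
        var occlusionManager = GetComponent<AROcclusionManager>();

        // Fastest: enough depth data for hand tracking while keeping power draw low.
        occlusionManager.requestedEnvironmentDepthMode = EnvironmentDepthMode.Fastest;

        // Temporal smoothing only affects occlusion edges, not hand tracking, so turn it off.
        occlusionManager.environmentDepthTemporalSmoothingRequested = false;

        // NoOcclusion keeps the rendered landmarks visible even when the hand is in front of them.
        occlusionManager.requestedOcclusionPreferenceMode = OcclusionPreferenceMode.NoOcclusion;
    }
}
```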

Build The Project

Now you are ready to go! Build the project onto your iPhone, enter StAR mode, and insert your iPhone into a HoloKit. Raise your hand in front of you, and you should see 21 landmarks indicating your hand's position in real time.
