Hand Tracking

The basic input of HoloKit Unity SDK



Note: For guidance on the latest version, refer to the GitHub page.

In a HoloKit application, there are several ways to provide input. For example, the MOFA series uses the Apple Watch as a controller. Hand tracking is the fundamental input method that the HoloKit SDK provides, with no extra device needed. The hand tracking algorithm tracks the positions of 21 landmarks on each hand, which is enough to implement basic AR interactions.

Please note that the hand tracking algorithm requires an iPhone with a LiDAR sensor. The iPhone models with a LiDAR sensor are iPhone 12 Pro, iPhone 12 Pro Max, iPhone 13 Pro, iPhone 13 Pro Max, iPhone 14 Pro, and iPhone 14 Pro Max. Without a LiDAR-enabled iPhone, you can still use the other features of the HoloKit SDK; only hand tracking is disabled.
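Rather than hard-coding a device list, you can also check for depth support at runtime. ARFoundation's occlusion subsystem descriptor reports whether environment depth (the LiDAR-backed feature hand tracking relies on) is available. A minimal sketch, assuming ARFoundation 4.2+ (property names differ slightly in earlier versions):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class LidarSupportCheck : MonoBehaviour
{
    [SerializeField] AROcclusionManager occlusionManager;

    void Start()
    {
        // The descriptor is null until the occlusion subsystem has started.
        var descriptor = occlusionManager.descriptor;
        bool hasDepth = descriptor != null &&
            descriptor.environmentDepthImageSupported == Supported.Supported;

        Debug.Log(hasDepth
            ? "Environment depth (LiDAR) available - hand tracking can run."
            : "No environment depth - hand tracking will be disabled.");
    }
}
```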

This section is a tutorial on implementing the hand tracking sample project. You can import the sample directly by clicking the import button shown below, or follow the rest of this section to build it step by step.

In the last section, we implemented the stereoscopic rendering sample, which is in fact the basic setup of every HoloKit project. We will therefore continue from that sample scene.

Setup HoloKit Hand Tracker

To enable hand tracking, simply drag the HoloKitHandTracker prefab into the scene. You can find the prefab at Packages->HoloKit SDK->Assets->Prefabs.
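Once the prefab is in the scene, you can also read the landmark positions from your own scripts. The exact API surface varies between SDK versions, so treat the sketch below as illustrative only; it assumes the tracker parents one GameObject per landmark under itself (check the SDK source for the actual accessors):

```csharp
using UnityEngine;

// Illustrative only: assumes HoloKitHandTracker spawns its 21 landmark
// objects as child transforms. Verify against your SDK version's API.
public class HandLandmarkReader : MonoBehaviour
{
    [SerializeField] Transform handTracker; // the HoloKitHandTracker instance

    void Update()
    {
        for (int i = 0; i < handTracker.childCount; i++)
        {
            // World-space position of landmark i (wrist, finger joints, etc.).
            Vector3 pos = handTracker.GetChild(i).position;
            Debug.Log($"Landmark {i}: {pos}");
        }
    }
}
```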

Add AROcclusionManager

The hand tracking algorithm requires the LiDAR sensor to be running. In Unity, AROcclusionManager is the component that controls the LiDAR sensor, so we add an AROcclusionManager component to the HoloKitCamera object.

You might notice some configurations in the image above. First, we set the EnvironmentDepthMode field to Fastest: running the LiDAR sensor is computationally expensive and quickly overheats the iPhone, so Fastest saves energy. Second, we disabled TemporalSmoothing: temporal smoothing smooths the occlusion edges but has no impact on the hand tracking algorithm. Third, we set OcclusionPreferenceMode to NoOcclusion. This is optional; it prevents the rendered landmarks from being occluded by the user's hands.
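If you prefer to configure these settings from code rather than the Inspector, the same three choices map onto standard AROcclusionManager properties (names as in ARFoundation 4.2+; earlier versions name some of these differently):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class OcclusionSetup : MonoBehaviour
{
    void Start()
    {
        var occlusion = GetComponent<AROcclusionManager>();

        // Fastest: cheapest depth mode, reduces heat and battery drain.
        occlusion.requestedEnvironmentDepthMode = EnvironmentDepthMode.Fastest;

        // Temporal smoothing only affects occlusion edges, not hand tracking.
        occlusion.environmentDepthTemporalSmoothingRequested = false;

        // NoOcclusion keeps the rendered landmarks visible in front of your hand.
        occlusion.requestedOcclusionPreferenceMode = OcclusionPreferenceMode.NoOcclusion;
    }
}
```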

Build The Project

Now you are ready to go! Build the project onto your iPhone, enter StAR mode, and insert your iPhone into a HoloKit. Raise your hand in front of you and you should see 21 landmarks tracking your hand's position in real time.

(Image: The algorithm can track 21 landmarks of the hand)
(Image: Click the import button to import the hand tracking sample)
(Image: Add the HoloKitHandTracker prefab into the scene)
(Image: Add AROcclusionManager to turn on the LiDAR sensor)