# Realtime Hand

Unity AR package to track your hand in realtime!

As seen in "Let's All Be Wizards!": https://apps.apple.com/app/id1609685010
## Features

- 60 FPS hand detection
- 3D bone detection in world space
## Sample Videos

SampleSmall.mov

SampleVideoLightning.mov
## Requirements

- Unity 2020.3 LTS
- ARFoundation
- iPhone with LiDAR support (iPhone 12 Pro or later)
## Installation

- Add the `RealtimeHand` package to your manifest
- Add the `SwiftSupport` package to enable Swift development
- Check the `RealtimeHandSample` for usage
## Classes

### RTHand.Joint

- `screenPos`: 2D position in normalized screen coordinates
- `texturePos`: 2D position in normalized CPU image coordinates
- `worldPos`: 3D position in world space
- `name`: name of the joint, matching the native one
- `distance`: distance from the camera, in meters
- `isVisible`: whether the joint has been identified (from the native pose detection)
- `confidence`: confidence of the detection
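These fields are all you need to consume a joint. Below is a minimal sketch that places a marker at a joint's world position; the `JointMarker` class, the 0.5 confidence threshold, and the assumed `Vector3`/`float` field types are illustrative, not part of the package API.

```csharp
using UnityEngine;

// Minimal sketch: position a marker at a detected joint.
// The confidence threshold and the field types are assumptions.
public class JointMarker : MonoBehaviour
{
    [SerializeField] Transform marker;

    public void Apply(RTHand.Joint joint)
    {
        // Skip joints the native pose detection did not identify,
        // or whose detection confidence is too low.
        if (!joint.isVisible || joint.confidence < 0.5f)
        {
            marker.gameObject.SetActive(false);
            return;
        }

        marker.gameObject.SetActive(true);
        marker.position = joint.worldPos;   // 3D position in world space
        // joint.screenPos gives the normalized 2D screen position (e.g., for UI),
        // joint.distance the distance from the camera in meters.
    }
}
```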
### RTHand.RealtimeHandManager

Does most of the heavy work for you: just add it to your project and subscribe to the `HandUpdated` event to be notified when a hand pose has been detected.
Steps:

- Create a GameObject
- Add the `RealtimeHandManager` component
- Configure it with the `ARSession`, `ARCameraManager`, and `AROcclusionManager` objects
- Subscribe to `Action<RealtimeHand> HandUpdated;` to be notified (see the sketch below)

The `AROcclusionManager` must be configured with `temporalSmoothing=Off` and `mode=fastest` for optimal results.
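A minimal sketch of those steps, assuming the manager reference is wired up in the Inspector; the `HandListener` class itself is illustrative, not part of the package.

```csharp
using UnityEngine;
using RTHand;

// Minimal sketch: subscribe to the manager's HandUpdated event.
// HandListener and its Inspector wiring are illustrative assumptions.
public class HandListener : MonoBehaviour
{
    [SerializeField] RealtimeHandManager handManager;

    void OnEnable()  { handManager.HandUpdated += OnHandUpdated; }
    void OnDisable() { handManager.HandUpdated -= OnHandUpdated; }

    void OnHandUpdated(RealtimeHand hand)
    {
        if (!hand.IsVisible)
            return;

        // Joints is the dictionary described under RTHand.RealtimeHand below.
        foreach (var joint in hand.Joints.Values)
        {
            if (joint.isVisible)
                Debug.Log($"{joint.name}: {joint.worldPos} at {joint.distance} m");
        }
    }
}
```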
### RTHand.RealtimeHand

If you want full control over the flow, you can manually initialize and drive the hand detection process yourself: more work, but more control.
#### Properties

- `IsInitialized`: whether the object has been properly initialized (i.e., the ARSession has been retrieved)
- `IsVisible`: whether the hand is currently visible
- `Joints`: dictionary of all the joints
#### Functions

- `Initialize(ARSession _session, ARCameraManager _arCameraManager, Matrix4x4 _unityDisplayMatrix)`: initializes the object with the required components. The session must already be in tracking mode.
- `Dispose()`: releases the component and its resources
- `Process(CPUEnvironmentDepth _environmentDepth, CPUHumanStencil _humanStencil)`: launches the detection using the depth buffers

Check the `RealtimeHandManager` source as an example, or the sketch below.
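A rough sketch of the manual flow, assuming `RealtimeHand` is constructed directly, the display matrix is cached from `ARCameraManager.frameReceived`, and a hypothetical `OnDepthReady` hook stands in for however the CPU depth images are produced in your pipeline; the `RealtimeHandSample` shows the real acquisition path.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using RTHand;

// Minimal sketch of the manual Initialize/Process/Dispose flow.
// Construction, lazy initialization, and OnDepthReady are assumptions.
public class ManualHandDriver : MonoBehaviour
{
    [SerializeField] ARSession session;
    [SerializeField] ARCameraManager cameraManager;

    RealtimeHand _hand;
    Matrix4x4 _displayMatrix = Matrix4x4.identity;

    void OnEnable()
    {
        cameraManager.frameReceived += OnFrameReceived;
    }

    void OnFrameReceived(ARCameraFrameEventArgs args)
    {
        // Cache the display matrix that maps the camera image to the screen.
        if (args.displayMatrix.HasValue)
            _displayMatrix = args.displayMatrix.Value;

        // Initialize lazily: the session must already be in tracking mode.
        if (_hand == null && ARSession.state == ARSessionState.SessionTracking)
        {
            _hand = new RealtimeHand();
            _hand.Initialize(session, cameraManager, _displayMatrix);
        }
    }

    // Hypothetical hook: call with the per-frame CPU depth images.
    void OnDepthReady(CPUEnvironmentDepth environmentDepth, CPUHumanStencil humanStencil)
    {
        if (_hand != null && _hand.IsInitialized)
            _hand.Process(environmentDepth, humanStencil);
    }

    void OnDisable()
    {
        cameraManager.frameReceived -= OnFrameReceived;
        _hand?.Dispose();
        _hand = null;
    }
}
```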
## Under the hood

When a camera frame is received:

- Execute `VNDetectHumanHandPoseRequest` synchronously to retrieve a 2D pose estimation from the OS
- Retrieve the `environmentDepth` and `humanStencil` CPU images
- From the 2D position of each bone, sample its distance from the depth images and reconstruct a 3D world position (sketched below)
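That last step amounts to a standard unprojection. A minimal sketch using Unity's `Camera.ScreenToWorldPoint` for clarity; the package's internal math may differ.

```csharp
using UnityEngine;

// Illustrative sketch of the reconstruction step: pair a joint's normalized
// screen position with the metric depth sampled from the depth image, then
// unproject to a world-space point.
public static class DepthUnprojection
{
    public static Vector3 ToWorld(Camera cam, Vector2 normalizedScreenPos, float depthMeters)
    {
        // Scale normalized coordinates to pixels; z is the distance from the
        // camera at which to place the point.
        var screenPoint = new Vector3(
            normalizedScreenPos.x * cam.pixelWidth,
            normalizedScreenPos.y * cam.pixelHeight,
            depthMeters);

        return cam.ScreenToWorldPoint(screenPoint);
    }
}
```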
## References

- LinkedIn Original Post: https://www.linkedin.com/posts/oliviergoguel_unity-arkit-arfoundation-activity-6896360209703407616-J3K7
- Making Of: https://www.linkedin.com/feed/update/urn:li:activity:6904398846399524864/
## Revisions

- Fix compatibility with Unity 2020.3
- Added Lightning Shader & effects
- Initial Release