
Commit 677dc46

Add OpenXR hand tracking demo
1 parent 9a0c857 commit 677dc46

29 files changed, +2694 -0 lines changed
+5
@@ -0,0 +1,5 @@
# Ignore our Android build folder, should be installed by user if needed
android/

# Ignore vendor plugin add on folder, should be installed by user if needed
addons/godotopenxrvendors/
+159
@@ -0,0 +1,159 @@

# XR Hand Tracking Demo

This is a demo showing OpenXR's hand tracking and controller tracking logic.

Language: GDScript

Renderer: Compatibility

> Note: this demo requires Godot 4.3 or later.

## Screenshots

![Screenshot](screenshots/hand_tracking_demo.png)

## How does it work?

Being able to see the player's hands, and having those hands interact with elements in the environment, is paramount to a good XR experience.

In this demo we look at the off-the-shelf logic for displaying a hand model that is animated based on either controller input or optical tracking of the player's hands.
We also implement logic, driven by input from the action map, that allows the user to pick up the blocks in this demo.

The problem this poses is that there have been two schools of thought around what hand tracking actually means,
and depending on the XR runtime in use there may be gaps in functionality.

### Hand tracking is only for optical tracking

The first school of thought treats hand tracking as a separate system that focuses purely on optical hand tracking.
The hand tracking API in OpenXR, even if reported as supported, will only provide data if optical hand tracking is used.

This means that when controllers are used, no data is available and you as a developer have to come up with your own
solution for displaying a hand mesh and animating it according to controller input.
Note that the current version of Godot XR Tools contains a full solution for this.

Equally, in this line of thought, the action map only applies to controller input.
You as a developer are responsible for implementing some means of gesture recognition when optical hand tracking is active.

This becomes extra nightmarish when both controller tracking and optical hand tracking need to be supported in a single application.

### The unified approach

The second school of thought ignores the differences between controller tracking and optical hand tracking
and treats them as two versions of the same thing.
Especially with controllers like the Valve Index, or with various data gloves that are treated as controllers,
there is no discernible difference here.

The hand tracking API is mostly used for visualising the player's hand, with bone positions either being inferred
from controller input or matching the optical tracking.
For advanced gesture recognition you would still use this data; however, it is now accessible regardless of
the physical means by which this data is obtained.

At the same time, in this school of thought the action map system is seen as the primary means to gain input
and is no longer restricted to input from controllers. The XR runtime is now responsible for recognising base
gestures such as pinching and pointing, resulting in inputs that can be bound in the action map.

OpenXR is moving towards this approach and this demo has been built in accordance with it; however, not all runtimes have been updated yet.

SteamVR has followed this approach for a long time and works out of the box. However, SteamVR treats everything as controllers, resulting in some shortcomings when a Quest is used over Meta Link or Steam Link and optical hand tracking is used.

Meta's native Quest runtime on all versions of Quest now supports OpenXR's "data source extension", which Godot enables when hand tracking is enabled.
However, Meta does not yet support OpenXR's "hand interaction profile extension", which this demo requires.

Meta Link is still trailing behind and does not support this brave new world **yet**.

Other runtimes, such as Pico's, HTC's, Varjo's, and Magic Leap's, may or may not yet support the required extensions.

### Conclusion

Due to the uneven growth in capabilities across XR runtimes,
and there being no solid way to detect the full limitations of the platform you are currently on,
Godot XR Tools does not support the hand tracking API and relies purely on its own inferred hand positioning approach.

However, with more and more runtimes adopting these new extensions, it is becoming possible
for any solution that targets platforms with support to rely on the hand tracking API.

This demo project shows what that future looks like.

## Hand tracking API

As mentioned, the hand tracking API is at the center of visualising the user's hand.
In Godot 4.3 we overhauled the system so that the XR interface is responsible for converting hand tracking data to the Godot humanoid skeleton hand bone layout.
This also means that this logic works in WebXR, OpenXR, and any other XR interface that adds support for this feature.

Hand tracking now also makes use of the new Skeleton Modifier logic in Godot 4.3; however,
the skeleton is posed in the hand's local space, while positioning is provided through an XRNode3D node.

This split is applied because:

* positioning is always within the local space of the XROrigin3D node
* there are many use cases where the positioning may be ignored or modified

> Note that the trackers used for the hand tracking API are `/user/hand_tracker/left` and `/user/hand_tracker/right`.
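
To make the split concrete, the sketch below shows one way such a setup could be wired up from code; in practice you would normally build this in the scene editor, and the hand scene and node paths here are placeholders rather than assets from this project.

```gdscript
extends XROrigin3D
# Sketch only: wires up the positioning / posing split described above
# for the left hand. The hand scene and node paths are placeholders,
# not assets from this demo.

func _ready() -> void:
    # Positioning: an XRNode3D follows the hand tracker within the
    # local space of this XROrigin3D node.
    var hand_anchor := XRNode3D.new()
    hand_anchor.tracker = "/user/hand_tracker/left"
    hand_anchor.pose = "default"
    add_child(hand_anchor)

    # Visualisation: a rigged hand mesh using the Godot humanoid
    # skeleton hand bone layout (placeholder path).
    var hand_scene: Node3D = preload("res://models/left_hand.tscn").instantiate()
    hand_anchor.add_child(hand_scene)

    # Posing: the skeleton modifier reads the same tracker and poses the
    # bones in the hand's local space, independent of the anchor above.
    var hand_modifier := XRHandModifier3D.new()
    hand_modifier.hand_tracker = "/user/hand_tracker/left"
    hand_scene.get_node("Armature/Skeleton3D").add_child(hand_modifier)
```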

## (Half) Body Tracking API

An honourable mention: this is not part of this demo, but Godot now also has support
for half and full body tracking that includes hand tracking. This functionality, however, is only
available on a limited number of XR runtimes.

## Action map

As mentioned, we're using the action map here for input; however, when optical hand tracking is used
we rely on OpenXR's hand interaction profile extension. Without support for this extension this demo
will not fully function.

This could be solved by checking whether an interaction profile has been bound to our XRController3D node,
and performing our own gesture detection when none is.
As this would greatly increase the complexity of this demo, and the expectation is that this extension
will soon see wide adoption, this is left out of the demo.
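
For the curious, a rough sketch of what that fallback could look like is shown below; it is not part of this demo. It checks the interaction profile bound to the controller's tracker and, when none is bound, performs a simple thumb-to-index pinch check through the hand tracking API. The signal name and threshold are made up for illustration.

```gdscript
extends XRController3D
# Sketch only, not part of this demo: fall back to our own pinch
# detection when the XR runtime has not bound an interaction profile
# to this controller/hand.

signal pinch_changed(pinching: bool)  # made-up signal for illustration

const PINCH_DISTANCE := 0.02  # metres, illustrative threshold

@export var hand_tracker_name: String = "/user/hand_tracker/left"

var _pinching := false

func _process(_delta: float) -> void:
    var controller_tracker = XRServer.get_tracker(tracker)
    if controller_tracker is XRPositionalTracker and not controller_tracker.profile.is_empty():
        return  # An interaction profile is bound, rely on the action map instead.

    var hand_tracker = XRServer.get_tracker(hand_tracker_name)
    if hand_tracker is XRHandTracker and hand_tracker.has_tracking_data:
        var thumb_tip: Vector3 = hand_tracker.get_hand_joint_transform(XRHandTracker.HAND_JOINT_THUMB_TIP).origin
        var index_tip: Vector3 = hand_tracker.get_hand_joint_transform(XRHandTracker.HAND_JOINT_INDEX_FINGER_TIP).origin
        var pinching := thumb_tip.distance_to(index_tip) < PINCH_DISTANCE
        if pinching != _pinching:
            _pinching = pinching
            pinch_changed.emit(_pinching)
```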

> Some headsets will support the simple controller when hand tracking is enabled.
> The simple controller interaction profile doesn't support an input for grabbing.
> In this scenario you can grab the cubes using the pinch gesture
> (touch the tip of your thumb with the tip of your index finger).

We are not using the default action map and instead have created an action map specific to this use case.

There are only two actions needed for this example:
- `pose` is used to position the XRController3D nodes and is mapped to the grip pose in most cases
- `pickup` is used as the input for picking up an object, and is mapped accordingly.

The pickup logic itself is split into two components (a rough sketch follows below):

* `pickup_handler.gd/.tscn` is an Area3D node that is added as a child to an XRController3D node and handles the logic for that hand to pick up objects in range.
* `pickup_able_body.gd` is a script that can be added to a RigidBody3D node to make it possible to pick up and drop that object.
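
The actual scripts live in this project; the sketch below only illustrates the general shape of the handler side, with the `pick_up`/`let_go` calls standing in for whatever the pickup-able body script exposes.

```gdscript
extends Area3D
# Sketch in the spirit of pickup_handler.gd, not the demo's actual code:
# added as a child of an XRController3D node, it grabs the nearest
# overlapping body exposing a pick-up API while `pickup` is held.

var held_body: RigidBody3D

func _ready() -> void:
    var controller := get_parent() as XRController3D
    if controller:
        controller.button_pressed.connect(_on_button_pressed)
        controller.button_released.connect(_on_button_released)

func _on_button_pressed(action_name: String) -> void:
    if action_name != "pickup" or held_body:
        return
    for body in get_overlapping_bodies():
        if body is RigidBody3D and body.has_method("pick_up"):
            held_body = body as RigidBody3D
            held_body.call("pick_up", self)  # hypothetical method on the pickup-able body
            break

func _on_button_released(action_name: String) -> void:
    if action_name == "pickup" and held_body:
        held_body.call("let_go")  # hypothetical counterpart that drops the body
        held_body = null
```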

> Note that the trackers used by the action map are `left_hand` and `right_hand`.

### MSFT Hand interaction extension

Microsoft introduced a hand interaction extension that Godot now supports and that is configured for this project.
Several other vendors, such as Meta, have added support for this extension as well.

With this extension both grab gestures and pinch gestures are supported, so you can pick up the blocks in this project by making a grab motion (making a fist).

### HTC Hand interaction extension

HTC introduced a hand interaction extension that Godot now supports; however, it has not been implemented in this project.
This extension introduces two new trackers, requiring you to change the trackers on the XRController3D node to make this work.

## Local floor reference space

A final notable element is that this demo uses the local floor reference space.

With this reference space the XR runtime will center the player on the XROrigin3D node when the user triggers the recenter logic.
The startup behavior differs between XR runtimes: Quest will attempt to remember where you recentered last, while SteamVR tends to reset this to the default.
It can thus not be guaranteed that the player is in the correct spot when the demo starts.
Hence the instructions suggesting that the user recenter.
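
For reference, a typical startup script for a project like this initialises OpenXR and can listen for the recenter event; the snippet below is a generic sketch rather than this demo's exact code.

```gdscript
extends Node3D
# Generic OpenXR startup sketch; not necessarily this demo's exact code.

var xr_interface: OpenXRInterface

func _ready() -> void:
    xr_interface = XRServer.find_interface("OpenXR") as OpenXRInterface
    if xr_interface and xr_interface.is_initialized():
        # OpenXR drives the frame timing, so turn off window v-sync.
        DisplayServer.window_set_vsync_mode(DisplayServer.VSYNC_DISABLED)
        get_viewport().use_xr = true

        # Emitted when the user triggers the runtime's recenter action.
        xr_interface.pose_recentered.connect(_on_pose_recentered)

func _on_pose_recentered() -> void:
    # With the local floor reference space the runtime has already
    # recentered the player on the XROrigin3D node; respond here only
    # if the game needs to reposition anything else.
    pass
```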

## Running on PCVR

This project can be run as normal for PCVR. Ensure that an OpenXR runtime has been installed.
This project has been tested with the Oculus client and SteamVR OpenXR runtimes.
Note that Godot currently can't run using the WMR OpenXR runtime; install SteamVR with WMR support instead.

## Running on standalone VR

You must install the Android build templates and the OpenXR loader plugin, and configure an export template for your device.
Please follow [the instructions for deploying on Android in the manual](https://docs.godotengine.org/en/stable/tutorials/xr/deploying_to_android.html).
Binary file not shown.
