Hi all - I wanted to share a small tool I’ve been building and get feedback from the ROS community.
Hand Tracking Streamer (HTS) is a free Meta Quest app that streams real-time hand landmarks (21 per hand, MediaPipe-style) plus a 6-DoF wrist pose to a workstation. The goal is to make VR hand tracking usable as a practical input device for robotics workflows: teleoperation, demonstration logging, and simulation prototyping.

HTS installs directly from the Meta Quest Store and can run over wireless or wired networking, depending on your setup. It also comes with a Python SDK for parsing the stream into typed frames, conversion helpers, and simple visualization/logging tools:
```
pip install hand-tracking-sdk
```
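To make "typed frames" concrete, here is a minimal sketch of what parsing one hand's data into a typed structure could look like. The field names, the flat wire layout (timestamp + position + quaternion + 21 points), and `HandFrame`/`parse_frame` are illustrative assumptions, not the actual `hand-tracking-sdk` API; only the 21-landmark MediaPipe-style model comes from the description above.

```python
# Hypothetical sketch of a "typed frame" for one tracked hand.
# Layout and names are assumptions for illustration; the real
# hand-tracking-sdk types and wire format may differ.
from dataclasses import dataclass
from typing import List, Tuple

NUM_LANDMARKS = 21  # MediaPipe-style hand model: wrist + 4 joints per finger

@dataclass
class HandFrame:
    timestamp: float                                      # seconds
    wrist_position: Tuple[float, float, float]            # 6-DoF wrist pose: translation
    wrist_orientation: Tuple[float, float, float, float]  # quaternion (x, y, z, w)
    landmarks: List[Tuple[float, float, float]]           # 21 points

def parse_frame(values: List[float]) -> HandFrame:
    """Parse a flat float array: 1 timestamp + 3 position + 4 quaternion + 21*3 points."""
    expected = 1 + 3 + 4 + NUM_LANDMARKS * 3
    if len(values) != expected:
        raise ValueError(f"expected {expected} floats, got {len(values)}")
    ts = values[0]
    pos = (values[1], values[2], values[3])
    quat = (values[4], values[5], values[6], values[7])
    pts = [(values[8 + 3*i], values[9 + 3*i], values[10 + 3*i])
           for i in range(NUM_LANDMARKS)]
    return HandFrame(ts, pos, quat, pts)
```

The point of a typed layer like this is that downstream consumers (loggers, retargeting code, ROS bridges) work against named fields rather than raw float offsets.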
Links

- HTS (GitHub): https://github.com/wengmister/hand-tracking-streamer
- Meta Quest Store: https://www.meta.com/experiences/hand-tracking-streamer/26303946202523164/
- Python SDK + docs: https://github.com/wengmister/hand-tracking-sdk | https://hand-tracking-sdk.readthedocs.io/en/latest/
- ROS 2 bridge (WIP): https://github.com/wengmister/hand-tracking-sdk-ros2 (ROS 2 SDK to consume Hand Tracking Streamer telemetry)
ROS 2 status:
I'm now building a ROS 2 package, hand_tracking_sdk_ros2. At the moment it's a bridge node that publishes the wrist pose and hand landmarks to ROS 2 topics, broadcasts the wrist TF, and includes an RViz visualization path so you can see the hand geometry immediately. The package is still under active development, and I expect some breaking changes as the interfaces settle.
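For anyone curious what the wrist TF carries, here is a small sketch of the underlying math: a unit quaternion (x, y, z, w) plus a translation defines the 4x4 homogeneous transform from the tracking frame to the wrist frame. This is plain quaternion-to-matrix math, not code from the package; frame conventions in the actual bridge may differ.

```python
# Quaternion + translation -> 4x4 homogeneous transform (row-major).
# Illustrates the pose a wrist TF broadcast represents; not the
# bridge's actual implementation.
import math

def quat_to_matrix(x, y, z, w, tx, ty, tz):
    """Build a homogeneous transform from quaternion (x, y, z, w) and translation."""
    n = math.sqrt(x*x + y*y + z*z + w*w)
    x, y, z, w = x/n, y/n, z/n, w/n  # normalize defensively
    return [
        [1 - 2*(y*y + z*z), 2*(x*y - z*w),     2*(x*z + y*w),     tx],
        [2*(x*y + z*w),     1 - 2*(x*x + z*z), 2*(y*z - x*w),     ty],
        [2*(x*z - y*w),     2*(y*z + x*w),     1 - 2*(x*x + y*y), tz],
        [0.0, 0.0, 0.0, 1.0],
    ]
```

Applying this to each incoming wrist pose is all a TF broadcaster conceptually does before handing the transform to tf2.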
I’d love to learn from you: what features would make this most useful in your ROS 2 stack, and what examples or demos would be most helpful to publish first (e.g., RViz-only quickstart, arm teleop, simulation bridge, gesture/state machine, hand retargeting, etc.)? Ideas and PRs are very welcome.