Hi all,
One more fun project:
Sharing a project I built using the Metis AIPU and Voyager SDK — a gesture-controlled racing game controller that uses hand tracking to play any keyboard-based racing game.
How it works:
- YOLO11n-pose-hands runs on Metis for real-time 21-keypoint hand detection
- Two hands detected = steering wheel mode — the tilt angle between your wrists controls steering direction
- Per-hand gesture recognition on CPU: fist = accelerate, open palm = brake
- Keyboard inputs are simulated via pynput, so it works with any game that uses arrow keys or WASD
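The two-hand steering step boils down to the tilt of the line through both wrists. A minimal sketch of that math (function names and the exact key mapping are my own illustration, not taken from the project):

```python
import math

def steering_angle(left_wrist, right_wrist):
    """Signed tilt (degrees) of the line through both wrist keypoints; 0 = level.

    Keypoints are (x, y) in image coordinates, so y grows downward:
    a right hand higher in the frame gives a negative angle.
    """
    dx = right_wrist[0] - left_wrist[0]
    dy = right_wrist[1] - left_wrist[1]
    return math.degrees(math.atan2(dy, dx))

def steering_command(angle_deg, deadzone=15.0):
    """Map tilt to a steering direction, ignoring small angles near level."""
    if angle_deg > deadzone:
        return "right"   # wheel tilted clockwise
    if angle_deg < -deadzone:
        return "left"    # wheel tilted counter-clockwise
    return None          # inside the deadzone: drive straight

# With pynput, the returned direction would translate into a held arrow key,
# roughly: from pynput.keyboard import Controller, Key; Controller().press(Key.right)
```

Returning a plain direction string keeps the geometry testable separately from the keyboard side, which needs a display server to run.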
Pipeline setup:
- Used the `hand-keypoints.yaml` pipeline config with a `DecodeHandPose` custom postprocessor extending `DecodeYoloPose` for 21 hand keypoints instead of the default 17 COCO body keypoints
- Gesture classification from keypoint geometry (finger extension/curl ratios relative to wrist)
- EMA smoothing + 15° deadzone on steering to prevent jitter
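The fist/palm classification from keypoint geometry can be sketched as comparing each fingertip's distance from the wrist against its PIP joint's distance. Note the keypoint index layout below (wrist = 0, fingertips = 8/12/16/20, PIP joints = 6/10/14/18) is an assumption on my part; the model's actual ordering may differ:

```python
import math

# Assumed 21-keypoint hand layout; verify against the model's output order.
TIPS = (8, 12, 16, 20)   # index/middle/ring/pinky fingertips
PIPS = (6, 10, 14, 18)   # corresponding PIP joints

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def classify_gesture(kps, extended_ratio=1.3):
    """'open' when most fingertips reach clearly farther from the wrist
    than their PIP joints (fingers extended), 'fist' otherwise."""
    wrist = kps[0]
    extended = sum(
        dist(kps[tip], wrist) > extended_ratio * dist(kps[pip], wrist)
        for tip, pip in zip(TIPS, PIPS)
    )
    return "open" if extended >= 3 else "fist"
```

Using a ratio rather than absolute distances keeps the rule invariant to hand size and distance from the camera.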


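The EMA smoothing + deadzone combination on the steering signal could look something like this (parameter values are illustrative, not the project's actual settings):

```python
class SmoothedSteering:
    """Exponential moving average over the raw tilt angle, plus a deadzone,
    so per-frame detection jitter doesn't rattle the arrow keys."""

    def __init__(self, alpha=0.5, deadzone=15.0):
        self.alpha = alpha        # higher = more responsive, less smoothing
        self.deadzone = deadzone  # degrees around level treated as straight
        self.value = 0.0          # smoothed angle state

    def update(self, raw_angle):
        # Blend the new measurement with the running estimate.
        self.value = self.alpha * raw_angle + (1.0 - self.alpha) * self.value
        # Suppress small tilts entirely so the car tracks straight.
        if abs(self.value) < self.deadzone:
            return 0.0
        return self.value
```

Applying the deadzone after smoothing (rather than before) means a brief spike in one frame gets averaged away instead of firing a key press.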