Gesture libraries

Where multimodal interaction enters the stack

Gesture libraries should not feel like isolated device adapters. They should be the typed interaction layer that feeds the same orchestration system as text, voice, automation events, and tool results.

Sense

Capture taps, holds, slides, tilt, haptic output requests, and other multimodal interaction signals.

Normalize

Convert device-specific details into stable event types your app can rely on across transports and products.

Interpret

Map gestures to app intents, tool affordances, navigation actions, or loop control signals.

Respond

Return haptics, UI updates, confirmations, and state changes so the interaction feels closed-loop and legible.
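The four stages above can be sketched as one typed pipeline. This is a minimal illustration, not a real library API: the type names (`RawTouchSample`, `GestureEvent`, `AppIntent`, `Haptic`) and all thresholds are assumptions made for the example.

```rust
/// Sense: a hypothetical raw sample from one specific touch controller.
pub struct RawTouchSample {
    pub duration_ms: u32, // contact duration
    pub dx: f32,          // horizontal travel in device units
}

/// Normalize: stable, device-independent event types.
#[derive(Debug, Clone, Copy, PartialEq)]
pub enum GestureEvent {
    Tap,
    Hold { duration_ms: u32 },
    Slide { dx: f32 },
}

/// Interpret: app intents the orchestration layer consumes.
#[derive(Debug, PartialEq)]
pub enum AppIntent {
    Confirm,
    OpenContextMenu,
    Navigate { delta: i32 },
}

/// Respond: haptic primitives returned to close the loop.
#[derive(Debug, PartialEq)]
pub enum Haptic {
    LightTick,
    DoublePulse,
}

/// Convert device-specific details into a stable event (thresholds invented).
pub fn normalize(s: &RawTouchSample) -> GestureEvent {
    if s.dx.abs() > 8.0 {
        GestureEvent::Slide { dx: s.dx }
    } else if s.duration_ms >= 500 {
        GestureEvent::Hold { duration_ms: s.duration_ms }
    } else {
        GestureEvent::Tap
    }
}

/// Map a stable event to an app intent.
pub fn interpret(e: GestureEvent) -> AppIntent {
    match e {
        GestureEvent::Tap => AppIntent::Confirm,
        GestureEvent::Hold { .. } => AppIntent::OpenContextMenu,
        GestureEvent::Slide { dx } => AppIntent::Navigate {
            delta: if dx > 0.0 { 1 } else { -1 },
        },
    }
}

/// Pair each intent with a haptic confirmation so the loop feels closed.
pub fn respond(i: &AppIntent) -> Haptic {
    match i {
        AppIntent::Confirm => Haptic::LightTick,
        _ => Haptic::DoublePulse,
    }
}
```

Each stage stays testable in isolation because the boundaries are plain data types rather than callbacks into device code.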

What makes a good gesture API

  • Stable event types that are easy to serialize and test
  • Clear separation between raw sensor data and interpreted app intents
  • Haptic output primitives that pair naturally with gesture input
  • Consistent semantics across simulation and physical hardware
  • Documentation that teaches patterns, not just enums and method names
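"Stable event types that are easy to serialize and test" can be made concrete with an explicit, versioned wire form. The sketch below hand-rolls a tiny text format using only the standard library; a real library would more likely derive serde's `Serialize`/`Deserialize`, and the format shown is an invention for this example.

```rust
/// A stable event type with an explicit, versioned wire form.
#[derive(Debug, Clone, Copy, PartialEq)]
pub enum GestureEvent {
    Tap,
    Hold { duration_ms: u32 },
}

impl GestureEvent {
    /// Serialize to a stable textual form, e.g. "v1;hold;500".
    pub fn to_wire(&self) -> String {
        match self {
            GestureEvent::Tap => "v1;tap".to_string(),
            GestureEvent::Hold { duration_ms } => format!("v1;hold;{duration_ms}"),
        }
    }

    /// Parse the wire form back; returns None for anything unrecognized,
    /// including unknown versions, so old readers fail loudly rather than
    /// misinterpret new data.
    pub fn from_wire(s: &str) -> Option<Self> {
        let parts: Vec<&str> = s.split(';').collect();
        match parts.as_slice() {
            ["v1", "tap"] => Some(GestureEvent::Tap),
            ["v1", "hold", ms] => ms
                .parse()
                .ok()
                .map(|duration_ms| GestureEvent::Hold { duration_ms }),
            _ => None,
        }
    }
}
```

Round-tripping through the wire form is also what makes these events cheap to assert on in tests and to replay across transports.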

Why this matters

Gestures become truly reusable when they are modeled as part of application orchestration. That is how they move from product feature to open-source library.

Suggested implementation flow

  1. Define a typed event model for gestures and haptics.
  2. Feed those events into the same state and orchestration layer as other inputs.
  3. Use simulator-backed tests to validate interaction semantics early.
  4. Document the event model in Rustdoc and the integration patterns in this portal.
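Step 3 might look like the following: an in-memory simulator replays a scripted sequence of events through the same interpreter a hardware-backed pipeline would use. The `Simulator` type and its script format are assumptions made for this sketch, not an existing API.

```rust
/// Stable event and intent types under test (illustrative).
#[derive(Debug, Clone, Copy, PartialEq)]
pub enum GestureEvent {
    Tap,
    Hold,
}

#[derive(Debug, PartialEq)]
pub enum AppIntent {
    Confirm,
    OpenContextMenu,
}

/// A minimal in-memory simulator: replays scripted events so interaction
/// semantics can be validated without physical hardware.
pub struct Simulator {
    script: Vec<GestureEvent>,
}

impl Simulator {
    pub fn new(script: Vec<GestureEvent>) -> Self {
        Self { script }
    }

    /// Run the script through an interpreter and collect the resulting
    /// intents, exactly as the hardware path would produce them.
    pub fn run(&self, interpret: impl Fn(GestureEvent) -> AppIntent) -> Vec<AppIntent> {
        self.script.iter().copied().map(interpret).collect()
    }
}

/// The interaction semantics being validated.
pub fn interpret(e: GestureEvent) -> AppIntent {
    match e {
        GestureEvent::Tap => AppIntent::Confirm,
        GestureEvent::Hold => AppIntent::OpenContextMenu,
    }
}
```

Because the simulator and the device path share one event model, a passing simulator test carries over directly to hardware.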