Simulator

Simulator-first development for the Gestura stack

The simulator should be more than a hardware preview. It should be one of the fastest ways to learn the event model, validate the agentic loop, test gesture semantics, and exercise protocol integrations before everything depends on physical devices.

What the simulator is for

Validate event semantics

Use the simulator to verify that taps, slides, tilt, haptics, and multimodal signals become the typed events your application expects.
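The idea above can be sketched in a few lines. This is a hypothetical event type and classifier, with invented names, not the real Gestura API: raw simulator samples become enum variants the application can match on.

```rust
// Sketch only: hypothetical typed events; names are illustrative.
#[derive(Debug, PartialEq)]
enum SimEvent {
    Tap { x: f32, y: f32 },
    Slide { dx: f32, dy: f32 },
    Tilt { degrees: f32 },
}

/// Turn a raw simulator sample (a kind tag plus two payload floats)
/// into the typed event the application layer expects.
fn classify(kind: &str, a: f32, b: f32) -> Option<SimEvent> {
    match kind {
        "tap" => Some(SimEvent::Tap { x: a, y: b }),
        "slide" => Some(SimEvent::Slide { dx: a, dy: b }),
        "tilt" => Some(SimEvent::Tilt { degrees: a }),
        _ => None, // unknown kinds are dropped, never guessed at
    }
}

fn main() {
    let e = classify("tap", 10.0, 20.0);
    assert_eq!(e, Some(SimEvent::Tap { x: 10.0, y: 20.0 }));
    println!("classified: {:?}", e);
}
```

The point of the sketch is the shape of the check: every raw sample either becomes a typed event or is explicitly discarded, so nothing ambiguous reaches the application.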

Test the agentic loop

Run loop flows without waiting on hardware: collect input, resolve context, execute tools, inspect state transitions, and confirm final responses.
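The loop phases named above can be modeled as a small state machine so transitions are inspectable in the simulator. The phase names here are assumptions for illustration, not the actual orchestration types:

```rust
// Illustrative sketch of the loop phases; not the real orchestration API.
#[derive(Debug, PartialEq, Clone, Copy)]
enum LoopPhase {
    CollectInput,
    ResolveContext,
    ExecuteTools,
    Respond,
    Done,
}

/// Advance one phase. The real loop may branch or iterate; this sketch
/// only makes the happy-path transitions explicit and checkable.
fn advance(p: LoopPhase) -> LoopPhase {
    use LoopPhase::*;
    match p {
        CollectInput => ResolveContext,
        ResolveContext => ExecuteTools,
        ExecuteTools => Respond,
        Respond => Done,
        Done => Done,
    }
}

/// Record every phase visited until the loop completes.
fn run_to_completion(mut p: LoopPhase) -> Vec<LoopPhase> {
    let mut trace = vec![p];
    while p != LoopPhase::Done {
        p = advance(p);
        trace.push(p);
    }
    trace
}

fn main() {
    let trace = run_to_completion(LoopPhase::CollectInput);
    assert_eq!(trace.len(), 5);
    println!("{:?}", trace);
}
```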

Develop gesture libraries

Iterate on gesture interpretation, event normalization, and feedback patterns before locking behavior to a device-specific implementation.
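Event normalization is the piece that most benefits from simulator iteration. A minimal sketch, assuming slide deltas arrive in raw device pixels: map them into a device-independent [-1.0, 1.0] range before gesture interpretation runs.

```rust
// Hypothetical normalization step; the scale factor and the assumption
// that deltas arrive in pixels are illustrative, not from the real stack.
fn normalize_delta(raw: f32, full_scale: f32) -> f32 {
    (raw / full_scale).clamp(-1.0, 1.0)
}

fn main() {
    // A 240 px swipe on a 480 px-wide surface becomes 0.5.
    assert_eq!(normalize_delta(240.0, 480.0), 0.5);
    // Out-of-range input is clamped rather than passed through.
    assert_eq!(normalize_delta(1000.0, 480.0), 1.0);
    println!("normalization ok");
}
```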

Exercise protocol boundaries

Use simulator-backed workflows to test MCP tools, resources, approvals, and external capability exposure in a safer environment.
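A protocol-boundary check of this kind can be exercised entirely in the simulator. The sketch below uses assumed tool names and an invented approval flag; it is not the real MCP integration, only the shape of the gate: a tool call succeeds only if the tool is explicitly exposed and, when flagged as sensitive, approved.

```rust
// Sketch of an approval-gated tool boundary; names are invented.
fn execute_tool(name: &str, approved: bool) -> Result<String, String> {
    // (tool name, requires approval) -- an explicit, closed list.
    const EXPOSED: &[(&str, bool)] = &[("read_state", false), ("send_haptic", true)];
    match EXPOSED.iter().find(|(n, _)| *n == name) {
        None => Err(format!("tool '{name}' is not exposed")),
        Some((_, needs_approval)) if *needs_approval && !approved => {
            Err(format!("tool '{name}' requires approval"))
        }
        Some(_) => Ok(format!("executed '{name}'")),
    }
}

fn main() {
    assert!(execute_tool("read_state", false).is_ok());
    assert!(execute_tool("send_haptic", false).is_err());
    assert!(execute_tool("send_haptic", true).is_ok());
    println!("boundary checks ok");
}
```

Driving both the approved and unapproved paths in the simulator is exactly the kind of check that is slow and risky to do against live hardware or a live protocol peer.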

Recommended workflow

  1. Pick the interaction or loop behavior you want to validate.
  2. Drive the app with simulated gesture, haptic, or protocol events.
  3. Observe how those events move through the core libraries and orchestration layers.
  4. Confirm the resulting UI, tool actions, and feedback semantics are understandable.
  5. Only then depend on physical hardware for final tuning and real-world behavior checks.
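Steps 1 through 4 can be sketched as a stub pipeline: feed simulated events in, collect the observable outcomes, and assert they match intent. The event strings and outcomes below are placeholders, not the real pipeline.

```rust
// Minimal workflow sketch: simulated events in, observable outcomes out.
// All names are illustrative stand-ins for the real libraries.
fn pipeline(events: &[&str]) -> Vec<String> {
    events
        .iter()
        .map(|e| match *e {
            "tap" => "ui: select".to_string(),
            "slide" => "ui: scroll".to_string(),
            other => format!("ignored: {other}"),
        })
        .collect()
}

fn main() {
    let out = pipeline(&["tap", "slide", "wiggle"]);
    assert_eq!(out, vec!["ui: select", "ui: scroll", "ignored: wiggle"]);
    println!("{out:?}");
}
```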

Core rule

Use the simulator to prove semantics and system behavior early. Use hardware later to tune feel, fidelity, and real-world interaction nuances.

High-value validation scenarios

Gesture mapping

Verify that raw interactions become stable application intents before they affect navigation, tools, or agent state.
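"Stable" here usually means thresholding: a raw sample only becomes an intent once it clears a dead zone, so jitter never reaches navigation or agent state. The intent names and threshold value below are assumptions for illustration:

```rust
// Sketch: raw vertical delta -> stable intent, with a dead zone.
#[derive(Debug, PartialEq)]
enum Intent {
    ScrollDown,
    ScrollUp,
    None,
}

fn map_intent(dy: f32) -> Intent {
    const DEAD_ZONE: f32 = 0.05; // normalized units; value is illustrative
    if dy > DEAD_ZONE {
        Intent::ScrollDown
    } else if dy < -DEAD_ZONE {
        Intent::ScrollUp
    } else {
        Intent::None // jitter inside the dead zone is discarded
    }
}

fn main() {
    assert_eq!(map_intent(0.02), Intent::None);
    assert_eq!(map_intent(0.4), Intent::ScrollDown);
    println!("intent mapping ok");
}
```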

Loop observability

Watch iteration boundaries, approvals, tool results, and response streaming to confirm the loop stays legible to developers and users.
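One concrete way to keep the loop legible is a plain checkpoint log: every iteration boundary, approval, and tool result gets a labeled entry that can be replayed and read by a human. The labels here are illustrative:

```rust
// Observability sketch: a flat, human-readable checkpoint log.
struct LoopLog {
    entries: Vec<String>,
}

impl LoopLog {
    fn new() -> Self {
        LoopLog { entries: Vec::new() }
    }

    /// Record one checkpoint, tagged with the loop iteration it belongs to.
    fn checkpoint(&mut self, iteration: u32, label: &str) {
        self.entries.push(format!("iter {iteration}: {label}"));
    }
}

fn main() {
    let mut log = LoopLog::new();
    log.checkpoint(1, "input collected");
    log.checkpoint(1, "tool approved: send_haptic");
    log.checkpoint(1, "response streamed");
    assert_eq!(log.entries.len(), 3);
    assert_eq!(log.entries[1], "iter 1: tool approved: send_haptic");
    println!("{:#?}", log.entries);
}
```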

Haptic feedback semantics

Check that feedback patterns match the event they acknowledge: confirmation, warning, progress, completion, or failure.
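A simulator check for this can be as simple as asserting that each of the five outcomes maps to a distinct pattern, so a confirmation can never be mistaken for a failure. The millisecond pulse patterns below are invented placeholders:

```rust
// Sketch: outcome -> vibration pattern (millisecond pulses).
// The patterns are placeholders; only the distinctness check matters.
#[derive(Debug, PartialEq, Clone, Copy)]
enum Outcome {
    Confirmation,
    Warning,
    Progress,
    Completion,
    Failure,
}

fn haptic_pattern(o: Outcome) -> Vec<u16> {
    match o {
        Outcome::Confirmation => vec![20],
        Outcome::Warning => vec![40, 40],
        Outcome::Progress => vec![10, 10, 10],
        Outcome::Completion => vec![20, 60],
        Outcome::Failure => vec![80, 80, 80],
    }
}

fn main() {
    // Every outcome must map to a unique pattern.
    let all = [Outcome::Confirmation, Outcome::Warning, Outcome::Progress,
               Outcome::Completion, Outcome::Failure];
    for i in 0..all.len() {
        for j in (i + 1)..all.len() {
            assert_ne!(haptic_pattern(all[i]), haptic_pattern(all[j]));
        }
    }
    println!("feedback patterns distinct");
}
```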

MCP capability exposure

Validate that protocol-visible tools and resources remain intentionally small, safe, and aligned with internal policy.
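"Intentionally small" is easiest to enforce with an explicit allowlist between internal capabilities and the protocol-visible surface. The capability names below are invented for illustration:

```rust
// Sketch: internal capabilities pass through an explicit allowlist
// before being advertised over the protocol. Names are invented.
fn exposed_capabilities<'a>(internal: &[&'a str]) -> Vec<&'a str> {
    const ALLOWLIST: &[&str] = &["read_state", "list_gestures"];
    internal
        .iter()
        .copied()
        .filter(|c| ALLOWLIST.contains(c))
        .collect()
}

fn main() {
    let internal = ["read_state", "reset_device", "list_gestures", "debug_dump"];
    let visible = exposed_capabilities(&internal);
    // Dangerous internals never leak into the protocol-visible set.
    assert_eq!(visible, vec!["read_state", "list_gestures"]);
    println!("{visible:?}");
}
```

The design point is that exposure is opt-in: adding an internal capability changes nothing externally until someone deliberately extends the allowlist.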

What a strong simulator flow should expose

  • The exact typed events being emitted or consumed
  • The state changes those events trigger in the loop or app shell
  • The haptic or UI feedback returned to the user
  • Any policy, approvals, or tool execution checkpoints
  • A clear path from simulator behavior to generated Rustdoc for exact APIs
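The first four items in this list can be captured as one record per simulator step, so a flow can be audited end to end. The field names here are assumed, not the real trace format:

```rust
// Sketch of a per-step flow record; field names are illustrative.
#[derive(Debug)]
struct FlowRecord {
    event: String,            // the typed event emitted or consumed
    state_change: String,     // resulting loop or app-shell transition
    feedback: String,         // haptic or UI feedback returned
    checkpoints: Vec<String>, // policy, approval, or tool checkpoints hit
}

fn main() {
    let record = FlowRecord {
        event: "Tap { x: 10.0, y: 20.0 }".into(),
        state_change: "CollectInput -> ResolveContext".into(),
        feedback: "haptic: confirmation".into(),
        checkpoints: vec!["approval: send_haptic".into()],
    };
    assert_eq!(record.checkpoints.len(), 1);
    println!("{record:#?}");
}
```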

Why this matters for open source

Open-source contributors learn faster when they can observe a system before they have to fully assemble it. A good simulator narrows that gap: it makes the interaction model concrete, gives contributors something to validate against, and helps examples, protocol bridges, and gesture libraries converge on the same semantics.