
xr skill

/skills/xr

This skill helps you navigate the boundary between AR and VR, offering XR guidance on comfort, privacy, and platform constraints.

npx playbooks add skill plurigrid/asi --skill xr

Review the files below or copy the command above to add this skill to your agents.

Files (1)
SKILL.md
755 B
---
name: xr
description: Extended reality (XR/MR) technology. Mixed reality and spatial computing across AR and VR capabilities.
license: Apache-2.0
metadata:
  trit: 0
  version: "1.0.0"
  bundle: reality-tech
---

# XR / MR (Extended or Mixed Reality)

Use when the user needs guidance across the AR/VR boundary: passthrough + world-locked content + immersive interaction.

Default to the umbrella skill `ar-vr-xr` unless the request is clearly AR-only (`ar`) or VR-only (`vr`).

Key concerns to surface:
- Comfort + locomotion choices
- Sensor privacy (camera, room mapping)
- Platform constraints (runtime, permissions, capability availability)
- Shared state correctness under faults: use `jepsen-testing`
- Device-specific guidance: `varjo-xr-4`

Overview

This skill provides practical guidance for building and deploying extended reality (XR/MR) experiences that sit between AR and VR. It focuses on mixed reality patterns: passthrough, world-locked content, and immersive interaction across devices. Use it to align design, privacy, and platform constraints for spatial computing projects.

How this skill works

The skill inspects the interaction context (passthrough vs fully immersive) and recommends architecture and UX tradeoffs for comfort, locomotion, and sensor privacy. It highlights platform-specific constraints, required permissions, and runtime capabilities, and suggests testing approaches for shared-state correctness and device-specific tuning. Outputs are concise design and engineering prescriptions you can apply directly to apps and prototypes.
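The context inspection described above can be sketched as a small decision function. This is an illustrative model only, assuming hypothetical names (`XrContext`, `classifyContext`, `concernsFor`); it is not a real runtime API, but it shows how a blend mode maps to the concerns this skill surfaces.

```typescript
// Hypothetical sketch: classify an XR request into a blend mode and derive
// the baseline concerns to surface. All names are illustrative assumptions.
type BlendMode = "passthrough" | "immersive" | "mixed";

interface XrContext {
  usesCameraPassthrough: boolean;
  hasWorldLockedContent: boolean;
}

function classifyContext(ctx: XrContext): BlendMode {
  if (ctx.usesCameraPassthrough && ctx.hasWorldLockedContent) return "mixed";
  if (ctx.usesCameraPassthrough) return "passthrough";
  return "immersive";
}

function concernsFor(mode: BlendMode): string[] {
  const concerns = ["platform constraints"];
  // Camera/room mapping is in play whenever the real world is visible.
  if (mode !== "immersive") concerns.push("sensor privacy");
  // Vection and locomotion comfort matter whenever virtual motion dominates.
  if (mode !== "passthrough") concerns.push("comfort + locomotion");
  return concerns;
}
```

A mixed-mode request, for example, surfaces both sensor privacy and comfort concerns on top of platform constraints.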

When to use it

  • Designing experiences that combine real-world passthrough with world-locked virtual content
  • Choosing locomotion and comfort models for mixed reality applications
  • Assessing sensor privacy and room-mapping tradeoffs before release
  • Planning cross-platform XR projects where AR and VR overlap
  • Validating multi-device shared state and resilience under faults

Best practices

  • Default to an umbrella XR approach unless the brief is explicitly AR-only or VR-only
  • Prioritize user comfort: minimize vection, provide configurable locomotion, and include vignette/fade options
  • Explicitly surface camera and mapping permissions, and design local-first privacy controls
  • Document platform capability fallbacks and test on low-capability runtimes early
  • Use jepsen-style fault-injection and deterministic tests for shared-state synchronization
  • Tune visuals and tracking per device (e.g., optimize for high-fidelity Varjo-like headsets separately)
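The vignette/fade practice above can be made concrete with a minimal sketch: scale a comfort vignette with angular velocity, fading in above an onset threshold and clamping at full strength. The threshold values are made-up tuning defaults, not standards; real values come from the short iterative comfort tests recommended here.

```typescript
// Illustrative comfort-vignette sketch (not a real engine API).
// Returns 0 (no vignette) to 1 (fully narrowed field of view).
function vignetteStrength(
  angularVelocityDegPerSec: number,
  onsetDegPerSec = 30, // assumed default: below this, no vignette
  maxDegPerSec = 120   // assumed default: at or above this, full vignette
): number {
  const t =
    (angularVelocityDegPerSec - onsetDegPerSec) /
    (maxDegPerSec - onsetDegPerSec);
  return Math.min(1, Math.max(0, t)); // clamp to [0, 1]
}
```

Making both thresholds user-configurable is what turns this from a fixed effect into the adjustable comfort option the practice calls for.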

Example use cases

  • A mixed-reality training app that overlays step-by-step guidance on real equipment using passthrough
  • A collaborative spatial whiteboard that keeps annotations world-locked and synchronized across headsets
  • A prototyping checklist for porting an AR app to a semi-immersive MR headset with limited sensors
  • A privacy audit and permission UI for an XR app that requires room mapping and camera access
  • Stress-testing shared scene state across disconnections and race conditions using fault injection
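The last use case above, jepsen-style stress-testing of shared scene state, can be sketched deterministically: replicas apply timestamped writes under last-writer-wins, a fault plan drops some deliveries, and a repair pass re-delivers everything so replicas must converge. All names (`Replica`, `broadcast`, `converged`) are illustrative, not from any real testing library.

```typescript
// Hedged sketch of fault injection for shared annotation state.
type Write = { key: string; value: string; ts: number };

class Replica {
  state = new Map<string, Write>();
  apply(w: Write) {
    const cur = this.state.get(w.key);
    if (!cur || w.ts > cur.ts) this.state.set(w.key, w); // last-writer-wins
  }
}

// Deliver each write to each replica unless the fault plan drops it.
function broadcast(writes: Write[], replicas: Replica[], dropped: Set<number>) {
  writes.forEach((w, i) =>
    replicas.forEach((r, ri) => {
      if (!dropped.has(i * replicas.length + ri)) r.apply(w); // inject loss
    })
  );
}

function converged(replicas: Replica[]): boolean {
  const snap = (r: Replica) => JSON.stringify([...r.state.entries()].sort());
  return replicas.every((r) => snap(r) === snap(replicas[0]));
}
```

A test would first broadcast with drops and assert divergence is detected, then re-deliver all writes and assert convergence, which is exactly the "resilience under faults" validation listed above.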

FAQ

Should I treat XR as AR or VR when starting a project?

Treat it as XR by default. Only narrow to AR or VR when hardware or UX requirements clearly exclude the other mode.

How do I ensure comfort across devices?

Implement adjustable locomotion, reduce acceleration/rotation cues, offer seated/standing modes, and measure comfort in short iterative tests.
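One common way to reduce rotation cues, as suggested above, is snap turning: discrete yaw steps instead of smooth rotation, which avoids sustained rotational vection. This sketch assumes hypothetical names and tuning defaults; it is not a real device API.

```typescript
// Illustrative snap-turn state machine: emits one discrete yaw step per
// stick flick, then requires the stick to recenter before the next snap.
class SnapTurner {
  private armed = true;
  constructor(
    private snapAngleDeg = 30, // assumed default step size
    private deadzone = 0.7     // assumed default activation threshold
  ) {}

  // Call once per frame with stick X in [-1, 1].
  // Returns the yaw delta to apply: +/- snapAngleDeg, or 0.
  update(stickX: number): number {
    if (Math.abs(stickX) < this.deadzone) {
      this.armed = true; // stick recentered: re-arm for the next snap
      return 0;
    }
    if (!this.armed) return 0; // held past the deadzone: no repeated snap
    this.armed = false;
    return stickX > 0 ? this.snapAngleDeg : -this.snapAngleDeg;
  }
}
```

Exposing the snap angle and deadzone as user settings, alongside a smooth-rotation option, is one way to deliver the adjustable locomotion described above.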