
ar skill

/skills/ar

This skill helps you design and implement web and native augmented reality experiences with stable world-locked content, proper permissions, and safety practices.

npx playbooks add skill plurigrid/asi --skill ar

Review the files below or copy the command above to add this skill to your agents.

SKILL.md
---
name: ar
description: Augmented reality (AR) techniques. Passthrough overlays, hit-test/anchors, occlusion, lighting, camera permissions, and safety.
license: Apache-2.0
metadata:
  trit: 0
  version: "1.0.0"
  bundle: reality-tech
---

# AR (Augmented Reality)

Use when the user is building AR on WebXR (immersive-ar) or native (ARKit/ARCore/OpenXR).

## Fast Defaults

- Prefer stable world-locked content; avoid screen-locked UI unless necessary
- Use clear permission prompts and visible camera-on indicators
- Start with: hit-test + placement + anchors + simple interaction
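The starter flow above (hit-test + placement + anchors) can be sketched with the WebXR Hit Test and Anchors modules. A minimal sketch, with rendering and error handling omitted; `session` and `frame` are the live `XRSession` and `XRFrame`:

```javascript
// Request a hit-test source once, from a 'viewer' reference space,
// so rays originate at the device/screen center (WebXR Hit Test Module).
async function makeHitTestSource(session) {
  const viewerSpace = await session.requestReferenceSpace('viewer');
  return session.requestHitTestSource({ space: viewerSpace });
}

// Per frame: take the first hit and create an anchor at its pose, so the
// placed object stays world-locked as tracking refines (Anchors module).
async function placeAtFirstHit(frame, hitTestSource) {
  const hits = frame.getHitTestResults(hitTestSource);
  if (hits.length === 0) return null; // no surface found yet
  return hits[0].createAnchor();      // resolves to a world-locked XRAnchor
}
```

In a render loop you would call `placeAtFirstHit` on a user tap, then pose content each frame from the anchor's `anchorSpace`.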

## AR Capability Checklist

- Plane / mesh detection
- Hit-test (placing content onto surfaces)
- Anchors (persistent placement)
- Occlusion (depth) and lighting estimation if available
- Safety boundaries (keep critical UI in view, avoid distraction)
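On WebXR, the capabilities above map to feature descriptors passed at session creation. A hedged sketch: the feature names follow the WebXR module specs, but device/browser support varies, so everything beyond hit-test is requested as optional:

```javascript
// Start an immersive-ar session with checklist capabilities declared as
// optional features, so the session still starts on devices lacking some.
async function startArSession(xr = navigator.xr) {
  if (!xr || !(await xr.isSessionSupported('immersive-ar'))) {
    throw new Error('immersive-ar not supported');
  }
  return xr.requestSession('immersive-ar', {
    requiredFeatures: ['hit-test'],          // placement is essential
    optionalFeatures: [
      'anchors',                             // persistent placement
      'plane-detection',                     // surfaces
      'depth-sensing',                       // occlusion
      'light-estimation',                    // lighting
    ],
    // Only consulted where 'depth-sensing' is granted:
    depthSensing: {
      usagePreference: ['cpu-optimized'],
      dataFormatPreference: ['luminance-alpha'],
    },
  });
}
```

After the session starts, check `session.enabledFeatures` (where available) before relying on any optional capability.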

## Debug Checklist

- Confirm session type (WebXR `immersive-ar`) or native runtime
- Verify tracking origin and world scale (1 unit = 1 meter)
- Confirm anchor lifetimes and relocalization handling
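Two small checks from the list above, assuming WebXR: `environmentBlendMode` distinguishes a real AR session from an opaque (VR-style) one, and a measured distance between two tracked real-world points should match a tape measure if 1 unit = 1 meter:

```javascript
// Per the WebXR spec, environmentBlendMode is 'additive' or 'alpha-blend'
// on AR devices and 'opaque' on fully immersive (VR) ones.
function isArBlendMode(session) {
  return session.environmentBlendMode !== 'opaque';
}

// Scale sanity check: WebXR poses are in meters. Compare the distance
// between two hit-test poses on known real-world points (e.g. opposite
// table edges) against a physical measurement; a mismatch means a broken
// scale assumption somewhere in the content pipeline.
function distanceMeters(a, b) {
  // a, b: {x, y, z} positions, e.g. from XRRigidTransform.position
  return Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
}
```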

See also: `ar-vr-xr` for the full workflow and shared performance/comfort guidance.

Overview

This skill covers building practical augmented reality (AR) experiences for WebXR (immersive-ar) and native platforms (ARKit/ARCore/OpenXR). It focuses on core AR features like passthrough overlays, hit-testing, anchors, occlusion, lighting estimation, camera permissions, and safety practices. The guidance helps you start with stable world-locked content and grow to persistent, well-behaved AR scenes.

How this skill works

The skill inspects and guides implementation of surface detection (planes or meshes), hit-test placement, and anchors for persistent content. It also verifies occlusion/depth handling and lighting estimation when available, and enforces camera permission prompts and visible camera-on indicators. Debug checks validate session type, tracking origin, scale assumptions, and anchor relocalization behavior.

When to use it

  • Building WebXR immersive-ar experiences or native AR apps (ARKit/ARCore/OpenXR).
  • Placing virtual objects onto real-world surfaces with hit-test and anchors.
  • Implementing occlusion and lighting to improve realism.
  • Needing clear camera permission flows and visible camera indicators.
  • Designing safe, world-locked interfaces that avoid distracting the user.

Best practices

  • Prefer stable world-locked content; avoid screen-locked UI unless required.
  • Start with hit-test + placement + anchors before adding complex interactions.
  • Show clear permission prompts and a persistent camera-on indicator.
  • Use lighting estimation and occlusion where supported to increase believability.
  • Keep critical UI in the user’s field of view and enforce safety boundaries.
  • Test tracking origin, world scale (1 unit = 1 meter), and anchor relocalization.
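For the permission-prompt practice, one web option is to query camera permission state before entering AR, so the app can explain the request up front instead of surprising the user with a browser prompt. A sketch; the `'camera'` permission name is not queryable in every browser, hence the fallback:

```javascript
// Check camera permission state via the Permissions API before requesting
// the AR session. Browsers that don't recognize the 'camera' name throw,
// which we treat as "will prompt".
async function cameraPermissionState(permissions = navigator.permissions) {
  try {
    const status = await permissions.query({ name: 'camera' });
    return status.state; // 'granted' | 'denied' | 'prompt'
  } catch {
    return 'prompt';     // unqueryable: assume the UA will prompt
  }
}
```

Pair this with a persistent in-experience camera-on indicator once the session is live.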

Example use cases

  • Place and persist furniture or product models in a room using hit-test and anchors.
  • Overlay maintenance instructions on machinery with occlusion and lighting adjustments.
  • Create an AR navigation aid that stays world-locked and respects safety boundaries.
  • Build a medical training scene where accurate scale and anchor stability matter.
  • Prototype passthrough overlays for mixed-reality visualizations with permission indicators.

FAQ

How do I choose between world-locked and screen-locked UI?

Prefer world-locked UI for spatial tasks; use screen-locked UI only for constant controls or when safety requires it.

What should I test for anchors during relocalization?

Verify anchor lifetime, whether anchors survive app restarts or device relocalization, and fallback behavior if relocalization fails.
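A sketch of that restart-survival test on WebXR, assuming the draft persistent-anchors extension of the Anchors module (`requestPersistentHandle` / `restorePersistentAnchor`), which has limited browser support; the fallback path re-runs placement when restoration fails:

```javascript
// Persist an anchor's UUID so the placement can be restored next launch.
async function saveAnchor(anchor, storage) {
  const uuid = await anchor.requestPersistentHandle();
  storage.setItem('placement-anchor', uuid);
  return uuid;
}

// Try to restore the saved anchor; return null if there is none or if
// relocalization fails, so the caller can fall back to the placement flow.
async function restoreAnchor(session, storage) {
  const uuid = storage.getItem('placement-anchor');
  if (!uuid) return null;
  try {
    return await session.restorePersistentAnchor(uuid);
  } catch {
    return null; // relocalization failed: re-run hit-test placement
  }
}
```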