
ios-simulator-qa skill

/plugins/testing/skills/ios-simulator-qa

This skill automates iOS Simulator QA using simctl and Claude Vision to capture screenshots, test UI, and validate visuals across devices.

npx playbooks add skill willsigmon/sigstack --skill ios-simulator-qa

Review the files below or copy the command above to add this skill to your agents.

Files (1)
SKILL.md
3.3 KB
---
name: iOS Simulator QA
description: iOS Simulator automation - screenshot capture, UI testing, visual QA for SwiftUI apps
allowed-tools: Read, Edit, Bash, mcp__xcode__build, mcp__xcode__run_tests
model: sonnet
---

# iOS Simulator QA Expert

Automate iOS testing and visual QA using the Simulator.

## For Vibe Coders
You don't need to know Xcode testing frameworks. Use simctl commands and Claude Vision to QA your iOS apps.

## Simulator Control (simctl)

### List Simulators
```bash
xcrun simctl list devices
# Shows available iPhone/iPad simulators
```

### Boot Simulator
```bash
# Boot specific device
xcrun simctl boot "iPhone 15 Pro"

# Open Simulator app
open -a Simulator
```

### Screenshot
```bash
# Capture current screen
xcrun simctl io booted screenshot ~/Desktop/ios-qa.png

# With specific simulator
xcrun simctl io "iPhone 15 Pro" screenshot ~/Desktop/screen.png
```

### Screen Recording
```bash
# Start recording
xcrun simctl io booted recordVideo ~/Desktop/recording.mp4

# Stop with Ctrl+C
```
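For scripted runs with nobody at the keyboard, the recording can also be stopped programmatically: `recordVideo` finalizes the file on SIGINT (the same signal Ctrl+C sends), so a sketch like this (the `record_for` helper name is ours) backgrounds the recorder and signals it after a fixed duration.

```bash
#!/bin/bash
# record_for: capture N seconds of video from the booted simulator, then
# stop the recorder by sending SIGINT to the backgrounded process.
record_for() {
  local seconds="$1" out="$2"
  xcrun simctl io booted recordVideo "$out" &
  local pid=$!
  sleep "$seconds"
  kill -INT "$pid"
  wait "$pid" 2>/dev/null
}

# Example: record a 10-second clip
# record_for 10 ~/Desktop/flow.mp4
```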

## Automated QA Flow

### Full Device Matrix
```bash
#!/bin/bash
# qa-all-devices.sh

devices=("iPhone SE (3rd generation)" "iPhone 15 Pro" "iPhone 15 Pro Max" "iPad Pro (12.9-inch)")

for device in "${devices[@]}"; do
  echo "Testing on $device..."

  # Boot device and block until boot completes (avoids a flaky fixed sleep)
  xcrun simctl bootstatus "$device" -b

  # Install app
  xcrun simctl install "$device" build/MyApp.app

  # Launch app
  xcrun simctl launch "$device" com.example.myapp

  # Wait for launch
  sleep 3

  # Screenshot
  xcrun simctl io "$device" screenshot "qa-${device// /_}.png"

  # Shutdown
  xcrun simctl shutdown "$device"
done

echo "Screenshots ready for review!"
```

### Dark Mode Toggle
```bash
# Enable dark mode
xcrun simctl ui booted appearance dark
xcrun simctl io booted screenshot dark-mode.png

# Enable light mode
xcrun simctl ui booted appearance light
xcrun simctl io booted screenshot light-mode.png
```

### Accessibility Testing
```bash
# simctl cannot toggle VoiceOver; validate VoiceOver manually in the Simulator

# Increase Dynamic Type size
xcrun simctl ui booted content_size extra-large
xcrun simctl io booted screenshot large-text.png

# Reset to the system default size (large)
xcrun simctl ui booted content_size large
```
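To cover several Dynamic Type sizes in one pass, a loop over size names works. The names below come from the `simctl ui` help text (verify against your Xcode version); the `command -v` guard simply skips the calls on machines without Xcode tools.

```bash
#!/bin/bash
# Capture a screenshot at each Dynamic Type size in the list
sizes=(extra-small medium extra-large accessibility-extra-large)

if command -v xcrun >/dev/null 2>&1; then
  for size in "${sizes[@]}"; do
    xcrun simctl ui booted content_size "$size"
    xcrun simctl io booted screenshot "dynamic-type-${size}.png"
  done
fi
```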

## Deep Linking for Navigation

```bash
# Open specific screen via URL scheme
xcrun simctl openurl booted "myapp://settings"
sleep 2
xcrun simctl io booted screenshot settings-screen.png

xcrun simctl openurl booted "myapp://profile"
sleep 2
xcrun simctl io booted screenshot profile-screen.png
```
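The per-screen commands above generalize to a loop over routes. The route names here are the hypothetical `myapp://` paths from the example, not real endpoints; substitute your app's URL scheme.

```bash
#!/bin/bash
# Snapshot each deep-linked screen of the (hypothetical) myapp:// scheme
routes=(settings profile)

if command -v xcrun >/dev/null 2>&1; then
  for route in "${routes[@]}"; do
    xcrun simctl openurl booted "myapp://${route}"
    sleep 2   # give the screen time to render before capturing
    xcrun simctl io booted screenshot "${route}-screen.png"
  done
fi
```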

## Push Notification Testing

```bash
# Send test notification
cat > notification.apns << EOF
{
  "aps": {
    "alert": {
      "title": "Test Notification",
      "body": "This is a test"
    }
  }
}
EOF

xcrun simctl push booted com.example.myapp notification.apns
```
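Since Xcode 11.4 the payload can name its target app via a top-level "Simulator Target Bundle" key, which lets you omit the bundle id from the push command (and also makes the `.apns` file drag-and-droppable onto the Simulator window). `com.example.myapp` is the placeholder bundle id used throughout this skill.

```bash
#!/bin/bash
# Payload that embeds the target bundle id
cat > notification-bundled.apns << 'EOF'
{
  "Simulator Target Bundle": "com.example.myapp",
  "aps": {
    "alert": {
      "title": "Test Notification",
      "body": "Bundle id read from the payload"
    }
  }
}
EOF

# No bundle id argument needed now
if command -v xcrun >/dev/null 2>&1; then
  xcrun simctl push booted notification-bundled.apns
fi
```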

## Integration with Claude Vision

After capturing screenshots, drag them into Claude Code and ask:
- "Review this iOS screen for UI issues"
- "Is this following iOS Human Interface Guidelines?"
- "Check accessibility - text size, contrast, touch targets"

## Common Issues to Check
- [ ] Safe area handling (notch, home indicator)
- [ ] Keyboard appearance handling
- [ ] Landscape orientation
- [ ] Dynamic Type sizes
- [ ] Dark mode colors
- [ ] VoiceOver labels

Use when: iOS visual testing, device matrix testing, automated screenshot capture

Overview

This skill automates iOS Simulator control, screenshot capture, UI testing, and visual QA for SwiftUI apps using simctl and CLI workflows. It focuses on fast device-matrix runs, dark/light mode checks, accessibility toggles, deep-link navigation, and integration with image-review tools. Use it to generate consistent visual artifacts for review and automated reporting.

How this skill works

The skill uses xcrun simctl commands to list, boot, install, launch, record, and capture screenshots from simulators. Scripts iterate a device matrix, toggle appearance and accessibility settings, open URL schemes for targeted screens, and push test notifications. Captured images are reviewed by a vision/QA tool to detect UI issues and accessibility regressions.

When to use it

  • Run automated visual checks across multiple simulator types and sizes.
  • Capture dark mode and dynamic type variations for UI review.
  • Smoke-test navigation flows via deep links before a build release.
  • Validate accessibility states (large text, VoiceOver basics) quickly.
  • Collect screenshots or recordings for bug reports and design review.

Best practices

  • Maintain a small, representative device matrix (phone sizes and an iPad) to balance coverage and speed.
  • Boot simulators once per run, install and launch the app, then capture screenshots programmatically.
  • Use deterministic waits and health checks instead of fixed sleeps where possible.
  • Store artifacts with consistent filenames that include device, mode, and timestamp.
  • Reset or shutdown simulators after runs to avoid state leakage between tests.
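Two of these practices sketched concretely: `simctl bootstatus -b` boots the device if needed and blocks until boot finishes, replacing fixed sleeps with a real health check, and a small helper builds consistent artifact names. `boot_and_wait` and `artifact_name` are our own helper names, not simctl commands.

```bash
#!/bin/bash
# Deterministic boot wait instead of a fixed sleep
boot_and_wait() {
  # -b boots the device if necessary and blocks until boot completes
  xcrun simctl bootstatus "$1" -b
}

# Consistent artifact names: device, mode, timestamp
artifact_name() {
  local device="$1" mode="$2"
  echo "qa-${device// /_}-${mode}-$(date +%Y%m%d-%H%M%S).png"
}

# Example:
# boot_and_wait "iPhone 15 Pro"
# xcrun simctl io booted screenshot "$(artifact_name "iPhone 15 Pro" dark)"
```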

Example use cases

  • Full-device nightly job that boots 4 simulators, launches the app, and stores screenshots for visual diffs.
  • Dark/light mode validation script that toggles appearance and captures each screen for designer signoff.
  • Deep-link navigation test that opens specific features (settings, profile) and snapshots expected states.
  • Accessibility smoke checks that increase content size, capture screens, and flag layout breaks.
  • Push-notification simulation to verify in-app handling and notification UI with screenshots.
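For the nightly visual-diff use case, even a crude byte comparison catches unintended changes between a stored baseline and a fresh capture. A perceptual diff tool handles compression noise better, but this sketch (the function name is ours) shows the gating idea.

```bash
#!/bin/bash
# Returns success (0) when the two images differ, i.e. review is needed
changed_since_baseline() {
  ! cmp -s "$1" "$2"
}

# Example gate in a nightly job:
# if changed_since_baseline baseline/settings.png qa/settings.png; then
#   echo "settings screen changed since baseline - flag for review"
# fi
```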

FAQ

Do I need Xcode test frameworks to use this?

No. This approach uses simctl CLI commands so you can automate visual QA without writing XCTest cases.

How do I test VoiceOver or interaction-heavy flows?

VoiceOver is best validated manually or with dedicated accessibility tooling; simctl can toggle some settings and capture screenshots, but it cannot automate spoken-feedback verification.