```shell
npx playbooks add skill xfstudio/skills --skill screen-reader-testing
```
---
name: screen-reader-testing
description: Test web applications with screen readers including VoiceOver, NVDA, and JAWS. Use when validating screen reader compatibility, debugging accessibility issues, or ensuring assistive technology support.
---
# Screen Reader Testing
Practical guide to testing web applications with screen readers for comprehensive accessibility validation.
## Use this skill when
- Validating screen reader compatibility
- Testing ARIA implementations
- Debugging assistive technology issues
- Verifying form accessibility
- Testing dynamic content announcements
- Ensuring navigation accessibility
## Do not use this skill when
- The task is unrelated to screen reader testing
- You need a different domain or tool outside this scope
## Instructions
- Clarify which screen readers, browsers, and platforms are in scope, plus the pages or flows to test.
- Apply the relevant testing patterns and validate outcomes against expected announcements.
- Provide actionable reproduction steps and verification criteria.
- If detailed examples are required, open `resources/implementation-playbook.md`.
## Resources
- `resources/implementation-playbook.md` for detailed patterns and examples.
## Overview
This skill tests web applications with screen readers including VoiceOver, NVDA, and JAWS to validate real-world assistive technology behavior. It helps identify accessibility gaps in ARIA usage, form controls, dynamic updates, and navigation semantics. Use it to reproduce issues, confirm fixes, and produce actionable verification steps.
The skill walks through targeted checks using each screen reader on supported platforms, describing interactions, expected output, and common failure patterns. It inspects ARIA roles, live regions, focus management, semantic HTML, and keyboard navigation while guiding testers to reproduce and log observations. It also suggests debugging steps for common assistive-technology mismatches and how to verify fixes.
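Some of these checks can be pre-verified statically before listening to actual screen reader output. The sketch below, which assumes a simplified accessibility-tree representation (the `role`/`name`/`ariaLive` node shape is illustrative, not a real browser API), flags two common failure patterns: interactive controls with no accessible name, and `aria-live` regions with invalid politeness values.

```javascript
// Static pre-checks on a simulated accessibility tree.
// The node shape here is an assumption for illustration;
// a real audit inspects the live DOM or the browser's accessibility tree.
function auditNode(node) {
  const issues = [];
  const interactiveRoles = ["button", "link", "textbox", "checkbox"];

  // Interactive roles need an accessible name, or screen readers
  // announce only the role (e.g. just "button") with no context.
  if (interactiveRoles.includes(node.role) && !node.name) {
    issues.push(`${node.role} has no accessible name`);
  }

  // aria-live only accepts "polite", "assertive", or "off";
  // anything else is silently ignored by assistive technology.
  if (node.ariaLive && !["polite", "assertive", "off"].includes(node.ariaLive)) {
    issues.push(`invalid aria-live value: ${node.ariaLive}`);
  }
  return issues;
}

const tree = [
  { role: "button", name: "" },          // icon-only button, no label
  { role: "textbox", name: "Email" },    // labeled correctly, no issues
  { role: "status", ariaLive: "rude" },  // invalid token: should be "assertive"
];
console.log(tree.flatMap(auditNode));   // logs the two issues found
```

Checks like these complement, rather than replace, listening to the actual announcements: an element can pass every static rule and still be announced in a confusing order.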
## FAQ

**Which screen readers should I prioritize?**
Prioritize NVDA (Windows), JAWS (Windows), and VoiceOver (macOS/iOS). Choose the ones your users actually use, but test across platforms for broader coverage.

**Can automated tools replace screen reader testing?**
Automated tools catch many issues but cannot simulate the spoken output, keyboard focus experience, or nuanced assistive-technology behavior. Use them together with manual screen reader testing.
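One practical way to combine the two is to write down the announcement a tester should hear for each control, then compare it against what NVDA, JAWS, or VoiceOver actually speaks. The sketch below derives such an expected-announcement string from a simplified field description (the `label`/`roleSpoken`/`required` shape is a hypothetical convention, not a standard API), roughly mirroring the "label, role, state" order most screen readers use.

```javascript
// Build the announcement a tester expects to hear for a form field,
// for side-by-side comparison with actual screen reader speech.
// The field object shape is illustrative only.
function expectedAnnouncement(field) {
  const parts = [field.label, field.roleSpoken];
  if (field.required) parts.push("required");
  if (field.invalid) parts.push("invalid entry");
  return parts.filter(Boolean).join(", ");
}

console.log(expectedAnnouncement({
  label: "Email",
  roleSpoken: "edit text",
  required: true,
}));
// prints "Email, edit text, required"
```

An automated scanner can confirm the label and `required` attribute exist, but only a human listening to the screen reader can confirm they are actually spoken, and in a sensible order.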