
This skill helps you configure and validate test retry settings, with production-ready guidance for unit tests, integration tests, and framework-level retry patterns.

npx playbooks add skill jeremylongshore/claude-code-plugins-plus-skills --skill test-retry-config

Review the files below or copy the command above to add this skill to your agents.

Files (1): SKILL.md (2.1 KB)
---
name: "test-retry-config"
description: |
  Test Retry Config - Auto-activating skill for Test Automation.
  Triggers on phrases like "test retry config", "test config", and "test".
  Part of the Test Automation skill category. Use when writing or running tests.
allowed-tools: "Read, Write, Edit, Bash(cmd:*), Grep"
version: 1.0.0
license: MIT
author: "Jeremy Longshore <[email protected]>"
---

# Test Retry Config

## Overview

This skill provides automated assistance for test retry config tasks within the Test Automation domain.

## When to Use

This skill activates automatically when you:
- Mention "test retry config" in your request
- Ask about test retry config patterns or best practices
- Need help with test automation topics such as unit testing, integration testing, mocking, or test framework configuration

## Instructions

1. Provide step-by-step guidance for configuring test retries
2. Follow industry best practices and retry patterns
3. Generate production-ready code and configuration
4. Validate outputs against common standards

## Examples

**Example: Basic Usage**
Request: "Help me with test retry config"
Result: Provides step-by-step guidance and generates appropriate configurations


## Prerequisites

- Relevant development environment configured
- Access to necessary tools and services
- Basic understanding of test automation concepts


## Output

- Generated configurations and code
- Best practice recommendations
- Validation results


## Error Handling

| Error | Cause | Solution |
|-------|-------|----------|
| Configuration invalid | Missing required fields | Check documentation for required parameters |
| Tool not found | Dependency not installed | Install required tools per prerequisites |
| Permission denied | Insufficient access | Verify credentials and permissions |


## Resources

- Official documentation for related tools
- Best practices guides
- Community examples and tutorials

## Related Skills

Part of the **Test Automation** skill category.
Tags: testing, jest, pytest, mocking, tdd

## Overview

This skill automates configuration and guidance for test retry strategies in test automation workflows. It provides step-by-step recommendations, generates production-ready retry settings for common frameworks, and validates configurations against best practices. Use it to reduce flaky test noise and improve CI stability.

## How this skill works

The skill inspects your test framework, CI environment, and current retry-related settings, then suggests or generates retry policies tailored to that stack (for example, pytest, Jest, or CI pipeline YAML). It produces code snippets, configuration blocks, and validation feedback, and explains trade-offs like masking flakiness versus improving resilience.
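As a concrete illustration, a pytest-oriented output might look like the following `pytest.ini` fragment. It assumes the third-party pytest-rerunfailures plugin is installed; `--reruns` sets the retry count and `--reruns-delay` the pause (in seconds) between attempts:

```ini
# pytest.ini -- assumes the pytest-rerunfailures plugin is installed
[pytest]
# Retry every failing test up to 2 extra times, waiting 1s between attempts.
addopts = --reruns 2 --reruns-delay 1
```

For targeted retries, drop the global `addopts` line and mark only known-flaky tests with `@pytest.mark.flaky(reruns=3)`.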

## When to use it

- Setting up retries for a new test suite or CI pipeline
- Tuning retry behavior to reduce flaky test failures
- Migrating retry settings between frameworks or CI providers
- Validating existing retry configurations against best practices
- Generating retry-ready example configurations for code reviews

## Best practices

- Prefer targeted retries on known flaky tests instead of global retries
- Limit retry count and add exponential backoff to avoid masking issues
- Log retry attempts and surface failure reasons to aid debugging
- Combine retries with isolation strategies (mocking, seeding, environment cleanup)
- Use CI-level controls to avoid wasting runner resources during large-scale retries
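The capped-count, backoff, and logging advice above can be sketched as a small framework-agnostic Python decorator. This is a hedged illustration, not part of the skill itself; `flaky_op` merely simulates an operation that fails twice before succeeding.

```python
import functools
import logging
import time

log = logging.getLogger("retry")

def retry(max_attempts=3, base_delay=0.1):
    """Retry a callable up to max_attempts times with exponential backoff.

    Keeping max_attempts low and re-raising the final error avoids
    masking genuinely broken tests behind retries.
    """
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(1, max_attempts + 1):
                try:
                    return fn(*args, **kwargs)
                except Exception as exc:
                    # Log every attempt so flakiness stays visible in CI output.
                    log.warning("%s attempt %d/%d failed: %s",
                                fn.__name__, attempt, max_attempts, exc)
                    if attempt == max_attempts:
                        raise  # surface the real failure, don't swallow it
                    time.sleep(base_delay * 2 ** (attempt - 1))
        return wrapper
    return decorator

# Simulated flaky operation: fails twice, then succeeds on the third call.
calls = {"count": 0}

@retry(max_attempts=3, base_delay=0.01)
def flaky_op():
    calls["count"] += 1
    if calls["count"] < 3:
        raise RuntimeError("transient failure")
    return "ok"
```

Because the final attempt re-raises, a genuinely broken test still fails loudly instead of being quietly retried forever.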

## Example use cases

- Generate a pytest.ini snippet with flaky_test marker and pytest-rerunfailures settings
- Produce a Jest retry wrapper example for flaky async tests with backoff
- Create CI pipeline steps that conditionally re-run failed test batches
- Validate an existing YAML pipeline retry block and suggest safer defaults
- Recommend which tests to mark for retries based on recent failure history
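For the CI-level use cases, a hedged GitLab CI sketch (the job name and script are illustrative) shows how to separate job-level retries for infrastructure failures from test-level reruns:

```yaml
# .gitlab-ci.yml sketch -- adapt keys for other CI providers
unit-tests:
  script:
    - pytest --reruns 2          # test-level retries (pytest-rerunfailures)
  retry:
    max: 2                       # job-level retries...
    when:
      - runner_system_failure    # ...only for infrastructure problems,
      - stuck_or_timeout_failure # not for ordinary test failures
```

Scoping the job-level `when` list this way keeps genuine test failures visible while still absorbing transient runner problems.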

## FAQ

**Will retries hide real test bugs?**

Retries can mask intermittent issues if overused; keep retry counts low, target only flaky tests, and continue investigating recurrent failures.

**Which frameworks are supported?**

It covers common frameworks like pytest and Jest and produces generic CI pipeline examples; provide your framework and CI details for tailored output.