python-pytest-patterns skill

This skill packages Python pytest patterns to improve test reliability, illustrating fixtures, parametrization, and mocking with concrete examples.

npx playbooks add skill 0xdarkmatter/claude-mods --skill python-pytest-patterns

Review the files below or copy the command above to add this skill to your agents.

SKILL.md
---
name: python-pytest-patterns
description: "pytest testing patterns for Python. Triggers on: pytest, fixture, mark, parametrize, mock, conftest, test coverage, unit test, integration test, pytest.raises."
compatibility: "pytest 7.0+, Python 3.9+. Some features require pytest-asyncio, pytest-mock, pytest-cov."
allowed-tools: "Read Write Bash"
depends-on: []
related-skills: [python-typing-patterns, python-async-patterns]
---

# Python pytest Patterns

Modern pytest patterns for effective testing.

## Basic Test Structure

```python
import pytest

def test_basic():
    """Simple assertion test."""
    assert 1 + 1 == 2

def test_with_description():
    """Descriptive name and docstring."""
    result = calculate_total([1, 2, 3])
    assert result == 6, "Sum should equal 6"
```

## Fixtures

```python
import pytest

@pytest.fixture
def sample_user():
    """Create test user."""
    return {"id": 1, "name": "Test User"}

@pytest.fixture
def db_connection():
    """Fixture with setup and teardown."""
    conn = create_connection()
    yield conn
    conn.close()

def test_user(sample_user):
    """Fixtures injected by name."""
    assert sample_user["name"] == "Test User"
```

### Fixture Scopes

```python
@pytest.fixture(scope="function")  # Default - per test
def per_test(): ...
@pytest.fixture(scope="class")     # Per test class
def per_class(): ...
@pytest.fixture(scope="module")    # Per test file
def per_module(): ...
@pytest.fixture(scope="session")   # Entire test run
def per_session(): ...
```

## Parametrize

```python
@pytest.mark.parametrize("value,expected", [
    (1, 2),
    (2, 4),
    (3, 6),
])
def test_double(value, expected):
    assert double(value) == expected

# Multiple parameters
@pytest.mark.parametrize("x", [1, 2])
@pytest.mark.parametrize("y", [10, 20])
def test_multiply(x, y):  # 4 test combinations
    assert x * y > 0
```

## Exception Testing

```python
def test_raises():
    with pytest.raises(ValueError) as exc_info:
        raise ValueError("Invalid input")
    assert "Invalid" in str(exc_info.value)

def test_raises_match():
    with pytest.raises(ValueError, match=r".*[Ii]nvalid.*"):
        raise ValueError("Invalid input")
```

## Markers

```python
import sys

@pytest.mark.skip(reason="Not implemented yet")
def test_future_feature():
    pass

@pytest.mark.skipif(sys.platform == "win32", reason="Unix only")
def test_unix_feature():
    pass

@pytest.mark.xfail(reason="Known bug")
def test_buggy():
    assert broken_function() == expected

@pytest.mark.slow
def test_performance():
    """Custom marker - register in pytest.ini."""
    pass
```
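Custom markers like `slow` above must be registered to silence `PytestUnknownMarkWarning` and enable `-m` selection; a minimal `pytest.ini` sketch:

```ini
[pytest]
markers =
    slow: marks tests as slow (deselect with -m "not slow")
```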

## Mocking

```python
from unittest.mock import Mock, patch, MagicMock

def test_with_mock():
    mock_api = Mock()
    mock_api.get.return_value = {"status": "ok"}
    result = mock_api.get("/endpoint")
    assert result["status"] == "ok"

@patch("module.external_api")
def test_with_patch(mock_api):
    mock_api.return_value = {"data": []}
    result = function_using_api()
    mock_api.assert_called_once()
```

### pytest-mock (Recommended)

```python
def test_with_mocker(mocker):
    mock_api = mocker.patch("module.api_call")
    mock_api.return_value = {"success": True}
    result = process_data()
    assert result["success"]
```

## conftest.py

```python
# tests/conftest.py - Shared fixtures

import pytest

@pytest.fixture(scope="session")
def app():
    """Application fixture available to all tests."""
    return create_app(testing=True)

@pytest.fixture
def client(app):
    """Test client fixture."""
    return app.test_client()
```

## Quick Reference

| Command | Description |
|---------|-------------|
| `pytest` | Run all tests |
| `pytest -v` | Verbose output |
| `pytest -x` | Stop on first failure |
| `pytest -k "test_name"` | Run matching tests |
| `pytest -m slow` | Run marked tests |
| `pytest --lf` | Rerun last failed |
| `pytest --cov=src` | Coverage report |
| `pytest -n auto` | Parallel (pytest-xdist) |

## Additional Resources

- `./references/fixtures-advanced.md` - Factory fixtures, autouse, conftest patterns
- `./references/mocking-patterns.md` - Mock, patch, MagicMock, side_effect
- `./references/async-testing.md` - pytest-asyncio patterns
- `./references/coverage-strategies.md` - pytest-cov, branch coverage, reports
- `./references/integration-testing.md` - Database fixtures, API testing, testcontainers
- `./references/property-testing.md` - Hypothesis framework, strategies, shrinking
- `./references/test-architecture.md` - Test pyramid, organization, isolation strategies

## Scripts

- `./scripts/run-tests.sh` - Run tests with recommended options
- `./scripts/generate-conftest.sh` - Generate conftest.py boilerplate

## Assets

- `./assets/pytest.ini.template` - Recommended pytest configuration
- `./assets/conftest.py.template` - Common fixture patterns

---

## See Also

**Related Skills:**
- `python-typing-patterns` - Type-safe test code
- `python-async-patterns` - Async test patterns (pytest-asyncio)

**Testing specific frameworks:**
- `python-fastapi-patterns` - TestClient, API testing
- `python-database-patterns` - Database fixtures, transactions

Overview

This skill packages modern pytest testing patterns for Python to help teams write reliable, maintainable tests. It collects fixture templates, parametrization techniques, exception and marker usage, mocking patterns, and conftest best practices. Use it to standardize test structure, speed up test creation, and improve coverage reporting.

How this skill works

The skill inspects common pytest constructs and provides ready-to-use patterns and snippets: fixtures with different scopes, parametrized tests, exception assertions, marker usage (skip/skipif/xfail/custom), and mocking with unittest.mock or pytest-mock. It also includes conftest conventions, quick CLI references, and scripts to run or scaffold tests. Patterns are organized so you can copy templates, adapt scopes, and register custom markers in pytest.ini.

When to use it

  • Bootstrapping a new test suite or project
  • Standardizing fixtures and shared test state via conftest.py
  • Writing parameterized or combinatorial tests to cover inputs efficiently
  • Testing error handling with pytest.raises and regex matching
  • Mocking external dependencies or patching modules during unit tests
  • Configuring markers and selective test runs for slow/integration tests

Best practices

  • Prefer fixtures over setUp/tearDown for clarity and reuse; choose scope to minimize setup cost
  • Use pytest.mark.parametrize for table-driven tests instead of loops inside tests
  • Register custom markers in pytest.ini to avoid warnings and enable selective runs
  • Keep conftest.py minimal and focused on shared fixtures; avoid heavy test logic there
  • Use pytest-mock (mocker) or unittest.mock.patch for deterministic unit tests and assert calls/side effects
  • Run tests with coverage and parallelism (pytest-cov, pytest-xdist) for faster feedback
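The call and side-effect assertions mentioned above can be sketched with `unittest.mock` alone; the `fetch` mock is a hypothetical stand-in for an external dependency:

```python
from unittest.mock import Mock

# A mock standing in for an external dependency; with an iterable
# side_effect, the first call returns a value and the second raises.
fetch = Mock(side_effect=[{"status": "ok"}, TimeoutError("slow upstream")])

result = fetch("/health")
assert result == {"status": "ok"}

try:
    fetch("/health")
except TimeoutError:
    pass  # second call raised, as configured

# Deterministic assertions on how the dependency was used.
assert fetch.call_count == 2
fetch.assert_called_with("/health")
```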

Example use cases

  • Create a session-scoped app fixture in conftest.py for integration tests with a test client
  • Parametrize input/expected pairs to validate pure functions across many cases
  • Use pytest.raises with match to ensure specific error messages are produced
  • Patch external APIs in unit tests to simulate network responses without hitting real services
  • Mark slow or flaky tests and run them separately using pytest -m
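The "patch external APIs" case above can be sketched with the standard library alone; `ApiClient` and `get_status` are hypothetical stand-ins for real application code:

```python
from unittest.mock import patch

class ApiClient:
    """Stand-in for a real HTTP client."""
    def get(self, endpoint):
        raise RuntimeError("would hit the network")

def get_status(client):
    """Code under test: reads one field from an API response."""
    return client.get("/status")["status"]

# Replace the network call for the duration of the with-block.
with patch.object(ApiClient, "get", return_value={"status": "degraded"}) as mock_get:
    status = get_status(ApiClient())

assert status == "degraded"
mock_get.assert_called_once_with("/status")
```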

FAQ

How do I scope fixtures to avoid expensive setup for every test?

Set the fixture scope to module, class, or session depending on reuse. Use function scope for isolated tests and session scope for shared resources like an app or DB connection.

When should I use mocker.patch vs unittest.mock.patch?

Use pytest-mock's mocker.patch for concise, fixture-friendly syntax that is undone automatically at test teardown; use unittest.mock.patch when you prefer standard-library tooling or need to patch outside a pytest run.
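Both styles patch the same target; a sketch of the contrast, with the standard-library form runnable as-is (patching `json.loads` purely for illustration):

```python
import json
from unittest.mock import patch

# stdlib style: explicit context manager (or decorator); works anywhere.
with patch("json.loads", return_value={"patched": True}):
    result = json.loads("{}")   # the mock answers; no real parsing happens

assert result == {"patched": True}
assert json.loads("{}") == {}   # patch undone on exiting the block

# pytest-mock style: `mocker` is a fixture, so no nesting is needed and
# the patch is undone automatically at test teardown.
# def test_loads(mocker):
#     mocker.patch("json.loads", return_value={"patched": True})
#     assert json.loads("{}") == {"patched": True}
```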