
fix_verification skill

/antigravity/skills/fix_verification

This skill verifies that a reported vulnerability is fully fixed by reproducing it, applying the fix, and validating that no regressions were introduced.

npx playbooks add skill multiversx/mx-ai-skills --skill fix_verification

Review the files below or copy the command above to add this skill to your agents.

Files (1)
SKILL.md
---
name: fix_verification
description: Verifying whether a reported bug is truly fixed.
---

# Fix Verification

This skill helps you rigorously verify that a reported vulnerability has been eliminated without introducing regressions.

## 1. The Verification Loop
1.  **Reproduce**: Write a `mandos` scenario that fails, demonstrating the bug (see the sketch below).
2.  **Apply Fix**: Modify the Rust code.
3.  **Verify**: Rerun the failing scenario. It MUST pass now.
4.  **Regression Check**: Run ALL other mandos scenarios. They MUST still pass.
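
A minimal sketch of steps 1 and 3 in Rust, assuming the standard `multiversx_sc_scenario` test harness from mx-sdk-rs (API details vary between SDK versions); the contract crate `vault`, the artifact path, and the scenario filename are placeholders:

```rust
use multiversx_sc_scenario::*;

// Register the contract under test so the scenario runner can execute it.
fn world() -> ScenarioWorld {
    let mut blockchain = ScenarioWorld::new();
    blockchain.register_contract("mxsc:output/vault.mxsc.json", vault::ContractBuilder);
    blockchain
}

// Step 1: this test fails while the bug exists.
// Step 3: after applying the fix, the same test must pass unchanged.
#[test]
fn exploit_withdraw_scen() {
    world().run("scenarios/exploit_withdraw.scen.json");
}
```

Because each scenario is wired up as a `#[test]`, a plain `cargo test` doubles as the regression check in step 4.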

## 2. Common Fix Failures
- **Partial Fix**: Fixing one path but missing a variant (e.g., fixing `deposit` but not `transfer`); a simplified sketch follows.
- **Moved Bug**: The fix prevents the exploit but creates a DoS vector (e.g., adding a lock that never unlocks).
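
As a deliberately simplified illustration (plain Rust, not real contract code), a partial fix looks like this:

```rust
struct Vault {
    paused: bool,
    balance: u64,
}

impl Vault {
    fn deposit(&mut self, amount: u64) {
        // The fix adds a guard on this path...
        assert!(!self.paused, "vault is paused");
        self.balance += amount;
    }

    fn transfer(&mut self, amount: u64) {
        // ...but the variant path was missed: funds still move
        // while the vault is paused.
        self.balance -= amount;
    }
}

#[test]
fn transfer_bypasses_pause() {
    let mut v = Vault { paused: true, balance: 100 };
    v.transfer(10); // no panic: the "fixed" bug is still reachable
    assert_eq!(v.balance, 90);
}
```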

## 3. Deliverable
A "Verification Report" stating:
- Commit ID of the fix.
- Test case used to verify.
- Confirmation of regression suite success.

Overview

This skill guides a rigorous process to verify that a reported bug or vulnerability is truly fixed and that no regressions were introduced. It focuses on reproducing the failure, applying the fix, confirming the failing test now passes, and validating the full regression suite. The outcome is a concise Verification Report that documents confidence in the fix.

How this skill works

Start by reproducing the bug with a deterministic failing test scenario (for example, a mandos file that demonstrates the exploit). Apply the code change and rerun that failing scenario; it must pass. Next run the entire regression suite to ensure no other tests are broken. Produce a Verification Report that records the fix commit, the exact test used, and the regression results.
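
As a sketch of the regression step, reusing the hypothetical `world()` helper from the SKILL.md example above (real projects typically give each scenario its own `#[test]` so failures are reported individually, but a directory walk shows the idea compactly):

```rust
// Run every mandos scenario in the suite; any failure fails the test.
#[test]
fn full_mandos_regression() {
    for entry in std::fs::read_dir("scenarios").expect("scenarios directory") {
        let path = entry.expect("directory entry").path();
        if path.to_string_lossy().ends_with(".scen.json") {
            // A fresh world per scenario keeps runs independent.
            world().run(path.to_str().expect("utf-8 path"));
        }
    }
}
```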

When to use it

  • After a security report or bug ticket that includes reproduction steps
  • When a failing automated test or scenario demonstrates a vulnerability
  • Before merging a security or correctness fix to mainline branches
  • When a fix is complex and may affect multiple code paths
  • Before releasing a patch or public advisory

Best practices

  • Always codify the failure as an automated, deterministic test scenario before changing code
  • Verify the minimal failing case first, then the full suite to catch regressions
  • Include negative and variant cases to avoid partial fixes that miss alternate paths
  • Avoid fixes that change system-wide invariants without additional tests (to prevent moved-bug DoS vectors)
  • Record the exact commit, test file, runtime environment, and command used in the Verification Report

Example use cases

  • A reproduced exploit in a transaction flow is encoded as a mandos scenario, fixed, then validated against the full mandos suite
  • A concurrency bug is captured in a deterministic test, patched, and the entire test harness is rerun to confirm no new deadlocks
  • A validation bypass reported in production is converted to an automated failing test, fixed, and regression-tested across versions
  • A dependency upgrade introduces a change; the failing integration test is adjusted, then the whole regression suite is executed
  • A partial fix is discovered during code review; write additional tests for uncovered variants and run the full suite before approving

FAQ

What belongs in the Verification Report?

Include the fix commit ID, the exact test scenario file and invocation, environment details, and the regression suite results (pass/fail and logs).
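
A minimal example (all values hypothetical):

  • Fix commit: 3f2a9c1
  • Test: scenarios/exploit_withdraw.scen.json, invoked via cargo test exploit_withdraw_scen
  • Environment: rustc and mx-sdk-rs versions, OS, and any relevant feature flags
  • Regression suite: all scenarios pass (logs attached)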

What if the failing test still fails after the change?

Revert and re-evaluate the fix approach: confirm the test reproduces the original bug, inspect code paths you may have missed, and add targeted tests for uncovered variants.