
video-comparer skill

/video-comparer

This skill compares two videos and generates an interactive HTML report with PSNR/SSIM metrics and frame-by-frame visual comparisons.

```bash
npx playbooks add skill daymade/claude-code-skills --skill video-comparer
```


Files (8)

SKILL.md (5.3 KB)
---
name: video-comparer
description: This skill should be used when comparing two videos to analyze compression results or quality differences. Generates interactive HTML reports with quality metrics (PSNR, SSIM) and frame-by-frame visual comparisons. Triggers when users mention "compare videos", "video quality", "compression analysis", "before/after compression", or request quality assessment of compressed videos.
---

# Video Comparer

## Overview

Compare two videos and generate an interactive HTML report analyzing compression results. The script extracts video metadata, calculates quality metrics (PSNR, SSIM), and creates frame-by-frame visual comparisons with three viewing modes: slider, side-by-side, and grid.

## When to Use This Skill

Use this skill when:
- Comparing original and compressed videos
- Analyzing video compression quality and efficiency
- Evaluating codec performance or bitrate reduction impact
- Users mention "compare videos", "video quality", "compression analysis", or "before/after compression"

## Core Usage

### Basic Command

```bash
python3 scripts/compare.py original.mp4 compressed.mp4
```

Generates `comparison.html` with:
- Video parameters (codec, resolution, bitrate, duration, file size)
- Quality metrics (PSNR, SSIM, size/bitrate reduction percentages)
- Frame-by-frame comparison (default: frames at 5s intervals)

### Command Options

```bash
# Custom output file
python3 scripts/compare.py original.mp4 compressed.mp4 -o report.html

# Custom frame interval (larger = fewer frames, faster processing)
python3 scripts/compare.py original.mp4 compressed.mp4 --interval 10

# Batch comparison
for original in originals/*.mp4; do
    compressed="compressed/$(basename "$original")"
    output="reports/$(basename "$original" .mp4).html"
    python3 scripts/compare.py "$original" "$compressed" -o "$output"
done
```

## Requirements

### System Dependencies

**FFmpeg and FFprobe** (required for video analysis and frame extraction):

```bash
# macOS
brew install ffmpeg

# Ubuntu/Debian
sudo apt update && sudo apt install ffmpeg

# Windows
# Download from https://ffmpeg.org/download.html
# Or use: winget install ffmpeg
```

**Python 3.8+** (uses type hints, f-strings, pathlib)

### Video Specifications

- **Supported formats:** `.mp4` (recommended), `.mov`, `.avi`, `.mkv`, `.webm`
- **File size limit:** 500MB per video (configurable)
- **Processing time:** ~1-2 minutes for typical videos; varies by duration and frame interval

## Script Behavior

### Automatic Validation

The script automatically validates:
- FFmpeg/FFprobe installation and availability
- File existence, extensions, and size limits
- Path security (prevents directory traversal)

Clear error messages with resolution guidance appear when validation fails.
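The traversal check can be sketched with `pathlib`. The function and names below are illustrative, not the script's actual API; it resolves the candidate path first so `..` segments and symlinks cannot escape the working directory:

```python
from pathlib import Path

ALLOWED_EXTENSIONS = {".mp4", ".mov", ".avi", ".mkv", ".webm"}

def validate_path(path_str, base_dir="."):
    """Resolve a user-supplied path and reject anything escaping base_dir."""
    base = Path(base_dir).resolve()
    path = (base / path_str).resolve()
    try:
        path.relative_to(base)  # raises ValueError if path is outside base
    except ValueError:
        raise ValueError(f"path escapes working directory: {path_str}") from None
    if path.suffix.lower() not in ALLOWED_EXTENSIONS:
        raise ValueError(f"unsupported extension: {path.suffix or path_str}")
    return path
```

Resolving before comparing is the key step: a check on the raw string would miss traversal hidden behind `..` or symlinks.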

### Quality Metrics

The script calculates two standard quality metrics:

**PSNR (Peak Signal-to-Noise Ratio):** Pixel-level similarity measurement (typically 20-50 dB for lossy video; higher is better)

**SSIM (Structural Similarity Index):** Perceptual similarity measurement (0.0-1.0 scale, higher is better)

For detailed interpretation scales and quality thresholds, consult `references/video_metrics.md`.
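The metrics themselves likely come from FFmpeg's `psnr`/`ssim` filters, but the PSNR formula is simple enough to state directly: it is derived from the mean squared error between corresponding frames. A minimal sketch (illustrative, not the script's code):

```python
import math

def psnr(mse, max_pixel=255.0):
    """Peak Signal-to-Noise Ratio in dB from a mean squared error."""
    if mse == 0:
        return float("inf")  # identical frames
    return 10 * math.log10((max_pixel ** 2) / mse)
```

For 8-bit video, an MSE of 256 (an average per-pixel error of 16 levels) yields roughly 24 dB, which sits at the low end of the typical quality range.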

### Frame Extraction

The script extracts frames at specified intervals (default: 5 seconds), scales them to consistent height (800px) for comparison, and embeds them as base64 data URLs in self-contained HTML. Temporary files are automatically cleaned after processing.
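Embedding a frame as a data URL is a one-liner with the standard library; a sketch (the function name is assumed, not the script's actual API):

```python
import base64

def to_data_url(png_bytes):
    """Embed a PNG frame as a base64 data URL for a self-contained report."""
    encoded = base64.b64encode(png_bytes).decode("ascii")
    return f"data:image/png;base64,{encoded}"
```

This is what makes the report self-contained: every image lives inside the HTML, at the cost of roughly a 33% size increase over the raw PNG bytes.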

### Output Report

The generated HTML report includes:
- **Slider Mode**: Drag to reveal original vs compressed (default)
- **Side-by-Side Mode**: Simultaneous display for direct comparison
- **Grid Mode**: Compact 2-column layout
- **Zoom Controls**: 50%-200% magnification
- Self-contained format (no server required, works offline)

## Important Implementation Details

### Security

The script implements:
- Path validation (absolute paths, prevents directory traversal)
- Command injection prevention (no `shell=True`, validated arguments)
- Resource limits (file size, timeouts)
- Custom exceptions: `ValidationError`, `FFmpegError`, `VideoComparisonError`
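The injection-prevention point can be illustrated: because arguments are passed as a list and `shell=True` is never used, a hostile filename stays a single argv entry and is never interpreted by a shell. A sketch using standard `ffprobe` flags (helper names are hypothetical):

```python
import subprocess

def build_ffprobe_cmd(video_path):
    """Assemble the ffprobe argv list; the filename stays one argument."""
    return [
        "ffprobe", "-v", "error",
        "-print_format", "json",
        "-show_format", "-show_streams",
        video_path,
    ]

def probe(video_path, timeout=60):
    # No shell=True: metacharacters in the filename are inert
    result = subprocess.run(build_ffprobe_cmd(video_path),
                            capture_output=True, text=True,
                            timeout=timeout, check=True)
    return result.stdout
```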

### Common Error Scenarios

**"FFmpeg not found"**: Install FFmpeg via platform package manager (see Requirements section)

**"File too large"**: Compress videos before comparison, or adjust `MAX_FILE_SIZE_MB` in `scripts/compare.py`

**"Operation timed out"**: Increase `FFMPEG_TIMEOUT` constant or use larger `--interval` value (processes fewer frames)

**"Frame count mismatch"**: Videos have different durations/frame rates; script auto-truncates to minimum frame count and shows warning
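The auto-truncation behavior amounts to pairing frames only up to the shorter sequence; a sketch (hypothetical helper, not the script's actual function):

```python
def pair_frames(original_frames, compressed_frames):
    """Pair frames up to the shorter sequence, warning on a mismatch."""
    if len(original_frames) != len(compressed_frames):
        n = min(len(original_frames), len(compressed_frames))
        print(f"Warning: frame count mismatch, comparing first {n} frames")
    # zip() stops at the shorter input, which performs the truncation
    return list(zip(original_frames, compressed_frames))
```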

## Configuration

The script includes adjustable constants for file size limits, timeouts, frame dimensions, and extraction intervals. To customize behavior, edit the constants at the top of `scripts/compare.py`. For detailed configuration options and their impacts, consult `references/configuration.md`.
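Only `MAX_FILE_SIZE_MB` and `FFMPEG_TIMEOUT` are named elsewhere in this document; the constants below are illustrative placeholders for the kind of values defined at the top of `scripts/compare.py`:

```python
# Adjustable constants (names beyond MAX_FILE_SIZE_MB and FFMPEG_TIMEOUT
# are illustrative -- check the top of scripts/compare.py for the real set)
MAX_FILE_SIZE_MB = 500   # per-video input limit
FFMPEG_TIMEOUT = 300     # seconds per FFmpeg call (value assumed)
FRAME_INTERVAL_S = 5     # seconds between extracted frames
FRAME_HEIGHT_PX = 800    # frames are scaled to this height
```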

## Reference Materials

Consult these files for detailed information:
- **`references/video_metrics.md`**: Quality metrics interpretation (PSNR/SSIM scales, compression targets, bitrate guidelines)
- **`references/ffmpeg_commands.md`**: FFmpeg command reference (metadata extraction, frame extraction, troubleshooting)
- **`references/configuration.md`**: Script configuration options and adjustable constants
- **`assets/template.html`**: HTML report template for customizing viewing modes and styling

Overview

This skill compares two videos and generates a self-contained, interactive HTML report to analyze compression results and visible quality differences. It extracts video metadata, computes PSNR and SSIM, and produces frame-by-frame visual comparisons with slider, side-by-side, and grid viewing modes. The report is offline-ready and includes zoom controls and aggregated quality summaries.

How this skill works

The tool validates inputs and probes each video with FFprobe, extracts frames at a configurable interval using FFmpeg, and scales frames to a consistent height for fair comparison. It computes pixel-level (PSNR) and perceptual (SSIM) metrics per frame and aggregates them across the clip. Frames are embedded as base64 images into a single HTML file, and temporary files are cleaned up automatically. Security checks prevent path traversal and command injection, and enforce file-size and timeout limits.

When to use it

- When you need to compare an original video against a compressed or transcoded version
- To evaluate codec choices, bitrate reductions, or parameter tuning impact on visual quality
- When producing before/after reports for quality assurance or delivery sign-off
- For automated batch comparisons across multiple assets or versions
- When you want a portable, shareable interactive report without running a server

Best practices

- Install and verify FFmpeg/FFprobe on the host before running comparisons
- Use the same resolution and color space for both videos when possible to avoid misleading metrics
- Increase frame extraction interval for long videos to reduce processing time, or lower it for critical segments
- Keep input files under the configured size limit or adjust the constant if you control the environment
- Run batch jobs in a controlled environment with sufficient CPU and a timeout buffer to avoid truncation

Example use cases

- Compare a master file and its CDN-transcoded output to confirm acceptable quality loss
- Benchmark two encoders or bitrate ladders by generating side-by-side metric summaries
- Create a client-facing report showing visual differences after applying a new compression profile
- Automate nightly comparisons of recently encoded assets to detect regressions
- Generate compact QA artifacts for bug reports that include both metrics and visual evidence

FAQ

Which quality metrics are computed?

The report computes PSNR for pixel-level similarity and SSIM for perceptual similarity, and presents aggregated and per-frame values.

Do I need a web server to view the report?

No. The HTML is self-contained with embedded images and works offline in any modern browser.

What if FFmpeg isn't installed?

The script validates FFmpeg/FFprobe at startup and returns a clear error with installation guidance if they are missing.