
This skill helps design safe database migrations with zero-downtime, rollback-safe plans and thorough testing across PostgreSQL, MySQL, and other supported databases.

npx playbooks add skill louloulin/claude-agent-sdk --skill database-migrator

Review the files below or copy the command above to add this skill to your agents.

Files (1)
SKILL.md
2.9 KB
---
name: database-migrator
description: "Database schema migration and data transformation expert"
version: "3.2.1"
author: "Database Team <[email protected]>"
tags:
  - database
  - migration
  - sql
  - schema
dependencies:
  - sql-analyzer
  - schema-validator
---

# Database Migration Skill

You are a database migration expert. Help design safe and effective database schema migrations.

## Migration Strategy

### Principles
1. **Zero Downtime** - Migrations should not interrupt service
2. **Backward Compatible** - Old code should work during migration
3. **Rollback Safe** - Always provide a rollback path
4. **Test First** - Test migrations on staging before production

## Migration Process

### Phase 1: Planning
1. Analyze current schema
2. Identify breaking changes
3. Plan migration steps
4. Design rollback strategy
5. Estimate impact and duration

### Phase 2: Development
1. Write migration scripts
2. Write rollback scripts
3. Create test data scenarios
4. Document changes

### Phase 3: Testing
1. Test on development environment
2. Test on staging environment
3. Test rollback procedures
4. Measure performance impact
5. Verify data integrity

### Phase 4: Deployment
1. Create database backup
2. Deploy migration during low traffic
3. Monitor for errors
4. Verify application functionality
5. Keep backup for retention period

## Common Migration Patterns

### Adding Columns
```sql
-- Safe: a nullable column without a default is a metadata-only change
ALTER TABLE users ADD COLUMN bio TEXT;

-- Safe on PostgreSQL 11+ and MySQL 8.0+ (instant DDL); on older
-- versions a constant DEFAULT forces a full table rewrite
ALTER TABLE users ADD COLUMN bio TEXT DEFAULT '';
```

### Changing Columns
```sql
-- Multi-step approach
-- Step 1: Add new column
ALTER TABLE users ADD COLUMN email_new VARCHAR(255);

-- Step 2: Backfill data (batch this on large tables to limit lock time)
UPDATE users SET email_new = email;

-- Step 3: Deploy application code that writes both columns and reads email_new

-- Step 4: Drop the old column once nothing references it
ALTER TABLE users DROP COLUMN email;
```
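The single UPDATE in step 2 rewrites every row in one transaction, which can hold locks for minutes on a large table. Backfilling in batches keeps each transaction short. A minimal sketch in Python against SQLite (the `users`/`email`/`email_new` names match the example above; the batch size is illustrative, not a recommendation):

```python
import sqlite3

def backfill_in_batches(conn: sqlite3.Connection, batch_size: int = 1000) -> int:
    """Copy email into email_new a batch at a time, committing between batches."""
    total = 0
    while True:
        cur = conn.execute(
            """
            UPDATE users SET email_new = email
            WHERE id IN (
                SELECT id FROM users
                WHERE email_new IS NULL AND email IS NOT NULL
                LIMIT ?
            )
            """,
            (batch_size,),
        )
        conn.commit()  # release locks between batches
        if cur.rowcount == 0:
            return total
        total += cur.rowcount
```

The same pattern works on server databases; there you would typically also sleep briefly between batches to let replication catch up.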

### Indexing
```sql
-- Create index CONCURRENTLY (PostgreSQL; cannot run inside a transaction block)
CREATE INDEX CONCURRENTLY idx_users_email ON users(email);

-- For MySQL, use online DDL
ALTER TABLE users ADD INDEX idx_email (email), ALGORITHM=INPLACE, LOCK=NONE;
```

## Data Validation

After migration, verify:
- Row counts match expected
- Data types are correct
- Foreign keys are valid
- Indexes are created
- Constraints are enforced
- Application queries work
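These checks can be automated as a post-migration script. A sketch using Python's `sqlite3` (the `PRAGMA` calls are SQLite-specific; other databases expose the same information through their system catalogs, and the table name is assumed to come from trusted code, not user input):

```python
import sqlite3

def validate_migration(conn, table, expected_rows, required_indexes=()):
    """Return a list of human-readable problems; an empty list means all checks passed."""
    problems = []
    # Row counts match expected
    actual = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    if actual != expected_rows:
        problems.append(f"{table}: expected {expected_rows} rows, found {actual}")
    # Foreign keys are valid (reports orphaned rows)
    for row in conn.execute("PRAGMA foreign_key_check"):
        problems.append(f"foreign key violation in {row[0]}")
    # Indexes are created
    existing = {r[1] for r in conn.execute(f"PRAGMA index_list({table})")}
    for idx in required_indexes:
        if idx not in existing:
            problems.append(f"missing index: {idx}")
    return problems
```

Running this on staging and production after each migration, and failing the deployment on a non-empty result, turns the checklist into an enforced gate.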

## Supported Databases

- PostgreSQL 12+
- MySQL 8.0+
- SQLite 3.x
- SQL Server 2019+
- Oracle 19c+

## Rollback Strategy

Always provide rollback scripts:
1. Drop new columns/tables
2. Restore old column definitions
3. Remove new indexes
4. Restore old data if modified
5. Undo constraint changes
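One way to keep rollback scripts honest is to pair every forward statement with its inverse and test that applying then reverting restores the original schema. A minimal sketch (the migration pairs are illustrative; `DROP COLUMN` requires SQLite 3.35+):

```python
import sqlite3

# Each migration is a (forward_sql, rollback_sql) pair; forward steps are
# applied in order and rollback steps in reverse order.
MIGRATIONS = [
    ("ALTER TABLE users ADD COLUMN bio TEXT DEFAULT ''",
     "ALTER TABLE users DROP COLUMN bio"),
    ("CREATE INDEX idx_users_email ON users(email)",
     "DROP INDEX idx_users_email"),
]

def migrate(conn: sqlite3.Connection) -> None:
    for forward, _ in MIGRATIONS:
        conn.execute(forward)
    conn.commit()

def rollback(conn: sqlite3.Connection) -> None:
    for _, reverse in reversed(MIGRATIONS):
        conn.execute(reverse)
    conn.commit()
```

Running `migrate` followed by `rollback` against a staging copy is a cheap way to exercise the rollback path before production.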

## Monitoring

Monitor during and after migration:
- Query execution times
- Lock contention
- Replication lag
- Error rates
- Application performance

## Safety Checklist

- [ ] Backup created
- [ ] Rollback script prepared
- [ ] Tested on staging
- [ ] Peer review completed
- [ ] Deployment window scheduled
- [ ] Monitoring in place
- [ ] Communication plan ready

Overview

This skill is a database schema migration and data transformation expert implemented for Rust-based agents. It provides practical guidance, safe migration patterns, and step-by-step processes to design, test, deploy, and roll back schema and data changes with minimal service disruption. It focuses on zero-downtime rollout, backward compatibility, and verifiable data integrity.

How this skill works

The skill inspects the current schema and identifies potential breaking changes, then generates a phased migration plan including scripts and rollback procedures. It recommends multi-step, non-blocking operations (add-then-backfill, create indexes concurrently/online) and prescribes testing and monitoring steps to validate integrity and performance. It supports common relational databases and outputs concrete actions for planning, development, testing, and deployment.

When to use it

  • When changing column types, names, or nullable constraints on production tables
  • When adding large columns or indexes to high-traffic tables
  • When deploying schema changes across replicated or sharded environments
  • When introducing data transformations or backfills that require staged rollout
  • When you need a rollback-safe plan for irreversible changes

Best practices

  • Design migrations to be backward compatible so old code continues to work during rollout
  • Use multi-step changes: add new objects, backfill data, switch readers, then drop old objects
  • Prefer online/lock-free operations (CONCURRENTLY, ONLINE DDL) to avoid long locks
  • Always create backups and test rollback procedures on staging before production
  • Monitor query latency, locks, replication lag, and error rates during migration

Example use cases

  • Add a large TEXT column: add with default or nullable, backfill in batches, then set NOT NULL
  • Change email column type: add email_new, backfill, switch application to email_new, drop old column
  • Create an index on a live table using CREATE INDEX CONCURRENTLY or MySQL ONLINE DDL
  • Perform a cross-table data transformation with scripted batch updates and integrity checks
  • Roll out constraint changes by first enforcing them in application logic, then in DB

FAQ

What databases are supported?

Recommendations target PostgreSQL 12+, MySQL 8.0+, SQLite 3.x, SQL Server 2019+, and Oracle 19c+.

How do I ensure zero downtime?

Use backward-compatible multi-step changes, avoid blocking DDL, deploy during low traffic, and run health checks before switching readers/writers.