db-migrations-schema-changes skill

This skill covers managing Alembic migrations and schema changes in the letta-cloud core app, enabling safe schema changes and upgrades against Postgres or SQLite.

npx playbooks add skill letta-ai/letta --skill db-migrations-schema-changes

---
name: DB migrations and schema changes
description: >-
  Workflows and commands for managing Alembic database migrations and schema changes
  in the letta-cloud core app, including using uv, just, LETTA_PG_URI, and
  switching between SQLite and Postgres.
---

# DB migrations and schema changes (letta-cloud core)

Use this skill whenever you need to change the database schema or debug Alembic
migrations in `apps/core` of the letta-cloud repo.

This skill assumes:
- Working directory: `apps/core`
- Migrations: Alembic in `apps/core/alembic`
- Python runner: `uv`
- Helper: `just ready` for environment + DB setup

## Quick start

1. Ensure environment is ready:
   - `just ready`
2. For Postgres migrations, set:
   - `export LETTA_PG_URI=postgresql+pg8000://postgres:postgres@localhost:5432/letta-core`
3. Make your ORM/schema change.
4. Autogenerate migration:
   - `uv run alembic revision --autogenerate -m "<short_message>"`
5. Apply migration:
   - `uv run alembic upgrade head`

See `references/migration-commands.md` for exact commands and variants.

## Standard workflows

### 1. Add or modify a column (ORM-first)

1. Identify the ORM model and table.
2. Update the SQLAlchemy model in `letta/orm/...`:
   - Prefer using mixins (e.g. `ProjectMixin`) when available instead of
     duplicating columns.
3. Run `just ready` if dependencies or environment may have changed.
4. Ensure `LETTA_PG_URI` is set if you want the migration to target Postgres.
5. Autogenerate Alembic revision with `uv`.
6. Inspect the generated file under `alembic/versions/`:
   - Confirm `op.add_column` / `op.alter_column` match expectations.
7. Apply migrations with `uv run alembic upgrade head`.

Use this pattern for changes like adding `project_id` columns via `ProjectMixin`.
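As a sketch, suppose a hypothetical `agents` model gains a `project_id` column (via `ProjectMixin` or directly); the autogenerated revision under `alembic/versions/` should then contain something like the following (the table name, column name, and types here are illustrative, not taken from the letta codebase):

```python
"""add project_id to agents (illustrative example)"""
import sqlalchemy as sa
from alembic import op


def upgrade() -> None:
    # expect a single add_column matching the new ORM field
    op.add_column("agents", sa.Column("project_id", sa.String(), nullable=True))


def downgrade() -> None:
    op.drop_column("agents", "project_id")
```

If autogenerate emits unrelated drops or renames alongside a change like this, the local database has likely drifted from main (see workflow 5).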

### 2. Data backfill / one-off data migration

1. Make sure the schema change (if any) is already represented in ORM + Alembic.
2. Create a new Alembic revision **without** autogenerate (or edit an
   autogen file) and add Python logic in `upgrade()` that:
   - Uses `op.get_bind()` and SQLAlchemy Core/SQL to backfill data.
3. Keep `downgrade()` simple and safe (ideally reversible).
4. Run against Postgres with `LETTA_PG_URI` set, using `uv run alembic upgrade head`.
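A hedged sketch of step 2: an `upgrade()` that backfills data via `op.get_bind()` and SQLAlchemy Core might look like this (the `agents` table, `project_id` column, and default value are illustrative assumptions, not letta schema):

```python
"""backfill project_id (illustrative example)"""
import sqlalchemy as sa
from alembic import op


def upgrade() -> None:
    bind = op.get_bind()
    # backfill rows that predate the new column; run plain SQL via the
    # migration's connection so the backfill ships with the revision
    bind.execute(
        sa.text("UPDATE agents SET project_id = :default WHERE project_id IS NULL"),
        {"default": "default-project"},
    )


def downgrade() -> None:
    # keep downgrade simple and safe; a pure backfill often needs no reversal
    pass
```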

### 3. Fixing a bad migration

Typical cases:
- Migration fails only on SQLite (ALTER constraint limitations).
- Migration was generated while pointing at SQLite instead of Postgres.

Workflow:
1. Identify the failing revision in `alembic/versions/`.
2. If failure is SQLite-specific, prefer running migrations against Postgres by
   exporting `LETTA_PG_URI` and re-running upgrade.
3. If logic is wrong, create a **new** migration that fixes the problem rather
   than editing an applied revision (especially in shared environments).
4. For purely local/dev history, you can delete and regenerate migrations but
   only if no one else depends on them.

See `references/sqlite-vs-postgres-gotchas.md` for SQLite-specific issues.

### 4. Switching between SQLite and Postgres

Alembic picks the engine based on `letta.settings.DatabaseChoice` and
environment variables.

General rules:
- For local dev stateful runs, `just ready` handles baseline migrations.
- For schema design and production-like migrations, prefer Postgres and set
  `LETTA_PG_URI`.

Workflow for Postgres-targeted migration:
1. `export LETTA_PG_URI=postgresql+pg8000://postgres:postgres@localhost:5432/letta-core`
2. From `apps/core`:
   - `uv run alembic upgrade head`
   - `uv run alembic revision --autogenerate -m "..."`
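Before generating anything, it can help to confirm which revision the target database is actually at; these are standard Alembic subcommands and respect the same engine selection:

```bash
# revision currently applied to the target database
uv run alembic current

# migration history known to the repo
uv run alembic history --verbose

# head revision(s) under alembic/versions
uv run alembic heads
```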

### 5. Resetting local Postgres for clean migration generation

If your local Postgres database has drifted from main (e.g., applied migrations
that no longer exist, or has stale schema), you can reset it to generate a clean
migration.

From the root of your letta-cloud checkout:

```bash
# 1. Stop the running postgres container first, so no process holds the data dir
docker stop $(docker ps -q --filter ancestor=ankane/pgvector)

# 2. Remove the postgres data directory
rm -rf ./data/postgres

# 3. Restart services (creates fresh postgres)
just start-services

# 4. Wait a moment for postgres to be ready, then apply all migrations
cd apps/core
export LETTA_PG_URI=postgresql+pg8000://postgres:postgres@localhost:5432/letta-core
uv run alembic upgrade head

# 5. Now generate your new migration
uv run alembic revision --autogenerate -m "your migration message"
```

This ensures the migration is generated against a clean database state matching
main, avoiding spurious diffs from local-only schema changes.

## Troubleshooting

- **"Target database is not up to date" when autogenerating**
  - First run `uv run alembic upgrade head` (with appropriate engine/URI).
- **SQLite NotImplementedError about ALTER CONSTRAINT**
  - Switch to Postgres by setting `LETTA_PG_URI` and rerun.
- **Autogenerated migration missing expected changes**
  - Ensure ORM imports and metadata (`Base.metadata`) are correct and that the
    changed model is imported in Alembic env context.
- **Autogenerated migration has unexpected drops/renames**
  - Review model changes; consider explicit operations instead of relying on
    autogenerate. Reset local Postgres (see workflow 5) to get a clean baseline.
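The SQLite ALTER-constraint limitation behind the `NotImplementedError` above can be reproduced with the stdlib `sqlite3` module alone (table and constraint names here are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE agents (id INTEGER PRIMARY KEY, project_id INTEGER)")

# SQLite's ALTER TABLE supports only a small subset of operations
# (ADD COLUMN, RENAME, DROP COLUMN); adding a constraint to an existing
# table is rejected, which is why such migrations must target Postgres.
try:
    conn.execute(
        "ALTER TABLE agents ADD CONSTRAINT fk_project "
        "FOREIGN KEY (project_id) REFERENCES projects(id)"
    )
    supported = True
except sqlite3.OperationalError:
    supported = False

print(supported)  # False
```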

## References

- `references/migration-commands.md` — canonical commands for `uv`, Alembic,
  and `just`.
- `references/sqlite-vs-postgres-gotchas.md` — engine-specific pitfalls and
  how to avoid them.

Overview

This skill provides practical workflows and commands for managing Alembic database migrations and schema changes in the letta-cloud core app. It focuses on using uv, just, and the LETTA_PG_URI environment variable to target SQLite or Postgres. Use it to create, apply, backfill, and fix migrations safely and reproducibly. It emphasizes targeting Postgres for production-like migrations and handling SQLite-specific limitations.

How this skill works

Workflows assume you run from apps/core with Alembic configured under alembic and Python commands executed via uv. Use just ready to prepare the environment and optionally export LETTA_PG_URI to point migrations at a local Postgres instance. Autogenerate revisions with uv run alembic revision --autogenerate and apply changes with uv run alembic upgrade head; for data backfills add Python logic to upgrade() using op.get_bind().

When to use it

  • Adding or modifying ORM columns and generating corresponding Alembic revisions.
  • Performing one-off data backfills or in-migration data transformations.
  • Fixing migrations that fail on SQLite or were generated against the wrong engine.
  • Switching migration target between local SQLite and Postgres for accurate diffs.
  • Resetting local Postgres to a clean state before generating new migrations.

Best practices

  • Prefer running autogenerate against Postgres by exporting LETTA_PG_URI to avoid SQLite ALTER limitations.
  • Inspect autogen files in alembic/versions and prefer explicit operations when autogenerate is unclear.
  • Create new corrective migrations instead of editing applied revisions in shared environments.
  • Use just ready before generating migrations to ensure environment and imports are correct.
  • Reset local Postgres when drift causes spurious diffs, then regenerate migrations against the clean DB.

Example use cases

  • Add project_id to multiple tables using an existing ProjectMixin and autogenerate the migration.
  • Implement a one-off backfill to populate a new column using op.get_bind() in upgrade().
  • Resolve a NotImplementedError from SQLite by setting LETTA_PG_URI and rerunning alembic upgrade.
  • Regenerate migrations after local schema drift by wiping local Postgres data and restarting services.
  • Verify autogenerated revisions import the changed models by checking Base.metadata in Alembic env.

FAQ

What if autogenerate reports the target DB is not up to date?

Run uv run alembic upgrade head against the intended engine (set LETTA_PG_URI for Postgres) before regenerating revisions.

Can I edit an applied migration file to fix a bug?

Avoid editing applied revisions in shared environments; create a new migration that corrects the issue. For purely local dev history you may delete and regenerate if no one else depends on them.