
query-optimizer skill

/skills/lxgicstudios/query-optimizer

This skill analyzes SQL and Prisma queries, provides index recommendations, and rewrites queries to boost database performance.

npx playbooks add skill openclaw/skills --skill query-optimizer

Review the files below or copy the command above to add this skill to your agents.

SKILL.md
---
name: query-optimizer
description: Optimize SQL and Prisma queries using AI. Use when your queries are slow and you need performance help.
---

# Query Optimizer

Slow queries killing your app? Paste your SQL or Prisma code and get optimization suggestions. Index recommendations, query rewrites, N+1 detection. The stuff that takes hours to figure out manually.

**One command. Minimal setup. Just works.**

## Quick Start

```bash
npx ai-query-optimize "SELECT * FROM users WHERE email LIKE '%@gmail.com'"
```

## What It Does

- Analyzes SQL and Prisma queries for performance issues
- Suggests missing indexes with CREATE INDEX statements
- Rewrites queries to avoid common antipatterns
- Detects N+1 problems in ORM code
- Explains why changes improve performance
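The N+1 detection above targets a pattern like the following. This is a minimal sketch with a mock in-memory `db`; the `db` object and its methods are hypothetical stand-ins for illustration, not the Prisma API:

```typescript
// Mock data layer that counts how many queries each approach issues.
let queryCount = 0;

const posts = [
  { id: 1, authorId: 10 },
  { id: 2, authorId: 11 },
  { id: 3, authorId: 10 },
];
const authors = new Map([
  [10, { id: 10, name: "Ada" }],
  [11, { id: 11, name: "Linus" }],
]);

const db = {
  findPosts() {
    queryCount++;
    return posts;
  },
  findAuthor(id: number) {
    queryCount++; // one query per row: this is the N+1 pattern
    return authors.get(id);
  },
  findAuthorsByIds(ids: number[]) {
    queryCount++; // single batched query for all rows at once
    return ids.map((id) => authors.get(id));
  },
};

// N+1: 1 query for the posts + 1 query per post for its author.
queryCount = 0;
for (const post of db.findPosts()) {
  db.findAuthor(post.authorId);
}
const nPlusOneQueries = queryCount; // 1 + 3

// Batched: 2 queries total, regardless of row count.
queryCount = 0;
const fetched = db.findPosts();
db.findAuthorsByIds([...new Set(fetched.map((p) => p.authorId))]);
const batchedQueries = queryCount;
```

In real Prisma code, the batched version typically corresponds to a single `findMany` with an `in` filter, or an `include` on the relation.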

## Usage Examples

```bash
# Optimize a SQL query
npx ai-query-optimize "SELECT * FROM orders WHERE created_at > NOW() - INTERVAL '30 days'"

# Analyze a Prisma query file
npx ai-query-optimize queries.ts

# Check for N+1 issues
npx ai-query-optimize src/api/users.ts --check-n-plus-one

# Get index recommendations
npx ai-query-optimize schema.sql --suggest-indexes
```

## Best Practices

- **Include your schema** - Context about tables and existing indexes helps a lot
- **Measure before and after** - EXPLAIN ANALYZE doesn't lie
- **Test with real data** - Optimizations that work on 100 rows might fail on 1M
- **Don't optimize prematurely** - Fix actual slow queries, not theoretical ones
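On the "measure before and after" point: the application side can be timed with a trivial harness like the sketch below. The two variants here are hypothetical stand-ins for a query before and after optimization:

```typescript
import { performance } from "node:perf_hooks";

// Hypothetical stand-ins for a query before and after optimization.
function slowVariant(): number {
  // Simulates scanning every row to count matches.
  return [...Array(1000).keys()].filter((n) => n % 7 === 0).length;
}
function fastVariant(): number {
  // Same answer without scanning (multiples of 7 in 0..999, inclusive of 0).
  return Math.floor(999 / 7) + 1;
}

function timeIt(label: string, fn: () => number) {
  const start = performance.now();
  const result = fn();
  const ms = performance.now() - start;
  console.log(`${label}: ${result} rows in ${ms.toFixed(3)} ms`);
  return { result, ms };
}

const before = timeIt("before", slowVariant);
const after = timeIt("after", fastVariant);
```

For the database side, `EXPLAIN ANALYZE` reports the actual plan, row counts, and timings, which is the ground truth for whether an index or rewrite helped.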

## When to Use This

- Database queries are showing up in your APM as slow
- Users are complaining about loading times
- You inherited a codebase with questionable query patterns
- Learning SQL optimization and want to understand the patterns

## Part of the LXGIC Dev Toolkit

This is one of 110+ free developer tools built by LXGIC Studios. No paywalls, no sign-ups, no paid tiers; AI-powered tools like this one use your own OpenAI API key. Just tools that work.

**Find more:**
- GitHub: https://github.com/LXGIC-Studios
- Twitter: https://x.com/lxgicstudios
- Substack: https://lxgicstudios.substack.com
- Website: https://lxgic.dev

## Requirements

No install needed: run directly with npx (Node.js 18+ recommended). Requires the OPENAI_API_KEY environment variable.

```bash
export OPENAI_API_KEY=sk-...
npx ai-query-optimize --help
```

## How It Works

Parses your query or code file to understand the data access patterns. Applies database optimization knowledge to identify issues like missing indexes, expensive operations (LIKE with leading wildcards), and ORM antipatterns. Provides specific, actionable fixes.
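One of the checks mentioned above, LIKE with a leading wildcard, can be sketched as a simple pattern test. This is an illustrative assumption about how such a check might look, not the tool's actual implementation:

```typescript
// Flag LIKE/ILIKE patterns that start with '%', which defeat normal
// B-tree index usage and force a scan.
function hasLeadingWildcardLike(sql: string): boolean {
  return /\bI?LIKE\s+'%/i.test(sql);
}

const slow = "SELECT * FROM users WHERE email LIKE '%@gmail.com'";
const ok = "SELECT * FROM users WHERE email LIKE 'ada%'";
```

A trailing-wildcard pattern like `'ada%'` can still use an index, which is why the direction of the wildcard matters.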

## License

MIT. Free forever. Use it however you want.

Overview

This skill optimizes SQL and Prisma queries using AI to find performance bottlenecks and produce concrete fixes. It generates index recommendations, rewrites expensive queries, and detects common ORM antipatterns like N+1 queries. Use it to get actionable changes you can test and deploy quickly.

How this skill works

The tool parses SQL statements or Prisma/ORM code to map data access patterns and existing schema context. It applies database optimization rules to flag costly operations (full-table scans, leading-wildcard LIKE, missing indexes) and suggests specific CREATE INDEX statements or rewritten queries. For ORM code it detects N+1 patterns and offers code-level fixes, plus explanations for why each recommendation improves performance.
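As a rough illustration of how an index suggestion could be derived from WHERE patterns, here is a hedged sketch. A real implementation would use a proper SQL parser; `suggestIndex` is a hypothetical helper that only handles a trivial single-table query with equality filters:

```typescript
// Pull the table name and equality-filtered columns out of a simple
// single-table SELECT and emit a matching CREATE INDEX statement.
function suggestIndex(sql: string): string | null {
  const table = sql.match(/\bFROM\s+(\w+)/i)?.[1];
  const where = sql.match(/\bWHERE\s+(.+)$/i)?.[1];
  if (!table || !where) return null;
  const cols = [...where.matchAll(/(\w+)\s*=/g)].map((m) => m[1]);
  if (cols.length === 0) return null;
  return `CREATE INDEX idx_${table}_${cols.join("_")} ON ${table} (${cols.join(", ")});`;
}

const suggestion = suggestIndex(
  "SELECT * FROM orders WHERE customer_id = 42 AND status = 'paid'"
);
```

In practice, column order in a composite index matters (equality columns before range columns), which is part of what schema context helps the tool get right.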

When to use it

  • Your APM or logs show slow database queries affecting user experience
  • You inherit a codebase with unclear or inefficient query patterns
  • You need index recommendations tailored to your schema
  • You want to find and fix N+1 problems in ORM code
  • You need quick, testable suggestions before deeper profiling

Best practices

  • Provide your schema or sample DDL to improve recommendation accuracy
  • Measure performance before and after using EXPLAIN ANALYZE or equivalent
  • Test optimizations on realistic data volumes, not just small samples
  • Focus on actual slow queries instead of optimizing everything prematurely
  • Apply changes in a staging environment and monitor query plans

Example use cases

  • Optimize a slow SELECT that scans the whole table by suggesting an index
  • Rewrite queries that use leading-wildcard LIKE into more selective predicates
  • Analyze a Prisma file and detect N+1 loops returning many small queries
  • Generate CREATE INDEX statements tailored to your WHERE and JOIN patterns
  • Quickly vet a new query for scalability before merging into main

FAQ

Do I need to install anything?

No install is required; the tool runs via npx and works with Node.js 18+. Set the OPENAI_API_KEY environment variable before running.

Will it change my database automatically?

No. It produces recommendations and SQL statements for you to review and apply manually after testing.