
This skill helps migrate OpenAI integrations to OpenRouter quickly by providing drop-in compatibility guidance and implementation steps.

npx playbooks add skill jeremylongshore/claude-code-plugins-plus-skills --skill openrouter-openai-compat

Review the files below or copy the command above to add this skill to your agents.

Files (6)
SKILL.md
1.7 KB
---
name: openrouter-openai-compat
description: |
  Configure OpenRouter as an OpenAI API drop-in replacement. Use when migrating from OpenAI or using OpenAI-compatible libraries. Trigger with phrases like 'openrouter openai', 'openrouter drop-in', 'openrouter compatibility', 'migrate to openrouter'.
allowed-tools: Read, Write, Edit, Grep
version: 1.0.0
license: MIT
author: Jeremy Longshore <[email protected]>
---

# OpenRouter OpenAI Compat

## Overview

This skill demonstrates how to point any OpenAI-compatible library or codebase at OpenRouter with minimal changes.

## Prerequisites

- Existing OpenAI integration or familiarity with OpenAI API
- OpenRouter API key

## Instructions

Follow these steps to implement this skill:

1. **Verify Prerequisites**: Ensure all prerequisites listed above are met
2. **Review the Implementation**: Study the code examples and patterns below
3. **Adapt to Your Environment**: Modify configuration values for your setup
4. **Test the Integration**: Run the verification steps to confirm functionality
5. **Monitor in Production**: Set up appropriate logging and monitoring

## Output

Successful execution produces:
- Working OpenRouter integration
- Verified API connectivity
- Example responses demonstrating functionality

## Error Handling

See `{baseDir}/references/errors.md` for comprehensive error handling.

## Examples

See `{baseDir}/references/examples.md` for detailed examples.

## Resources

- [OpenRouter Documentation](https://openrouter.ai/docs)
- [OpenRouter Models](https://openrouter.ai/models)
- [OpenRouter API Reference](https://openrouter.ai/docs/api-reference)
- [OpenRouter Status](https://status.openrouter.ai)

Overview

This skill shows how to configure OpenRouter as a drop-in replacement for the OpenAI API so existing OpenAI-based code and libraries work with minimal changes. It guides you through prerequisites, configuration adjustments, testing, and monitoring to validate the integration in development and production.

How this skill works

Because OpenRouter exposes an OpenAI-compatible API, the skill swaps the OpenRouter base URL and API key in for OpenAI credentials while leaving endpoints, headers, and payload shapes untouched. It provides patterns and small code changes that preserve existing client calls while redirecting traffic to OpenRouter and validating responses, and it recommends verification steps and production monitoring to confirm parity.
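A minimal sketch of that swap, assuming the official `openai` Python SDK (v1+) and an API key stored in an `OPENROUTER_API_KEY` environment variable (the model ID shown is an example; check openrouter.ai/models for the one you need):

```python
import os

from openai import OpenAI

# Point the existing OpenAI client at OpenRouter's OpenAI-compatible endpoint.
# The only changes are the base URL and the API key; request and response
# shapes stay the same.
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],  # instead of OPENAI_API_KEY
)

# Existing call sites keep working; only the model name may need updating to
# an OpenRouter model ID (vendor-prefixed, e.g. "openai/gpt-4o-mini").
response = client.chat.completions.create(
    model="openai/gpt-4o-mini",  # assumed model ID; verify on openrouter.ai/models
    messages=[{"role": "user", "content": "Reply with the single word: pong"}],
)
print(response.choices[0].message.content)
```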

When to use it

  • Migrating services or apps from OpenAI to OpenRouter
  • Using OpenAI-compatible libraries that you do not want to rewrite
  • Testing alternative model providers without changing client code
  • Consolidating API billing or governance under OpenRouter
  • Prototyping multi-provider fallbacks for reliability

Best practices

  • Keep the existing OpenAI client code and override only the base URL and API key to minimize code churn
  • Verify model name compatibility and adjust prompts and request parameters for model differences (temperature, token limits)
  • Add automated tests that compare OpenRouter responses to OpenAI baseline outputs
  • Enable request logging and metrics for latency, error rates, and model outputs
  • Implement retry and exponential backoff for transient network or rate-limit errors (see the sketch after this list)
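For the retry guidance in the last item, a minimal sketch (assuming the `openai` Python SDK v1+ and a `client` configured for OpenRouter as shown earlier) could look like this; the v1 SDK also accepts a `max_retries` client option, but an explicit loop adds logging and custom backoff:

```python
import logging
import random
import time

import openai

logger = logging.getLogger("openrouter")


def chat_with_retry(client, max_attempts=5, **request):
    """Call chat.completions.create with exponential backoff on transient errors."""
    for attempt in range(1, max_attempts + 1):
        try:
            return client.chat.completions.create(**request)
        except (openai.RateLimitError, openai.APIConnectionError, openai.APITimeoutError) as exc:
            if attempt == max_attempts:
                raise
            # Exponential backoff with jitter: 1s, 2s, 4s, ... plus up to 1s of noise.
            delay = 2 ** (attempt - 1) + random.random()
            logger.warning("Transient error (%s), retrying in %.1fs", type(exc).__name__, delay)
            time.sleep(delay)
```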

Example use cases

  • Replace OpenAI credentials with OpenRouter credentials in a web app that uses an OpenAI SDK and test end-to-end responses
  • Configure CI tests to validate prompt outputs after swapping providers
  • Run dual calls to OpenAI and OpenRouter for A/B comparison of model outputs (see the sketch after this list)
  • Use OpenRouter in server-side batch jobs that previously used OpenAI to reduce vendor lock-in
  • Integrate OpenRouter into a microservice that expects OpenAI-compatible request/response shapes
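For the dual-call comparison, a sketch assuming the `openai` Python SDK with both `OPENAI_API_KEY` and `OPENROUTER_API_KEY` set in the environment (model IDs are illustrative):

```python
import os

from openai import OpenAI

PROMPT = [{"role": "user", "content": "Summarize HTTP caching in one sentence."}]

providers = {
    # Default client talks to OpenAI; the second only swaps base URL, key, and model ID.
    "openai": (OpenAI(api_key=os.environ["OPENAI_API_KEY"]), "gpt-4o-mini"),
    "openrouter": (
        OpenAI(
            base_url="https://openrouter.ai/api/v1",
            api_key=os.environ["OPENROUTER_API_KEY"],
        ),
        "openai/gpt-4o-mini",  # assumed OpenRouter ID for the same model
    ),
}

for name, (client, model) in providers.items():
    # temperature=0 keeps the comparison as deterministic as the models allow.
    response = client.chat.completions.create(model=model, messages=PROMPT, temperature=0)
    print(f"--- {name} ({model}) ---")
    print(response.choices[0].message.content)
```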

FAQ

Do I need to change SDKs when switching to OpenRouter?

No. Most OpenAI-compatible SDKs work once you update the base URL and API key, so change that configuration rather than swapping libraries.

How do I handle model name differences?

Verify model availability on OpenRouter and adjust names and prompt parameters. Run small verification prompts to compare behavior and tune temperature or max_tokens as needed.
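One way to check availability before switching is a quick query against OpenRouter's documented models endpoint; this sketch uses `requests`, and the exact response fields (`data`, `id`) are assumed from the OpenRouter docs:

```python
import os

import requests

# List the models OpenRouter currently serves and check for the ID you plan to use.
resp = requests.get(
    "https://openrouter.ai/api/v1/models",
    headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
    timeout=30,
)
resp.raise_for_status()
available = {m["id"] for m in resp.json()["data"]}

candidate = "openai/gpt-4o-mini"  # hypothetical choice; pick from openrouter.ai/models
print(candidate, "available" if candidate in available else "NOT available")
```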