This skill enables real-time streaming with OpenRouter to reduce latency in chat interfaces and improve user responsiveness.
```shell
npx playbooks add skill jeremylongshore/claude-code-plugins-plus-skills --skill openrouter-streaming-setup
```
---
name: openrouter-streaming-setup
description: |
  Implement streaming responses with OpenRouter. Use when building real-time chat interfaces or reducing time-to-first-token. Trigger with phrases like 'openrouter streaming', 'openrouter sse', 'stream response', 'real-time openrouter'.
allowed-tools: Read, Write, Edit, Grep
version: 1.0.0
license: MIT
author: Jeremy Longshore <[email protected]>
---
# OpenRouter Streaming Setup
## Overview
This skill demonstrates how to implement streaming responses for lower perceived latency and real-time display of model output.
## Prerequisites
- An existing OpenRouter integration with a valid API key
- A frontend capable of consuming SSE or other streamed responses
## Instructions
Follow these steps to implement this skill:
1. **Verify Prerequisites**: Ensure all prerequisites listed above are met
2. **Review the Implementation**: Study the code examples and patterns below
3. **Adapt to Your Environment**: Modify configuration values for your setup
4. **Test the Integration**: Run the verification steps to confirm functionality
5. **Monitor in Production**: Set up appropriate logging and monitoring
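The streamed request at the heart of steps 2–4 can be sketched as follows. The endpoint URL and `data:`/`[DONE]` framing follow OpenRouter's OpenAI-compatible chat completions API; the model slug, the `OPENROUTER_API_KEY` variable name, and the `onDelta` callback are illustrative assumptions, not fixed by this skill.

```typescript
// Parse one SSE data line from an OpenRouter stream into its text delta.
// Returns null for comments, keep-alives, and the final "[DONE]" sentinel.
function parseSseDelta(line: string): string | null {
  if (!line.startsWith("data: ")) return null; // comment or keep-alive line
  const payload = line.slice(6).trim();
  if (payload === "[DONE]") return null;       // end-of-stream marker
  try {
    return JSON.parse(payload).choices?.[0]?.delta?.content ?? null;
  } catch {
    return null;                               // ignore malformed chunks
  }
}

// Request a streamed completion and invoke onDelta for each text fragment.
async function streamCompletion(
  prompt: string,
  onDelta: (text: string) => void,
): Promise<void> {
  const res = await fetch("https://openrouter.ai/api/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "openai/gpt-4o-mini", // assumption: substitute any model slug
      stream: true,
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!res.ok || !res.body) throw new Error(`OpenRouter error: ${res.status}`);

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? ""; // keep any partial trailing line buffered
    for (const line of lines) {
      const delta = parseSseDelta(line);
      if (delta) onDelta(delta);
    }
  }
}
```

Keeping the line parser pure makes it easy to unit-test without hitting the network.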
## Output
Successful execution produces:
- Working OpenRouter integration
- Verified API connectivity
- Example responses demonstrating functionality
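Connectivity can be verified with a plain GET to OpenRouter's public model-listing endpoint before wiring up streaming. The `interpretStatus` helper and the status-to-message mapping below are hypothetical conveniences, not part of the OpenRouter API.

```typescript
// Map common HTTP statuses from a connectivity probe to human-readable hints.
// The specific messages are illustrative; adapt them to your logging style.
function interpretStatus(status: number): string {
  if (status === 200) return "ok";
  if (status === 401) return "unauthorized: check OPENROUTER_API_KEY";
  if (status === 429) return "rate limited: retry later";
  return `unexpected status ${status}`;
}

// Probe the models endpoint as a lightweight connectivity check.
async function verifyConnectivity(): Promise<string> {
  const res = await fetch("https://openrouter.ai/api/v1/models", {
    headers: { Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}` },
  });
  return interpretStatus(res.status);
}
```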
## Error Handling
See `{baseDir}/references/errors.md` for comprehensive error handling.
## Examples
See `{baseDir}/references/examples.md` for detailed examples.
## Resources
- [OpenRouter Documentation](https://openrouter.ai/docs)
- [OpenRouter Models](https://openrouter.ai/models)
- [OpenRouter API Reference](https://openrouter.ai/docs/api-reference)
- [OpenRouter Status](https://status.openrouter.ai)
## About
This skill implements streaming responses using OpenRouter to reduce time-to-first-token and enable real-time chat interfaces. It provides a clear integration pattern and practical steps for connecting a backend to an SSE-capable frontend. Use it to deliver progressive model output to users while the full response is still being generated.
The skill configures an OpenRouter client to request streamed output and exposes server endpoints that forward chunked events to the frontend via Server-Sent Events (SSE) or similar streaming transports. It includes verification steps to confirm API connectivity, sample request/response flows, and guidance for adapting configuration values to your environment. Error and connection handling patterns are provided so the stream remains robust in production.
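A minimal forwarding endpoint of the kind described above might look like this, using Node's built-in `http` module. The `/chat` route, the event names (`delta`, `done`, `error`), and the injected `streamCompletion` helper are assumptions for the sketch; a real handler would also read the prompt from the request rather than hardcoding it.

```typescript
import { createServer, type ServerResponse } from "node:http";

// SSE wire format: an optional "event:" line, a "data:" line, then a blank line.
function formatSseEvent(event: string, data: unknown): string {
  return `event: ${event}\ndata: ${JSON.stringify(data)}\n\n`;
}

// Start a server that forwards upstream model deltas to the browser as SSE.
// streamCompletion is injected: any function that calls onDelta per fragment.
export function startSseServer(
  streamCompletion: (prompt: string, onDelta: (t: string) => void) => Promise<void>,
  port = 3000,
): void {
  createServer(async (req, res: ServerResponse) => {
    if (req.url !== "/chat") {
      res.writeHead(404);
      res.end();
      return;
    }
    res.writeHead(200, {
      "Content-Type": "text/event-stream",
      "Cache-Control": "no-cache",
      Connection: "keep-alive",
    });
    try {
      // Hardcoded prompt for the sketch; read it from the request in practice.
      await streamCompletion("Hello", (delta) =>
        res.write(formatSseEvent("delta", delta)),
      );
      res.write(formatSseEvent("done", {}));
    } catch {
      // Emit a final error event so the client can close and retry cleanly.
      res.write(formatSseEvent("error", { message: "upstream model error" }));
    } finally {
      res.end();
    }
  }).listen(port);
}
```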
## FAQ

**What frontend transports are supported?**
Use Server-Sent Events (SSE), fetch streaming with ReadableStream, or WebSockets. Choose the one that matches your frontend stack and connection requirements.
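As a sketch of the fetch-streaming option, assuming the server emits standard blank-line-delimited SSE events (the `data: ` field prefix is part of the SSE format; the rest of the shape here is illustrative):

```typescript
// Split a buffer into complete SSE events (delimited by blank lines),
// returning any trailing partial event so it can be buffered for later.
function splitSseEvents(buffer: string): { events: string[]; rest: string } {
  const parts = buffer.split("\n\n");
  return { events: parts.slice(0, -1), rest: parts[parts.length - 1] };
}

// Consume an SSE endpoint with fetch + ReadableStream and surface data lines.
async function consumeStream(
  url: string,
  onData: (payload: string) => void,
): Promise<void> {
  const res = await fetch(url);
  if (!res.body) throw new Error("no response body");
  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const { events, rest } = splitSseEvents(buffer);
    buffer = rest; // hold the incomplete tail until the next chunk arrives
    for (const ev of events) {
      // Each event is one or more "field: value" lines; pull the data line.
      const dataLine = ev.split("\n").find((l) => l.startsWith("data: "));
      if (dataLine) onData(dataLine.slice(6));
    }
  }
}
```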
**How do I handle model errors mid-stream?**
Emit a final error event, close the stream cleanly, and provide a short explanatory message. Log details on the server and offer a retry path for the client.
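The client-side retry path can be sketched as capped exponential backoff around whatever function opens the stream; `connectOnce` and the timing constants below are illustrative stand-ins, not prescribed values.

```typescript
// Retry a stream-opening function with capped exponential backoff.
// connectOnce should resolve on a clean close and reject on a mid-stream error.
async function withRetry(
  connectOnce: () => Promise<void>,
  maxAttempts = 3,
  baseDelayMs = 1000,
): Promise<void> {
  for (let attempt = 1; ; attempt++) {
    try {
      return await connectOnce();
    } catch (err) {
      if (attempt >= maxAttempts) throw err; // give up after the final attempt
      const delayMs = Math.min(baseDelayMs * 2 ** (attempt - 1), 8000);
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}
```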