This skill helps you install the Vercel AI SDK and build a streaming chat interface with the useChat hook, so you can get an AI-enabled UI running quickly.
```bash
npx playbooks add skill andrelandgraf/fullstackrecipes --skill ai-sdk-setup
```
Review the files below or copy the command above to add this skill to your agents.
---
name: ai-sdk-setup
description: Install the Vercel AI SDK with AI Elements components. Build a streaming chat interface with the useChat hook.
---
# AI SDK & Simple Chat
To set up AI SDK & Simple Chat, refer to the fullstackrecipes MCP server resource:
**Resource URI:** `recipe://fullstackrecipes.com/ai-sdk-setup`
If the MCP server is not configured, fetch the recipe directly:
```bash
curl -H "Accept: text/plain" https://fullstackrecipes.com/api/recipes/ai-sdk-setup
```
This skill guides you through installing the Vercel AI SDK and AI Elements components, then building a streaming chat interface using the useChat hook. It focuses on a minimal, production-ready pattern that integrates with Shadcn UI primitives and common full-stack deployment flows. Follow the steps to get a streaming, incremental chat UI with server-side handlers and client hooks.
Install the Vercel AI SDK and the AI Elements package, wire up the SDK on the server to handle model calls, and use the client-side useChat hook to manage streaming messages and UI state. The hook exposes streaming message state and callbacks for partial responses, so the interface updates incrementally as the model streams tokens. The recipe includes server endpoints (or MCP links) and component examples compatible with TypeScript and Shadcn styling.
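If it helps to see the shape of the pattern before fetching the recipe, the sketches below assume a Next.js App Router project and AI SDK v4-style APIs (streamText on the server, useChat from `@ai-sdk/react` on the client); the package set, the `/api/chat` route path, and the model name are illustrative, so check the recipe for the exact commands and imports for your setup.

```bash
# Core SDK, React bindings, and one example provider (swap @ai-sdk/openai for your model vendor)
# AI Elements components are added separately via their own installer -- see the recipe
npm install ai @ai-sdk/react @ai-sdk/openai
```

A minimal server handler streams the model response back to the client:

```ts
// app/api/chat/route.ts -- server-only route handler, so the API key never reaches client bundles
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';

export async function POST(req: Request) {
  const { messages } = await req.json();

  // Start the model call; the result can be streamed back incrementally
  const result = streamText({
    model: openai('gpt-4o-mini'),
    messages,
  });

  // Return a streaming response in the format the useChat hook consumes
  return result.toDataStreamResponse();
}
```

On the client, useChat manages the message list and streams assistant tokens into it as they arrive:

```tsx
// app/chat/page.tsx -- client component driving the chat UI
'use client';

import { useChat } from '@ai-sdk/react';

export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: '/api/chat', // must match the server route above
  });

  return (
    <div>
      {messages.map((message) => (
        <div key={message.id}>
          <strong>{message.role}: </strong>
          {message.content}
        </div>
      ))}
      <form onSubmit={handleSubmit}>
        <input
          value={input}
          onChange={handleInputChange}
          placeholder="Say something..."
        />
      </form>
    </div>
  );
}
```

From here, the plain markup can be swapped for AI Elements or Shadcn components without changing the data flow.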
**Do I need a specific hosting provider?**
No. The SDK works anywhere you can host a server endpoint, but Vercel offers optimized patterns and edge functions that align with the SDK.
**How do I protect my API key?**
Always keep keys on the server. Use environment variables and server-only endpoints for model requests; never expose keys in client bundles.
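As a sketch of that, assuming the `@ai-sdk/openai` provider and an `OPENAI_API_KEY` environment variable (names vary by provider):

```ts
// lib/ai.ts -- server-only module; the key is read from the environment, never shipped to the browser
import { createOpenAI } from '@ai-sdk/openai';

export const openai = createOpenAI({
  // Set OPENAI_API_KEY in .env.local or in your host's environment settings
  apiKey: process.env.OPENAI_API_KEY,
});
```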