ai-sdk-5 skill

npx playbooks add skill prowler-cloud/prowler --skill ai-sdk-5
---
name: ai-sdk-5
description: >
  Vercel AI SDK 5 patterns.
  Trigger: When building AI features with AI SDK v5 (chat, streaming, tools/function calling, UIMessage parts), including migration from v4.
license: Apache-2.0
metadata:
  author: prowler-cloud
  version: "1.0"
  scope: [root, ui]
  auto_invoke: "Building AI chat features"
allowed-tools: Read, Edit, Write, Glob, Grep, Bash, WebFetch, WebSearch, Task
---

## Breaking Changes from AI SDK 4

```typescript
// ❌ AI SDK 4 (OLD)
import { useChat } from "ai";
const { messages, handleSubmit, input, handleInputChange } = useChat({
  api: "/api/chat",
});

// ✅ AI SDK 5 (NEW)
import { useChat } from "@ai-sdk/react";
import { DefaultChatTransport } from "ai";

const { messages, sendMessage } = useChat({
  transport: new DefaultChatTransport({ api: "/api/chat" }),
});
```

## Client Setup

```typescript
import { useChat } from "@ai-sdk/react";
import { DefaultChatTransport } from "ai";
import { useState } from "react";

export function Chat() {
  const [input, setInput] = useState("");

  // v5 replaced isLoading with a status string
  const { messages, sendMessage, status, error } = useChat({
    transport: new DefaultChatTransport({ api: "/api/chat" }),
  });

  const isBusy = status === "submitted" || status === "streaming";

  const handleSubmit = (e: React.FormEvent) => {
    e.preventDefault();
    if (!input.trim()) return;
    sendMessage({ text: input });
    setInput("");
  };

  return (
    <div>
      <div>
        {messages.map((message) => (
          <Message key={message.id} message={message} />
        ))}
      </div>

      <form onSubmit={handleSubmit}>
        <input
          value={input}
          onChange={(e) => setInput(e.target.value)}
          placeholder="Type a message..."
          disabled={isBusy}
        />
        <button type="submit" disabled={isBusy}>
          Send
        </button>
      </form>

      {error && <div>Error: {error.message}</div>}
    </div>
  );
}
```

## UIMessage Structure (v5)

```typescript
// ❌ Old: message.content was a string
// ✅ New: message.parts is an array

interface UIMessage {
  id: string;
  role: "system" | "user" | "assistant";
  parts: UIMessagePart[];
}

// Simplified: the real v5 union also includes reasoning, source,
// data, and step-start parts
type UIMessagePart =
  | { type: "text"; text: string }
  | { type: "file"; mediaType: string; url: string }
  | {
      type: `tool-${string}`; // e.g. "tool-getWeather"
      toolCallId: string;
      state: "input-streaming" | "input-available" | "output-available" | "output-error";
      input: unknown;
      output?: unknown;
    };

// Extract text from parts
function getMessageText(message: UIMessage): string {
  return message.parts
    .filter((part): part is { type: "text"; text: string } => part.type === "text")
    .map((part) => part.text)
    .join("");
}

// Render message parts by type
function Message({ message }: { message: UIMessage }) {
  return (
    <div className={message.role === "user" ? "user" : "assistant"}>
      {message.parts.map((part, index) => {
        if (part.type === "text") {
          return <p key={index}>{part.text}</p>;
        }
        if (part.type === "file" && part.mediaType.startsWith("image/")) {
          return <img key={index} src={part.url} alt="" />;
        }
        return null;
      })}
    </div>
  );
}
```

## Server-Side (Route Handler)

```typescript
// app/api/chat/route.ts
import { openai } from "@ai-sdk/openai";
import { convertToModelMessages, streamText, type UIMessage } from "ai";

export async function POST(req: Request) {
  const { messages }: { messages: UIMessage[] } = await req.json();

  // v5: streamText starts immediately; no await needed
  const result = streamText({
    model: openai("gpt-4o"),
    system: "You are a helpful assistant.",
    // Convert UIMessages from useChat into model messages
    messages: convertToModelMessages(messages),
  });

  // v5 renamed toDataStreamResponse to toUIMessageStreamResponse
  return result.toUIMessageStreamResponse();
}
```

## With LangChain

```typescript
// app/api/chat/route.ts
import { toUIMessageStream } from "@ai-sdk/langchain";
import { createUIMessageStreamResponse } from "ai";
import { ChatOpenAI } from "@langchain/openai";
import { HumanMessage, AIMessage } from "@langchain/core/messages";

export async function POST(req: Request) {
  const { messages } = await req.json();

  const model = new ChatOpenAI({ model: "gpt-4o" });

  // Convert UIMessage parts to LangChain message objects
  const langchainMessages = messages.map((m) => {
    const text = m.parts
      .filter((p) => p.type === "text")
      .map((p) => p.text)
      .join("");
    return m.role === "user"
      ? new HumanMessage(text)
      : new AIMessage(text);
  });

  const stream = await model.stream(langchainMessages);

  // Wrap the adapted stream in a UIMessage stream response
  return createUIMessageStreamResponse({
    stream: toUIMessageStream(stream),
  });
}
```

## Streaming with Tools

```typescript
import { openai } from "@ai-sdk/openai";
import { streamText, tool } from "ai";
import { z } from "zod";

const result = streamText({
  model: openai("gpt-4o"),
  messages,
  tools: {
    getWeather: tool({
      description: "Get weather for a location",
      // v5 renamed `parameters` to `inputSchema`
      inputSchema: z.object({
        location: z.string().describe("City name"),
      }),
      execute: async ({ location }) => {
        // Fetch weather data here
        return { temperature: 72, condition: "sunny" };
      },
    }),
  },
});
```

## useCompletion (Text Generation)

```typescript
import { useCompletion } from "@ai-sdk/react";

// useCompletion is configured with an api path (no chat transport needed)
const { completion, complete, isLoading } = useCompletion({
  api: "/api/complete",
});

// Trigger completion
await complete("Write a haiku about");
```

## Error Handling

```typescript
const { error, messages, sendMessage, regenerate } = useChat({
  transport: new DefaultChatTransport({ api: "/api/chat" }),
  onError: (error) => {
    console.error("Chat error:", error);
    toast.error("Failed to send message");
  },
});

// Display error with a retry that re-runs the last request
{error && (
  <div className="error">
    {error.message}
    <button onClick={() => regenerate()}>Retry</button>
  </div>
)}
```

Overview

This skill documents practical patterns for building AI features with the Vercel AI SDK v5. It explains migration from v4, the new UIMessage parts model, client and server setup, streaming with tools, LangChain integration, and completion hooks. Follow these patterns to upgrade chat, streaming, and function-calling flows reliably.

How this skill works

It shows how the v5 client configures transports (e.g. DefaultChatTransport) and exposes hooks like useChat and useCompletion with sendMessage/complete semantics. Messages are now UIMessage objects whose parts array carries text, file, and per-tool parts instead of a single content string. Server routes stream responses using streamText or the LangChain adapter and can register typed tools for streaming tool invocations.
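As a small illustration of the v5 hook surface, the sketch below models the status values that useChat reports in place of v4's isLoading boolean; the status names follow the v5 docs, while the helper itself is a hypothetical convenience:

```typescript
// v5 useChat reports a status string instead of an isLoading boolean
type ChatStatus = "submitted" | "streaming" | "ready" | "error";

// Hypothetical helper: disable inputs while a request is in flight
function isBusy(status: ChatStatus): boolean {
  return status === "submitted" || status === "streaming";
}
```

In a form, this would drive something like `<input disabled={isBusy(status)} />`.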

When to use it

  • Migrating an existing app from AI SDK v4 to v5.
  • Building chat UI with streaming assistant responses and retry/error UX.
  • Adding tool/function calling with typed parameters and live tool results.
  • Integrating LangChain models while preserving UIMessage streams.
  • Implementing text-only completions using useCompletion against a dedicated completion endpoint.

Best practices

  • Adopt DefaultChatTransport for explicit transport configuration instead of the legacy api prop on useChat.
  • Treat messages as arrays of parts: extract text with a helper that joins text parts and render images and tool outputs separately.
  • Register tools with explicit Zod inputSchema definitions and descriptions to enable safe, typed tool calling during streaming.
  • Stream on the server using streamText or toUIMessageStream for LangChain to preserve parts and incremental updates.
  • Implement onError and UI retry flows; capture last user input for quick resubmits and show clear error messages.

Example use cases

  • Simple chat UI: use useChat with DefaultChatTransport, map messages to Message components rendering parts.
  • Streaming tool-enabled assistant: server registers tools (with zod) and uses streamText so tool calls and results appear incrementally in the UI.
  • LangChain adapter: convert UIMessage parts to HumanMessage/AIMessage for LangChain then return toUIMessageStream to the client.
  • Text generation endpoint: use useCompletion and call complete(prompt) to get single-shot completions.
  • Robust error handling: supply onError to useChat and show a retry button that calls regenerate() to re-run the last request.

FAQ

How do I migrate messages from v4 to v5?

Convert message.content strings into message.parts arrays. Use text parts for plain text and join text parts when you need concatenated text for models or LangChain.
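A minimal migration sketch (types simplified to text-only parts; migrateMessage is a hypothetical helper name, not an SDK export):

```typescript
// v4 shape: content was a plain string
type V4Message = { id: string; role: "system" | "user" | "assistant"; content: string };

// v5 shape (simplified): a parts array; only text parts shown here
type TextPart = { type: "text"; text: string };
type V5Message = { id: string; role: V4Message["role"]; parts: TextPart[] };

// Wrap the old content string in a single text part
function migrateMessage(m: V4Message): V5Message {
  return { id: m.id, role: m.role, parts: [{ type: "text", text: m.content }] };
}
```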

How are tool calls represented in messages?

In v5, each tool invocation appears as a part whose type embeds the tool name (e.g. 'tool-getWeather') and carries a toolCallId, a state that progresses from 'input-streaming'/'input-available' to 'output-available' (or 'output-error'), plus input and output payloads. Use toolCallId to correlate a call with its incremental outputs.
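A hedged sketch of that correlation: since every tool part carries a toolCallId, updates for one call can be gathered regardless of streaming state (the part shape is simplified and partsForCall is a hypothetical helper):

```typescript
// Simplified tool part shape for illustration
type ToolPart = { type: string; toolCallId: string; state?: string; output?: unknown };

// Hypothetical helper: gather all streamed parts belonging to one tool call
function partsForCall(parts: ToolPart[], toolCallId: string): ToolPart[] {
  return parts.filter((p) => p.toolCallId === toolCallId);
}
```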