
pica-vercel-ai-sdk skill


This skill helps you integrate PICA MCP with the Vercel AI SDK so AI agents can call external tools.

npx playbooks add skill picahq/skills --skill pica-vercel-ai-sdk


SKILL.md
---
name: pica-vercel-ai-sdk
description: Integrate PICA into an application via MCP. Use when adding PICA tools, connecting PICA to an AI agent, setting up PICA MCP with Vercel AI SDK, or when the user mentions PICA.
---

# PICA MCP Integration

PICA provides a unified API platform that connects AI agents to third-party services (CRMs, email, calendars, databases, etc.) through MCP tool calling.

## PICA MCP Server

PICA exposes its capabilities through an MCP server distributed as `@picahq/mcp`. It uses **stdio transport** — it runs as a local subprocess via `npx`.

### MCP Configuration

```json
{
  "mcpServers": {
    "pica": {
      "command": "npx",
      "args": ["@picahq/mcp"],
      "env": {
        "PICA_SECRET": "your-pica-secret-key"
      }
    }
  }
}
```

- **Package**: `@picahq/mcp` (run via `npx`, no install needed)
- **Auth**: `PICA_SECRET` environment variable (obtain from the PICA dashboard https://app.picaos.com/settings/api-keys)
- **Transport**: stdio (standard input/output)

### Environment Variable

Always store the PICA secret in an environment variable, never hardcode it:

```
PICA_SECRET=sk_test_...
```

Add it to `.env.local` (or equivalent) and document it in `.env.example`.
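A small startup guard catches a missing secret early instead of failing mid-request. This is a sketch; the `requireEnv` helper is hypothetical, not part of PICA or the AI SDK:

```typescript
// Hypothetical helper: read a required environment variable or fail fast
// with a pointer to where it should be configured.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`${name} is not set; add it to .env.local`);
  }
  return value;
}

// Usage: call once at startup so a missing key fails loudly.
// const PICA_SECRET = requireEnv('PICA_SECRET');
```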

## Using PICA with Vercel AI SDK

The Vercel AI SDK provides MCP client support via the `@ai-sdk/mcp` package. Install it:

```bash
npm install @ai-sdk/mcp
```

### CRITICAL: Version alignment

`@ai-sdk/mcp`, `ai`, `@ai-sdk/react`, `@ai-sdk/openai`, `@ai-sdk/anthropic`, and all other `@ai-sdk/*` packages **must resolve to the same `@ai-sdk/provider-utils` major version**. If they don't, tool schemas created by the MCP client use different internal symbols than those expected by `streamText`, causing validation errors like:

```
Invalid input for tool ...: Type validation failed: Value: {}.
Error message: Cannot read properties of undefined (reading 'validate')
```

**How to diagnose:** look for multiple copies of `@ai-sdk/provider-utils`:

```bash
find node_modules -path "*/provider-utils/package.json" \
  -exec grep -H '"version":' {} \;
```

If you see two different versions (e.g. `3.x` at root and `4.x` nested inside `@ai-sdk/mcp/node_modules/`), **upgrade all `@ai-sdk/*` and `ai` packages together** so they share a single version:

```bash
npm install ai@latest @ai-sdk/react@latest @ai-sdk/openai@latest @ai-sdk/anthropic@latest @ai-sdk/mcp@latest
```

You may also need to upgrade `react`/`react-dom` if peer dependency constraints require it.
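If upgrading every package at once is blocked by another dependency, npm's `overrides` field in `package.json` can force a single resolution for `@ai-sdk/provider-utils`. A sketch (the pinned version is illustrative; match it to whatever your upgraded packages expect):

```json
{
  "overrides": {
    "@ai-sdk/provider-utils": "^4.0.0"
  }
}
```

pnpm uses `pnpm.overrides` and Yarn uses `resolutions` for the same purpose.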

### Before implementing: look up the latest docs

The Vercel AI SDK MCP API may change between versions. **Always search the bundled docs first** to get the current API before writing any code. For detailed instructions on where to find the docs and what to look for, see [vercel-ai-sdk-mcp-reference.md](vercel-ai-sdk-mcp-reference.md).

### Integration pattern

1. **Create an MCP client** using stdio transport pointed at `npx @picahq/mcp`
2. **Get tools** from the client — they are automatically converted to AI SDK tool format
3. **Pass tools** to `streamText()` or `generateText()`
4. **Enable multi-step execution** with `stopWhen: stepCountIs(N)` so the model can call tools and process results
5. **Close the MCP client** when the response is finished

When passing environment variables to the stdio transport, spread `process.env` so the subprocess inherits the full environment (PATH, etc.):

```typescript
env: {
  ...process.env as Record<string, string>,
  PICA_SECRET: process.env.PICA_SECRET!,
}
```

### Working example (API route)

```typescript
import {
  streamText,
  UIMessage,
  convertToModelMessages,
  stepCountIs,
} from 'ai';
import { createMCPClient } from '@ai-sdk/mcp';
import { Experimental_StdioMCPTransport as StdioMCPTransport } from '@ai-sdk/mcp/mcp-stdio';

export async function POST(req: Request) {
  const { messages }: { messages: UIMessage[] } = await req.json();

  const mcpClient = await createMCPClient({
    transport: new StdioMCPTransport({
      command: 'npx',
      args: ['@picahq/mcp'],
      env: {
        ...process.env as Record<string, string>,
        PICA_SECRET: process.env.PICA_SECRET!,
      },
    }),
  });

  const tools = await mcpClient.tools();

  const result = streamText({
    model: yourModel,
    // convertToModelMessages is async in recent `ai` releases
    messages: await convertToModelMessages(messages),
    tools,
    stopWhen: stepCountIs(5),
    onFinish: async () => {
      await mcpClient.close();
    },
  });

  return result.toUIMessageStreamResponse({
    sendSources: true,
    sendReasoning: true,
  });
}
```
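For non-streaming calls (`generateText`) there is no `onFinish` hook, so the close-after-use rule falls to `try/finally`. A generic helper like the following (hypothetical, not part of any SDK) guarantees the client is closed even when generation throws:

```typescript
// Hypothetical helper: run async work against a closable client and
// guarantee close() runs afterwards, mirroring try/finally at call sites.
async function withClient<C extends { close(): Promise<void> }, T>(
  client: C,
  work: (client: C) => Promise<T>,
): Promise<T> {
  try {
    return await work(client);
  } finally {
    await client.close();
  }
}
```

A call site then becomes `await withClient(mcpClient, async (c) => generateText({ model: yourModel, tools: await c.tools(), stopWhen: stepCountIs(5), prompt }))`.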

### UI rendering of PICA tool calls

PICA tools appear as **dynamic tool parts** in the AI SDK's `UIMessage` system (`type: "dynamic-tool"` with a `toolName` field). Standard AI SDK tools appear as typed tool parts whose `type` is prefixed with `tool-` (e.g. `tool-getWeather`). Handle both in the message parts renderer:

```tsx
// In the message parts switch/map:
if (part.type.startsWith('tool-') || part.type === 'dynamic-tool') {
  const toolPart = part as any;
  // toolPart.toolName — the MCP tool name (e.g. "list_pica_integrations")
  // toolPart.state — "input-streaming" | "input-available" | "output-available" | "output-error"
  // toolPart.input — tool input parameters
  // toolPart.output — tool result (when state is "output-available")
}
```

Pass `toolPart.toolName` as the display title so PICA tools show their actual name rather than the generic part type.
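The display-title logic can be isolated into a small pure helper. This is a sketch; the function name is made up, and the part shapes follow the fields listed above:

```typescript
// Hypothetical helper: pick a human-readable title for a tool part.
// Dynamic (MCP) parts carry toolName; static parts encode the name in the type.
function toolDisplayTitle(part: { type: string; toolName?: string }): string {
  if (part.type === 'dynamic-tool') {
    return part.toolName ?? 'dynamic-tool';
  }
  if (part.type.startsWith('tool-')) {
    return part.type.slice('tool-'.length);
  }
  return part.type;
}
```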

## Checklist

When setting up PICA MCP in a project:

- [ ] `@ai-sdk/mcp` is installed
- [ ] **All `@ai-sdk/*` and `ai` packages share the same `@ai-sdk/provider-utils` version** (see Version alignment above)
- [ ] `PICA_SECRET` is set in `.env.local` (or equivalent)
- [ ] `.env.example` documents the `PICA_SECRET` variable
- [ ] MCP client uses stdio transport with `npx @picahq/mcp`
- [ ] Full `process.env` is spread into the transport's `env` option
- [ ] `convertToModelMessages` is awaited (it is async in recent `ai` releases)
- [ ] Multi-step execution is configured with `stopWhen: stepCountIs(N)`
- [ ] MCP client is closed after use (`onFinish` for streaming, `try/finally` for non-streaming)
- [ ] UI handles both `tool-`-prefixed and `"dynamic-tool"` parts for rendering tool calls

## Additional resources

- For Vercel AI SDK MCP specifics and where to find the latest docs, see [vercel-ai-sdk-mcp-reference.md](vercel-ai-sdk-mcp-reference.md)

Overview

This skill shows how to integrate the PICA MCP server into applications using the Vercel AI SDK. It explains creating an MCP client via stdio (npx @picahq/mcp), wiring environment secrets securely, and passing PICA tools to the AI SDK for tool calling and UI rendering. It focuses on practical steps, version alignment, and common pitfalls.

How this skill works

The integration runs the PICA MCP server locally as a subprocess using stdio transport (npx @picahq/mcp) and exposes PICA tools through the MCP client. Tools returned by the client are converted to AI SDK tool format and supplied to streamText()/generateText(), enabling the model to call PICA tools in multi-step workflows. The subprocess inherits environment variables (including PICA_SECRET), and the MCP client must be closed after use.

When to use it

  • Adding PICA tools to an AI agent that needs third-party service access
  • Connecting PICA to the Vercel AI SDK via MCP tool calling
  • Implementing multi-step agent workflows where the model calls external tools
  • Setting up a project that references PICA or when the user mentions PICA integration
  • Debugging validation errors related to mismatched @ai-sdk package versions

Best practices

  • Store PICA_SECRET in environment variables (.env.local) and never hardcode it
  • Run PICA MCP with stdio transport via npx @picahq/mcp so no global install is required
  • Spread full process.env into the transport env so PATH and other vars are preserved
  • Ensure all @ai-sdk/* and ai packages resolve to the same @ai-sdk/provider-utils major version
  • Await convertToModelMessages on recent ai releases and close the MCP client (onFinish or try/finally)

Example use cases

  • API route that creates an MCP client, fetches tools, and streams model output while allowing tool calls
  • Agent that lists integrations, queries a CRM via PICA tools, and summarizes results for the user
  • UI message renderer that displays dynamic PICA tool parts (dynamic-tool) and tool-invocation parts together
  • Upgrade checklist and diagnostic steps when encountering tool validation errors due to mismatched package versions

FAQ

How do I provide the PICA secret to the subprocess?

Set PICA_SECRET in your environment (.env.local) and spread process.env into the MCP transport env so the subprocess inherits it.

Why do I get validation errors like "Type validation failed"?

This usually means multiple versions of @ai-sdk/provider-utils exist. Upgrade all ai and @ai-sdk/* packages together so they share the same provider-utils major version.