
hokipoki skill


This skill lets you route tasks to Claude, Codex, or Gemini via the HokiPoki CLI, enabling quick model hopping and second opinions.

npx playbooks add skill openclaw/skills --skill hokipoki

Review the files below or copy the command above to add this skill to your agents.

SKILL.md
---
name: hokipoki
description: "Switch AI models without switching tabs using the HokiPoki CLI. Hop between Claude, Codex, and Gemini when one gets stuck. Use when the user wants to request help from a different AI model, hop to another AI, get a second opinion from another model, switch models, share AI subscriptions with teammates, or manage HokiPoki provider/listener mode. Triggers on: 'use codex/gemini for this', 'hop to another model', 'ask another AI', 'get a second opinion', 'switch models', 'hokipoki', 'listen for requests'."
---

# HokiPoki Skill

Route tasks to different AI CLIs (Claude, Codex, Gemini) via the HokiPoki P2P network. API keys never leave the provider's machine; only encrypted requests and results are exchanged.

## Prerequisites

HokiPoki CLI must be installed and authenticated:

```bash
npm install -g @next-halo/hokipoki-cli
hokipoki login
```

Verify with `hokipoki whoami`. If not installed, guide the user through setup.
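Before sending any request, a small pre-flight guard can confirm the CLI is actually on PATH. This is a sketch; `require_cli` is a helper name introduced here, not part of HokiPoki:

```shell
# require_cli NAME: print "NAME: ok" if NAME is on PATH, otherwise
# print "NAME: missing" to stderr and return non-zero.
require_cli() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1: ok"
  else
    echo "$1: missing" >&2
    return 1
  fi
}

# Gate the rest of a script on the CLI being present.
require_cli hokipoki || echo "Run: npm install -g @next-halo/hokipoki-cli && hokipoki login"
```

The same guard works for any other binary a script depends on (git, docker, jq).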

## Requesting Help from Another AI

Send a task to a remote AI model. Always use `--json` for parseable output:

```bash
# Specific files
hokipoki request --tool claude --task "Fix the auth bug" --files src/auth.ts --json

# Entire directory
hokipoki request --tool codex --task "Add error handling" --dir src/services/ --json

# Whole project (respects .gitignore)
hokipoki request --tool gemini --task "Review for security issues" --all --json

# Route to a team workspace
hokipoki request --tool claude --task "Optimize queries" --files src/db.ts --workspace my-team --json

# Skip auto-apply (just save the patch)
hokipoki request --tool codex --task "Refactor module" --dir src/ --no-auto-apply --json
```
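When scripting around `--json`, it is worth validating that the response is well-formed JSON before acting on it. A minimal sketch, where the stand-in string below replaces a real `hokipoki request … --json` call and no particular field names in the payload are assumed:

```shell
# Stand-in for: resp="$(hokipoki request --tool claude --task "..." --files src/auth.ts --json)"
resp='{"status": "ok"}'

# python3 -m json.tool exits non-zero on malformed input, making it a cheap validity gate.
if printf '%s' "$resp" | python3 -m json.tool >/dev/null 2>&1; then
  echo "valid json"
else
  echo "unparseable response" >&2
fi
```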

Tool selection: if the user doesn't specify a tool, ask which model to use or omit `--tool` to let HokiPoki choose.

### Patch Auto-Apply

Patches auto-apply when the target directory is a git repo with committed files. If auto-apply fails, inform the user and suggest:

```bash
git init && git add . && git commit -m "initial"
```
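Before sending a request whose patch you expect to auto-apply, checking both preconditions up front can save a round trip. A sketch; `autoapply_ready` is a helper name made up here, not a HokiPoki command:

```shell
# autoapply_ready: succeed only when the current directory is inside a git
# work tree AND the repo has at least one commit (HEAD resolves).
autoapply_ready() {
  git rev-parse --is-inside-work-tree >/dev/null 2>&1 \
    && git rev-parse HEAD >/dev/null 2>&1
}

if autoapply_ready; then
  echo "auto-apply ready"
else
  echo "not ready: run  git init && git add . && git commit -m \"initial\""
fi
```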

## Provider Mode (Sharing Your AI)

Register and listen for incoming requests:

```bash
# Register as a provider (one-time)
hokipoki register --as-provider --tools claude codex gemini

# Start listening
hokipoki listen --tools claude codex
```

Tasks execute in isolated Docker containers (read-only filesystem, tmpfs workspace, auto-cleanup). Docker must be running.
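Because listener mode depends on a running Docker daemon, a pre-flight gate like the following avoids starting a listener that will fail on its first task. A sketch; `docker_ready` is an illustrative helper, and the then-branch only prints the command rather than starting a long-running listener:

```shell
# docker_ready: succeed only if the Docker daemon answers. `docker info`
# exits non-zero both when the CLI is missing and when the daemon is down.
docker_ready() {
  docker info >/dev/null 2>&1
}

if docker_ready; then
  echo "docker ready; safe to run: hokipoki listen --tools claude codex"
else
  echo "Docker daemon is not reachable; start Docker and retry." >&2
fi
```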

## Status & Account

```bash
hokipoki whoami      # Current user info
hokipoki status      # Account, workspaces, history
hokipoki dashboard   # Open web dashboard in browser
```

## When to Suggest Hopping

- User is stuck on a problem after multiple attempts
- User asks for a different approach or fresh perspective
- Task involves a domain where another model excels (e.g., Codex for boilerplate, Gemini for large-context analysis)
- User explicitly asks to try another AI

## Full Command Reference

See [references/commands.md](references/commands.md) for all CLI options, auth token locations, and advanced usage.

Overview

This skill lets you switch AI models without leaving your terminal using the HokiPoki CLI. Route tasks to remote Claude, Codex, or Gemini instances for second opinions, alternative approaches, or specialized strengths while keeping API keys local and encrypted. It supports provider/listener mode to share access with teammates securely.

How this skill works

HokiPoki routes requests over a P2P network so that only encrypted tasks and results flow between machines; API keys remain on the provider machine. Use the hokipoki CLI to request work (--tool claude|codex|gemini), send individual files or whole projects, and receive parseable JSON output. Providers register and listen to execute incoming tasks in isolated Docker containers; on the requesting side, returned patches auto-apply when the target directory is a git repo with committed files.

When to use it

  • You’re stuck after multiple attempts and want a fresh model perspective
  • You need a model better suited to the task (e.g., Codex for code generation, Gemini for large-context review)
  • You want a second opinion or verification from another AI
  • You need to share AI access with teammates securely via provider mode
  • You want to route tasks without exposing API keys to remote users

Best practices

  • Always include --json for parseable, reliable output
  • Specify files or directories to limit scope and speed up responses
  • If you want deterministic routing, include --tool; otherwise let HokiPoki pick
  • Ensure Docker is running before registering as a provider or starting listener mode
  • Make sure the target directory is a git repo with at least one commit before relying on auto-apply; initialize and commit if needed

Example use cases

  • Ask Codex to add error handling to a service directory and return a patch (--dir src/services --tool codex --json)
  • Route a security review to Gemini for a full-project scan (--all --tool gemini --json)
  • Send a focused bug fix to Claude for a single file (--files src/auth.ts --tool claude --json)
  • Register as a provider to let teammates call your paid models while keeping tokens local (hokipoki register --as-provider)
  • Listen for incoming requests on specific tools to accept tasks in isolated Docker containers (hokipoki listen --tools codex gemini)

FAQ

Do API keys get shared when I act as a provider?

No. API keys stay on the provider machine. Only encrypted requests and results are exchanged over the network.

What if patch auto-apply fails?

Auto-apply requires a git repo with at least one commit. If it fails, run git init && git add . && git commit -m "initial" and retry. To skip applying entirely and just save the patch, pass --no-auto-apply on the request.