```bash
npx playbooks add skill openclaw/skills --skill model-usage
```
---
name: model-usage
description: Use the CodexBar CLI's local cost data to summarize per-model usage for Codex or Claude, including the current (most recent) model or a full model breakdown. Trigger when asked for model-level usage/cost data from codexbar, or when you need a scriptable per-model summary from codexbar cost JSON.
metadata: {"clawdbot":{"emoji":"📊","os":["darwin"],"requires":{"bins":["codexbar"]},"install":[{"id":"brew-cask","kind":"brew","cask":"steipete/tap/codexbar","bins":["codexbar"],"label":"Install CodexBar (brew cask)"}]}}
---
# Model usage
## Overview
Get per-model usage costs from CodexBar's local cost logs. Supports a "current model" view (most recent daily entry) or an "all models" summary for Codex or Claude.
TODO: add Linux CLI support guidance once CodexBar CLI install path is documented for Linux.
## Quick start
1) Fetch cost JSON via CodexBar CLI or pass a JSON file.
2) Use the bundled script to summarize by model.
```bash
python {baseDir}/scripts/model_usage.py --provider codex --mode current
python {baseDir}/scripts/model_usage.py --provider codex --mode all
python {baseDir}/scripts/model_usage.py --provider claude --mode all --format json --pretty
```
## Current model logic
- Uses the most recent daily row with `modelBreakdowns`.
- Picks the model with the highest cost in that row.
- Falls back to the last entry in `modelsUsed` when breakdowns are missing.
- Override with `--model <name>` when you need a specific model.
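The selection steps above can be sketched in Python. The JSON field layout here (daily rows carrying `modelBreakdowns` as a list of `{"modelName", "cost"}` objects, plus a `modelsUsed` list of names) is an assumption for illustration; see `references/codexbar-cli.md` for the actual fields.

```python
def pick_current_model(daily_rows):
    """Pick the "current" model from a list of daily cost rows.

    Assumed row shape (not authoritative):
      - "modelBreakdowns": list of {"modelName": str, "cost": float}
      - "modelsUsed": list of model-name strings
    Returns (model_name, cost) where cost is None for the fallback path.
    """
    # Scan from the most recent row backwards for one with breakdowns,
    # then take the highest-cost model in that row.
    for row in reversed(daily_rows):
        breakdowns = row.get("modelBreakdowns") or []
        if breakdowns:
            top = max(breakdowns, key=lambda b: b.get("cost", 0.0))
            return top["modelName"], top.get("cost", 0.0)
    # Fallback: last entry of modelsUsed in the most recent row that has one.
    for row in reversed(daily_rows):
        models = row.get("modelsUsed") or []
        if models:
            return models[-1], None
    return None, None
```

A `--model <name>` override would simply bypass this function.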
## Inputs
- Default: runs `codexbar cost --format json --provider <codex|claude>`.
- File or stdin:
```bash
codexbar cost --provider codex --format json > /tmp/cost.json
python {baseDir}/scripts/model_usage.py --input /tmp/cost.json --mode all
cat /tmp/cost.json | python {baseDir}/scripts/model_usage.py --input - --mode current
```
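The three input paths above (CLI, file, stdin) can be dispatched with a small loader. This is a sketch, not the bundled script's actual code; the `codexbar` flags match the commands shown above, and the `input_arg` parameter name is hypothetical.

```python
import json
import subprocess
import sys

def load_cost_json(input_arg=None, provider="codex"):
    """Load CodexBar cost JSON from the CLI (default), a file, or stdin.

    input_arg: None -> run the CLI; "-" -> read stdin; else -> file path.
    """
    if input_arg is None:
        # Default path: invoke the CLI and capture its JSON output.
        out = subprocess.run(
            ["codexbar", "cost", "--format", "json", "--provider", provider],
            capture_output=True, text=True, check=True,
        ).stdout
    elif input_arg == "-":
        out = sys.stdin.read()       # JSON piped on stdin
    else:
        with open(input_arg) as fh:  # JSON file passed via --input
            out = fh.read()
    return json.loads(out)
```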
## Output
- Text (default) or JSON (`--format json --pretty`).
- Values are cost-only per model; tokens are not split by model in CodexBar output.
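As a sketch of the all-models summary, the snippet below aggregates cost per model across daily rows and renders either output format. Field names (`modelBreakdowns`, `modelName`, `cost`) and the exact text layout are assumptions for illustration.

```python
import json
from collections import defaultdict

def summarize_all_models(daily_rows):
    """Sum cost per model across all daily rows (assumed field names)."""
    totals = defaultdict(float)
    for row in daily_rows:
        for b in row.get("modelBreakdowns") or []:
            totals[b["modelName"]] += b.get("cost", 0.0)
    return dict(totals)

def render(totals, fmt="text", pretty=False):
    """Emit the summary as text (default) or JSON."""
    if fmt == "json":
        return json.dumps(totals, indent=2 if pretty else None, sort_keys=True)
    # Text: one "model  $cost" line per model, highest cost first.
    lines = [f"{model}  ${cost:.2f}"
             for model, cost in sorted(totals.items(), key=lambda kv: -kv[1])]
    return "\n".join(lines)
```

Note that only costs are aggregated; tokens cannot be split per model here because CodexBar does not break them down.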
## References
- Read `references/codexbar-cli.md` for CLI flags and cost JSON fields.