
moneta-reconcile skill


This skill verifies Moneta accounting integrity by reconciling parsed totals against source docs, matching lots to holdings, and detecting duplicates and gaps.

npx playbooks add skill phrazzld/claude-config --skill moneta-reconcile


SKILL.md
---
name: moneta-reconcile
description: |
  Verify accounting integrity. Compare totals to source docs, check lots vs holdings, detect duplicates, report gaps.
user-invocable: true
effort: high
---

# /moneta-reconcile

Verify Moneta accounting integrity.

## Steps

1. Load source docs from `source/` and parsed outputs from `normalized/`.
2. Compare per-source transaction counts and totals to originals.
3. Reconcile lots to holdings: sum lots per asset vs `normalized/cost-basis.json` and `normalized/cost-basis-updated.json`.
4. Detect duplicate transactions by `id`, date+amount+source, and cross-file overlaps.
5. Report discrepancies with file path, record id, and delta.
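
The lot reconciliation in step 3 can be sketched as below. The `Lot` and `Holdings` shapes are illustrative assumptions, not the actual types in `scripts/schema.ts`:

```typescript
// Hypothetical record shapes; the real schema lives in scripts/schema.ts.
interface Lot {
  asset: string;
  quantity: number;
}

type Holdings = Record<string, number>; // asset -> total quantity held

// Sum lot quantities per asset and report any asset whose lot sum
// differs from the holdings total. Returns a map of asset -> delta.
function reconcileLots(lots: Lot[], holdings: Holdings): Record<string, number> {
  const sums: Record<string, number> = {};
  for (const lot of lots) {
    sums[lot.asset] = (sums[lot.asset] ?? 0) + lot.quantity;
  }
  const deltas: Record<string, number> = {};
  for (const asset of new Set([...Object.keys(sums), ...Object.keys(holdings)])) {
    const delta = (sums[asset] ?? 0) - (holdings[asset] ?? 0);
    // Tolerance guards against floating-point noise in quantity sums.
    if (Math.abs(delta) > 1e-9) deltas[asset] = delta;
  }
  return deltas;
}
```

An empty result means lots and holdings agree; a non-zero delta pinpoints the asset and the size of the mismatch for step 5's report.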

## Examples

```bash
# Refresh normalized data before reconciling
pnpm parse:all
```

```bash
# Rebuild gains before lot checks
pnpm gains
```

## References

- `source/`
- `normalized/transactions.json`
- `normalized/bofa-transactions.json`
- `normalized/river-transactions.json`
- `normalized/strike-transactions.json`
- `normalized/cashapp-transactions.json`
- `normalized/robinhood-transactions.json`
- `normalized/cost-basis.json`
- `normalized/cost-basis-updated.json`
- `normalized/river-lots.json`
- `normalized/strike-lots.json`
- `normalized/robinhood-lots.json`
- `scripts/parse-all.ts`
- `scripts/schema.ts`

Overview

This skill verifies accounting integrity for Moneta by comparing parsed financial data to original source documents and internal summaries. It flags mismatches in transaction counts, total amounts, lot-to-holding inconsistencies, duplicates, and missing records. The output lists file paths, record IDs, and numeric deltas to make remediation straightforward.

How this skill works

The skill loads original files from the source directory and the parsed/normalized outputs. It compares per-source transaction counts and totals to the originals, sums lots per asset and reconciles them to cost-basis files, and scans for duplicate transactions by id, by date+amount+source, and across normalized files. Discrepancies are reported with clear context: which file, which record, and how much the values differ.
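
The per-source comparison described above can be sketched as a simple aggregation. The `Txn` shape and field names are assumptions for illustration:

```typescript
// Hypothetical transaction shape; real fields come from the normalized files.
interface Txn {
  source: string;
  amount: number;
}

// Aggregate transaction count and total amount per source, so each
// source's figures can be checked against its original document.
function totalsBySource(txns: Txn[]): Map<string, { count: number; total: number }> {
  const m = new Map<string, { count: number; total: number }>();
  for (const t of txns) {
    const e = m.get(t.source) ?? { count: 0, total: 0 };
    e.count += 1;
    e.total += t.amount;
    m.set(t.source, e);
  }
  return m;
}
```

Comparing each entry against the counts and totals taken directly from `source/` surfaces any parser drift before lot checks run.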

When to use it

  • After running parsers to ensure normalized data matches original source files
  • Before preparing tax or gain reports to validate cost-basis and lot integrity
  • When investigating unexpected account balances or missing transactions
  • As part of regular integrity checks in CI/CD or on a reconciliation schedule
  • Prior to merging or publishing financial snapshots

Best practices

  • Refresh normalized data (parse all) before running reconciliation to avoid false positives
  • Rebuild gains and update cost-basis before lot reconciliation to reflect latest allocations
  • Run duplicate detection across all normalized files to catch cross-file overlaps
  • Fix discrepancies at the source file level and re-run checks rather than patching outputs
  • Keep a changelog of reconciliations and the actions taken for auditability

Example use cases

  • Validate that bank and broker transaction totals match original statement exports
  • Detect duplicated imports that create double-counted balances
  • Reconcile lots from trading platforms to cost-basis summaries to ensure holdings match
  • Locate gaps where source records are missing from normalized outputs
  • Confirm that rebuilds of gains did not introduce lot or basis mismatches

FAQ

What inputs does the reconciliation require?

It requires the original files in `source/` plus the normalized outputs, including the transaction and cost-basis files.

How are duplicates detected?

Duplicates are detected by matching transaction id, by matching date+amount+source signatures, and by scanning for cross-file overlaps among normalized transaction files.
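
The date+amount+source signature check can be sketched as grouping by a composite key. The `Txn` shape is an assumption for illustration:

```typescript
// Hypothetical transaction shape; real fields come from the normalized files.
interface Txn {
  id: string;
  date: string;
  amount: number;
  source: string;
}

// Group transactions by a date|amount|source signature and return only
// the groups with more than one member -- the duplicate candidates.
function findDuplicates(txns: Txn[]): Txn[][] {
  const groups = new Map<string, Txn[]>();
  for (const t of txns) {
    const key = `${t.date}|${t.amount}|${t.source}`;
    const g = groups.get(key) ?? [];
    g.push(t);
    groups.set(key, g);
  }
  return [...groups.values()].filter((g) => g.length > 1);
}
```

Running the same grouping over the concatenation of all normalized transaction files is what catches cross-file overlaps, since two imports of the same statement produce identical signatures even when their `id`s differ.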