This skill wires a UI to a data source using repo tooling with typed inputs/outputs and robust loading, error, and caching behavior.
`npx playbooks add skill velcrafting/codex-skills --skill data-fetching-integration`

Review the files below or copy the command above to add this skill to your agents.
---
name: data-fetching-integration
description: Wire UI to a data source with typed inputs/outputs, loading/error/empty states, and caching rules.
metadata:
  short-description: Data hookup + UI states
  layer: frontend
  mode: write
  idempotent: false
---
# Skill: frontend/data-fetching-integration
## Purpose
Connect UI to an API/data source using the repo’s data tooling, ensuring:
- typed inputs/outputs
- correct UI state handling
- explicit caching and invalidation behavior
---
## Inputs
- Data source definition (typed sketch after this list):
  - endpoint contract (method/path/request/response) OR
  - client function signature OR
  - data description (if the contract is not formalized yet)
- UI entry point(s) that need the data
- Expected behavior:
  - read vs write
  - polling or realtime needs
  - optimistic updates allowed (yes/no)
- Repo profile (preferred): `<repo>/REPO_PROFILE.json`
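
For illustration, a minimal TypeScript sketch of what a formalized endpoint contract plus client function signature might look like. The endpoint, field names, and `listTasks` helper are hypothetical; the real contract comes from the repo or the backend team.

```ts
// Hypothetical contract: endpoint, fields, and helper names are illustrative only.
export interface ListTasksRequest {
  projectId: string;
  status?: "open" | "done";
}

export interface Task {
  id: string;
  title: string;
  status: "open" | "done";
  updatedAt: string; // ISO-8601 timestamp
}

export interface ListTasksResponse {
  items: Task[];
  nextCursor: string | null;
}

// Client function signature the UI layer codes against.
export async function listTasks(req: ListTasksRequest): Promise<ListTasksResponse> {
  const params = new URLSearchParams({ projectId: req.projectId });
  if (req.status) params.set("status", req.status);
  const res = await fetch(`/api/tasks?${params.toString()}`);
  if (!res.ok) throw new Error(`listTasks failed: ${res.status}`);
  return (await res.json()) as ListTasksResponse;
}
```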
---
## Outputs
- Data call wired via repo tooling (React Query/SWR/custom)
- A hook / loader / data module consistent with repo patterns
- UI state coverage:
  - loading
  - error
  - empty
  - success
- Explicit cache key strategy and invalidation/refetch rules (code comments or helper constants; see the sketch after this list)
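
A sketch of what the wired output might look like, assuming React Query (TanStack Query v5) and the hypothetical `listTasks` client above; the actual hook shape, key helper, and stale time should follow repo conventions.

```ts
import { useQuery, useMutation, useQueryClient } from "@tanstack/react-query";
import { listTasks, type ListTasksRequest } from "./tasksClient"; // hypothetical client module

// Cache key helper: stable, derived only from request inputs.
export const taskKeys = {
  all: ["tasks"] as const,
  list: (req: ListTasksRequest) =>
    ["tasks", "list", req.projectId, req.status ?? "all"] as const,
};

export function useTasks(req: ListTasksRequest) {
  return useQuery({
    queryKey: taskKeys.list(req),
    queryFn: () => listTasks(req),
    staleTime: 30_000, // assumption: tune to the repo's caching policy
  });
}

// Mutations invalidate only the queries that depend on the changed data.
export function useCompleteTask(projectId: string) {
  const queryClient = useQueryClient();
  return useMutation({
    mutationFn: async (taskId: string) => {
      const res = await fetch(`/api/tasks/${taskId}/complete`, { method: "POST" });
      if (!res.ok) throw new Error(`completeTask failed: ${res.status}`);
    },
    onSuccess: () => {
      // Targeted invalidation: task lists for this project only, not the whole cache.
      queryClient.invalidateQueries({ queryKey: ["tasks", "list", projectId] });
    },
  });
}
```

Keeping the key factory next to the hook keeps cache keys greppable and invalidation targeted rather than global.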
---
## Non-goals
- Changing backend behavior or contracts
- Adding business rules client-side beyond presentation and basic form validation
- Building a new data layer abstraction unless explicitly requested
---
## Workflow
1) Identify the repo’s data fetching pattern (prefer `REPO_PROFILE.json`).
2) Implement the data call using existing client conventions.
3) Define a stable cache key derived from request inputs.
4) Add invalidation rules:
   - mutations invalidate queries that depend on changed data
   - avoid global invalidation unless required
5) Implement UI states (component sketch after this list):
   - loading: skeleton/spinner (repo standard)
   - error: recoverable messaging + retry path if appropriate
   - empty: explicit “no data” state
   - success: render
6) Run required validations (typecheck/lint/tests per profile).
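
A component-level sketch of step 5, assuming the React Query hook above; `TaskListSkeleton`, `ErrorPanel`, and `EmptyState` are stand-ins for whatever the repo's standard components are.

```tsx
import { useTasks } from "./useTasks"; // hypothetical hook from the sketch above
import { TaskListSkeleton, ErrorPanel, EmptyState } from "./ui"; // stand-ins for repo components

export function TaskList({ projectId }: { projectId: string }) {
  const { data, isPending, isError, error, refetch } = useTasks({ projectId });

  // loading: repo-standard skeleton
  if (isPending) return <TaskListSkeleton />;

  // error: recoverable messaging plus a retry path
  if (isError) {
    return (
      <ErrorPanel
        message={`Could not load tasks: ${error.message}`}
        onRetry={() => refetch()}
      />
    );
  }

  // empty: explicit "no data" state
  if (data.items.length === 0) return <EmptyState label="No tasks yet" />;

  // success: render
  return (
    <ul>
      {data.items.map((task) => (
        <li key={task.id}>{task.title}</li>
      ))}
    </ul>
  );
}
```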
---
## Checks
- Typecheck passes (if configured)
- UI handles loading/error/empty/success states deterministically
- Cache key is stable and unique
- Invalidation/refetch behavior is explicit
- No silent swallowing of errors
---
## Failure modes
- Data tooling unclear → ask which library is used (React Query/SWR/custom) and what the default caching expectations are.
- Stale cache → fix keys and invalidation rules; avoid ad-hoc refetch loops.
- Race conditions → add latest-only guards, cancellation, or stable request identity (sketched after this list).
- Contract ambiguity → recommend `api/contract-update` or `shared/schema-types` before hard wiring.
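
One way to implement a latest-only/cancellation guard for a custom data layer; React Query exposes an `AbortSignal` to the query function for the same purpose, so this is mainly relevant outside such libraries. The `createLatestOnlyFetcher` helper and search endpoint are illustrative.

```ts
// Sketch of a "latest request wins" guard using AbortController.
export function createLatestOnlyFetcher<TArgs, TResult>(
  fetcher: (args: TArgs, signal: AbortSignal) => Promise<TResult>
) {
  let controller: AbortController | null = null;

  return (args: TArgs): Promise<TResult> => {
    controller?.abort(); // cancel any in-flight request
    controller = new AbortController();
    return fetcher(args, controller.signal);
  };
}

// Usage with a hypothetical search endpoint:
const searchTasks = createLatestOnlyFetcher((query: string, signal) =>
  fetch(`/api/tasks/search?q=${encodeURIComponent(query)}`, { signal }).then((res) => {
    if (!res.ok) throw new Error(`search failed: ${res.status}`);
    return res.json() as Promise<unknown>;
  })
);
```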
---
## Telemetry
Log (example payload shape after this list):
- skill: `frontend/data-fetching-integration`
- data_tooling: `react-query | swr | custom | unknown`
- cache_keys_added_or_changed
- files_touched
- outcome: `success | partial | blocked`
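
A possible TypeScript shape for that log entry; the exact field types (for example, whether cache keys are counted or listed) are assumptions to adapt to the repo's telemetry conventions.

```ts
interface DataFetchingTelemetry {
  skill: "frontend/data-fetching-integration";
  data_tooling: "react-query" | "swr" | "custom" | "unknown";
  cache_keys_added_or_changed: string[];
  files_touched: string[];
  outcome: "success" | "partial" | "blocked";
}

const exampleEntry: DataFetchingTelemetry = {
  skill: "frontend/data-fetching-integration",
  data_tooling: "react-query",
  cache_keys_added_or_changed: ['["tasks","list",projectId]'],
  files_touched: ["src/features/tasks/useTasks.ts", "src/features/tasks/TaskList.tsx"],
  outcome: "success",
};
```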
This skill wires a UI to a data source using the repository’s existing data tooling, producing typed inputs/outputs, deterministic cache keys, and explicit invalidation rules. It ensures the UI covers loading, error, empty, and success states and includes telemetry and validations required by the repo profile. The focus is predictable client-side behavior without changing backend contracts.
I inspect the repo profile (REPO_PROFILE.json) or existing client code to determine the data-fetching library and conventions. I implement a hook/loader or data module that calls the API client, emits typed responses, and defines a stable cache key and invalidation rules. The implementation includes deterministic UI states (loading, error, empty, success), typechecks, and comments/constants documenting caching strategy and refetch triggers.
What if the repo data tooling is unclear?
Ask which library to use (React Query, SWR, or custom). Default to the pattern seen in the repo profile or follow the most prevalent client implementation; document the choice in the PR.
How do you prevent stale caches or race conditions?
Use stable cache keys, explicit invalidation rules, and latest-only guards or request cancellation. Avoid ad-hoc refetch loops and prefer targeted invalidation on related mutations.
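
For repos on SWR rather than React Query, the same idea (stable keys derived from request inputs plus targeted revalidation) might look like the following sketch; the key shape, endpoints, and `listTasks` client are hypothetical.

```ts
import useSWR, { useSWRConfig } from "swr";
import { listTasks } from "./tasksClient"; // hypothetical client from the earlier sketch

// Stable array key derived only from request inputs (SWR 2.x array keys).
const tasksKey = (projectId: string) => ["tasks", projectId] as const;

export function useTasksSWR(projectId: string) {
  return useSWR(tasksKey(projectId), ([, id]) => listTasks({ projectId: id }));
}

export function useCompleteTaskSWR(projectId: string) {
  const { mutate } = useSWRConfig();
  return async (taskId: string) => {
    const res = await fetch(`/api/tasks/${taskId}/complete`, { method: "POST" });
    if (!res.ok) throw new Error(`completeTask failed: ${res.status}`);
    // Targeted revalidation: only the task list this mutation affects, not the whole cache.
    await mutate(tasksKey(projectId));
  };
}
```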