
This skill analyzes files and retrieves metadata such as size, line counts, modification times, and content statistics to inform your decisions.

npx playbooks add skill mhattingpete/claude-skills-marketplace --skill file-operations

Review the files below or copy the command above to add this skill to your agents.

Files (1)
SKILL.md
2.5 KB
---
name: file-operations
description: Analyze files and get detailed metadata including size, line counts, modification times, and content statistics. Use when users request file information, statistics, or analysis without modifying files.
---

# File Operations

Analyze files and retrieve metadata using Claude's native tools without modifying files.

## When to Use

- "analyze [file]"
- "get file info for [file]"
- "how many lines in [file]"
- "compare [file1] and [file2]"
- "file statistics"

## Core Operations

### File Size & Metadata
```bash
stat -f "%z bytes, modified %Sm" [file_path]   # Single file (BSD/macOS stat)
stat -c "%s bytes, modified %y" [file_path]    # Single file (GNU/Linux stat)
ls -lh [directory]                             # Multiple files
du -h [file_path]                              # Human-readable size
```

### Line Counts
```bash
wc -l [file_path]                              # Single file
wc -l [file1] [file2]                          # Multiple files
find [dir] -name "*.py" -print0 | xargs -0 wc -l   # Directory total (safe with spaces)
```

### Content Analysis
Use **Read** to analyze structure, then count functions/classes/imports.
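
For a shell-only alternative, a one-pass awk summary works too (the path `src/app.py` below is only a placeholder):

```bash
# One-pass structural summary of a Python file; src/app.py is a placeholder path
awk '/^def /            {funcs++}
     /^class /          {classes++}
     /^(import|from) /  {imports++}
     END {printf "functions=%d classes=%d imports=%d\n", funcs, classes, imports}' src/app.py
```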

### Pattern Search
```
Grep(pattern="^def ", output_mode="count", path="src/")        # Count functions
Grep(pattern="TODO|FIXME", output_mode="content", -n=true)    # Find TODOs
Grep(pattern="^import ", output_mode="count")                 # Count imports
```

### Find Files
```
Glob(pattern="**/*.py")
```

## Workflow Examples

### Comprehensive File Analysis
1. Get size/mod time: `stat -f "%z bytes, modified %Sm" file.py`
2. Count lines: `wc -l file.py`
3. Read file: `Read(file_path="file.py")`
4. Count functions: `Grep(pattern="^def ", output_mode="count", path="file.py")`
5. Count classes: `Grep(pattern="^class ", output_mode="count", path="file.py")`
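
A minimal bash sketch tying these steps together, assuming BSD/macOS `stat` (use `stat -c "%s bytes, modified %y"` on GNU/Linux) and a hypothetical target `file.py`:

```bash
#!/usr/bin/env bash
set -euo pipefail
FILE="file.py"                             # hypothetical target path

stat -f "%z bytes, modified %Sm" "$FILE"   # size and modification time (BSD/macOS)
wc -l "$FILE"                              # total lines
grep -cE '^def '   "$FILE" || true         # function count (grep exits 1 on zero matches)
grep -cE '^class ' "$FILE" || true         # class count
```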

### Compare File Sizes
1. Find files: `Glob(pattern="src/**/*.py")`
2. Get sizes: `ls -lh src/**/*.py` (bash needs `shopt -s globstar` for `**`)
3. Total size: `du -ch src/**/*.py | tail -1`
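
The same comparison as a standalone shell sketch (assumes bash with `globstar` enabled so `**` recurses):

```bash
shopt -s globstar
du -h src/**/*.py | sort -rh     # per-file sizes, largest first
du -ch src/**/*.py | tail -1     # grand total
```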

### Code Quality Metrics
1. Total lines: `find . -name "*.py" -print0 | xargs -0 wc -l`
2. Test files: `find . -name "test_*.py" | wc -l`
3. TODOs: `Grep(pattern="TODO|FIXME|HACK", output_mode="count")`
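
Sketched as a single read-only script (exclusion paths are illustrative; adjust per project):

```bash
#!/usr/bin/env bash
# Read-only code-quality metrics; vendor exclusions are illustrative
echo "Total Python lines:"
find . -name "*.py" -not -path "./node_modules/*" -not -path "./.venv/*" -print0 | xargs -0 cat | wc -l

echo "Test files:"
find . -name "test_*.py" -not -path "./node_modules/*" | wc -l

echo "TODO/FIXME/HACK lines:"
grep -rE "TODO|FIXME|HACK" --include="*.py" . | wc -l
```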

### Find Largest Files
```bash
find . -type f -not -path "./node_modules/*" -exec du -h {} + | sort -rh | head -20
```

## Best Practices

- **Non-destructive**: Use Read/stat/wc, never modify
- **Efficient**: Read small files fully, use Grep for large files
- **Context-aware**: Compare to project averages, suggest optimizations

## Integration

Works with:
- **code-auditor**: Comprehensive analysis
- **code-transfer**: After identifying large files
- **codebase-documenter**: Understanding file purposes

Overview

This skill analyzes files and returns detailed metadata and content statistics without modifying files. It extracts size, modification times, line counts, and pattern-based content metrics. Use it to get quick, non-destructive insights about individual files or entire directories.

How this skill works

The skill inspects file system metadata (size, modification timestamp) and computes line counts with lightweight scanning. For content analysis it reads files and runs pattern searches to count functions, imports, TODOs, or other regex matches. It favors efficient operations (stat, wc, grep-style scans, and globbing) and avoids any write operations.

When to use it

  • Request file info or a summary for one or more files
  • Count lines, functions, classes, imports, or TODOs in source files
  • Compare file sizes or modification times across a project
  • Find large files or unusual files that impact build or deploy time
  • Audit codebase metrics without changing code or repository state

Best practices

  • Always run non-destructive commands (stat, wc, read, grep) to avoid accidental changes
  • Read small files fully for deeper analysis; use pattern searches for large files
  • Limit scope with glob patterns or directory filters to improve performance
  • Compare metrics against project averages to identify outliers
  • Exclude common vendor directories (node_modules, .venv) when searching for largest files

Example use cases

  • Get size and last-modified time for a specific source file before a code review
  • Count total lines and number of test files to estimate test coverage effort
  • Find all TODO/FIXME occurrences to prioritize technical-debt work
  • Locate the top 20 largest files to reduce repository bloat or speed up CI
  • Compare two file versions by line counts and key-pattern frequencies (functions/imports)

FAQ

Will this skill modify my files?

No. All operations are read-only: metadata queries, line counts, pattern searches, and file reads. No write or delete actions are performed.

How does it scale for large repositories?

Use scoped glob patterns and directory filters to narrow searches. For large files, prefer grep-style pattern counts rather than reading entire files to save time and memory.
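
For example, scoping a TODO count to a single package directory (the path here is illustrative) keeps the search fast:

  Grep(pattern="TODO|FIXME", output_mode="count", path="src/core/")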