
This skill performs static code analysis, assesses technical debt, and measures engineering velocity to surface code health issues and guide prioritization.

npx playbooks add skill a5c-ai/babysitter --skill code-quality-analyzer

Review the files below or copy the command above to add this skill to your agents.

Files (1)
SKILL.md
3.0 KB
---
name: code-quality-analyzer
description: Static code analysis, technical debt assessment, engineering velocity metrics
allowed-tools:
  - Read
  - Write
  - Glob
  - Grep
  - Bash
  - WebFetch
metadata:
  specialization: venture-capital
  domain: business
  skill-id: vc-skill-014
---

# Code Quality Analyzer

## Overview

The Code Quality Analyzer skill provides detailed code-level analysis for technical due diligence. It performs static code analysis, assesses technical debt, and evaluates engineering team velocity to understand code health and development productivity.

## Capabilities

### Static Code Analysis
- Run automated code quality checks
- Identify code smells and anti-patterns
- Measure code complexity metrics
- Detect potential bugs and vulnerabilities

### Technical Debt Assessment
- Quantify technical debt backlog
- Identify high-priority refactoring needs
- Assess test coverage and quality
- Evaluate documentation completeness

### Engineering Velocity Metrics
- Measure deployment frequency
- Track lead time for changes
- Analyze cycle time and throughput
- Assess sprint velocity trends
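Deployment frequency and lead time can be computed from timestamps alone. A minimal sketch, using hypothetical sample data rather than a live git history:

```python
from datetime import datetime, timedelta
from statistics import median

def velocity_metrics(deploys, lead_times_hours):
    """deploys: sorted deployment datetimes; lead_times_hours: commit-to-deploy hours."""
    span_weeks = max((deploys[-1] - deploys[0]).days / 7, 1)
    return {
        "deploys_per_week": round(len(deploys) / span_weeks, 2),
        "median_lead_time_h": median(lead_times_hours),
    }

# Five deployments over a two-week window.
deploys = [datetime(2024, 1, 1) + timedelta(days=d) for d in (0, 3, 7, 10, 14)]
print(velocity_metrics(deploys, [12, 20, 8, 30, 16]))
```

Using the median rather than the mean for lead time keeps one slow release from distorting the trend.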

### Code Health Indicators
- Analyze code churn patterns
- Review pull request metrics
- Assess code review practices
- Evaluate dependency management
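Churn can be derived directly from `git log --numstat` output, where each file line has the form `<added>\t<removed>\t<path>` and `-` marks binary files. A sketch that aggregates added plus removed lines per file and ranks hotspots:

```python
from collections import defaultdict

def churn_from_numstat(numstat_lines):
    """Aggregate added+removed line counts per file from `git log --numstat` lines."""
    churn = defaultdict(int)
    for line in numstat_lines:
        parts = line.split("\t")
        if len(parts) != 3 or "-" in parts[:2]:
            continue  # skip malformed lines and binary-file entries
        added, removed, path = parts
        churn[path] += int(added) + int(removed)
    return sorted(churn.items(), key=lambda kv: kv[1], reverse=True)

log = ["10\t2\tsrc/core.py", "1\t1\tREADME.md",
       "30\t12\tsrc/core.py", "-\t-\tlogo.png"]
print(churn_from_numstat(log))
```

Files that are both high-churn and high-complexity are the usual first targets for review attention.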

## Usage

### Analyze Code Quality
```
Input: Repository access, analysis parameters
Process: Run static analysis, aggregate metrics
Output: Code quality report, issue summary
```

### Assess Technical Debt
```
Input: Codebase access, debt categorization
Process: Inventory debt, estimate remediation
Output: Technical debt assessment, prioritization
```

### Measure Engineering Velocity
```
Input: Git history, project management data
Process: Calculate velocity metrics
Output: Velocity report, trend analysis
```

### Review Code Health
```
Input: Repository data, team practices
Process: Analyze patterns, compare benchmarks
Output: Code health scorecard, recommendations
```
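The four usage modes above feed a single deliverable. A sketch of how their outputs might be aggregated; the `CodeHealthReport` type and its field names are illustrative, not part of the skill's actual API:

```python
from dataclasses import dataclass, field

@dataclass
class CodeHealthReport:
    """Aggregates the four analysis outputs into one scorecard (names illustrative)."""
    quality_issues: list = field(default_factory=list)
    debt_hours: float = 0.0
    deploys_per_week: float = 0.0
    recommendations: list = field(default_factory=list)

    def summary(self) -> str:
        return (f"{len(self.quality_issues)} issues, "
                f"~{self.debt_hours:.0f}h debt, "
                f"{self.deploys_per_week} deploys/week")

report = CodeHealthReport(
    quality_issues=["high complexity: parse()", "unused import: os"],
    debt_hours=8,
    deploys_per_week=2.5,
    recommendations=["refactor parse()", "raise coverage on db.py"],
)
print(report.summary())
```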

## Key Metrics

| Metric | Description | Target Range |
|--------|-------------|--------------|
| Test Coverage | % of code covered by tests | 70-90% |
| Code Complexity | Cyclomatic complexity average | < 10 |
| Tech Debt Ratio | Debt remediation time / dev time | < 5% |
| Deployment Frequency | Deployments per week | Daily to weekly |
| Change Failure Rate | % of deployments causing issues | < 15% |

## Integration Points

- **Technical Due Diligence**: Detailed code analysis for DD
- **Tech Stack Scanner**: Complement architecture review
- **Technical Assessor (Agent)**: Support agent analysis
- **IP Patent Analyzer**: Code-level IP assessment

## Analysis Tools Integration

- SonarQube for code quality
- CodeClimate for maintainability
- GitHub/GitLab analytics
- Jira/Linear for velocity data
- Custom scripts for specific checks
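For SonarQube, quality metrics can be pulled from its Web API `measures/component` endpoint. The sketch below only builds the request URL; authentication and the HTTP call itself are omitted, and the host and project key are placeholders:

```python
from urllib.parse import urlencode

def sonarqube_measures_url(base_url: str, project_key: str, metrics: list[str]) -> str:
    """Build a SonarQube 'measures/component' API URL (see SonarQube Web API docs)."""
    query = urlencode({"component": project_key, "metricKeys": ",".join(metrics)})
    return f"{base_url.rstrip('/')}/api/measures/component?{query}"

url = sonarqube_measures_url("https://sonar.example.com", "my-project",
                             ["coverage", "complexity", "code_smells"])
print(url)
```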

## Best Practices

1. Calibrate expectations by company stage
2. Focus on trends over absolute numbers
3. Consider context of rapid iteration
4. Balance debt against velocity needs
5. Assess relative to team size and resources

## Overview

This skill performs static code analysis, technical debt assessment, and engineering velocity measurement to deliver an actionable view of code health and team productivity. It combines automated linting and complexity metrics with repository and project-management signals to prioritize remediation and guide technical due diligence. The outputs are reports, scorecards, and ranked action lists tailored to company stage and risk tolerance.

## How this skill works

The analyzer ingests repository access, CI artifacts, and project management data to run static analyzers, dependency checks, and custom heuristics. It aggregates metrics such as cyclomatic complexity, test coverage, code churn, PR cadence, and deployment frequency, then normalizes them against configurable benchmarks. Results include a technical debt inventory, remediation estimates, and velocity trend analysis with recommended next steps.

## When to use it

- During technical due diligence for acquisitions or investments
- Before major refactors or platform migrations, to quantify risk
- For regular engineering health checks that monitor trends and regressions
- When prioritizing bug fixes versus feature work based on debt impact
- To benchmark teams and quantify improvements from process changes

## Best practices

- Calibrate metric targets to company stage rather than using absolute values
- Emphasize trend analysis over single-point scores to catch regressions early
- Combine automated findings with developer context to avoid false positives
- Prioritize high-impact, low-effort remediation items first
- Use velocity metrics alongside qualitative feedback to interpret productivity

## Example use cases

- Generate a pre-acquisition code quality dossier highlighting security hotspots and debt estimates
- Produce a per-sprint engineering dashboard showing lead time for changes and PR throughput
- Run a maintainability scan to identify modules with high complexity and low test coverage
- Estimate the effort to remediate critical technical debt and provide a prioritized backlog
- Compare dependency risk and update needs across multiple repositories

## FAQ

**What inputs are required to run a full analysis?**

Repository access (git), CI logs or coverage reports when available, and project-management data such as issue or ticket histories for velocity calculations.

**Which tools does it integrate with?**

Common integrations include SonarQube, CodeClimate, GitHub/GitLab analytics, and Jira/Linear, plus custom scripts for bespoke checks.