
peer-reviewer skill


This skill helps you conduct thorough peer reviews of academic and technical documents, extracting insights and synthesizing complex information efficiently.

npx playbooks add skill eddiebe147/claude-settings --skill peer-reviewer

Copy the command above to add this skill to your agents.

SKILL.md
---
name: Peer Reviewer
slug: peer-reviewer
description: Conduct thorough peer reviews of academic and technical documents
category: research
complexity: simple
version: "1.0.0"
author: "ID8Labs"
triggers:
  - "peer review"
  - "review paper"
  - "academic review"
  - "technical review"
  - "manuscript review"
tags:
  - peer-review
  - academic
  - review
  - feedback
  - quality
---

# Peer Reviewer

Conduct thorough peer reviews of academic and technical documents.

## When to Use This Skill

Use this skill when you need to:
- Review academic manuscripts or technical documents for rigor and clarity
- Evaluate methodology, statistics, and data interpretation
- Synthesize complex material into prioritized, actionable feedback

**Not recommended for:**
- Creative content generation
- General business operations

## Quick Reference

| Action | Command/Trigger |
|--------|-----------------|
| Review an academic paper | `review paper` or `academic review` |
| Review a technical document | `technical review` |
| Review a manuscript | `manuscript review` |

## Core Workflows

### Workflow 1: Full Manuscript Review

**Goal:** Produce a thorough, prioritized review of a complete document

**Steps:**
1. **Scope** - Clarify review objectives, audience, and any journal or field guidelines
2. **Read-through** - Assess the overall argument, structure, and contribution
3. **Deep review** - Examine methodology, statistics, citations, and reproducibility
4. **Prioritize** - Rank findings by severity, critical methodological flaws first
5. **Report** - Deliver actionable feedback with a revision checklist

### Workflow 2: Revision Verification

**Goal:** Confirm that a revised draft resolves earlier review feedback

**Steps:**
1. **Map** - Match each revision against the original review comments
2. **Verify** - Check that critical issues were fixed, not merely acknowledged
3. **Re-check** - Scan for new problems introduced during revision
4. **Assess** - Judge remaining risk to submission or stakeholder sign-off
5. **Summarize** - Issue a clear recommendation with any outstanding items
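The findings produced by these workflows can be tracked as structured records so the most serious issues always surface first. A minimal sketch follows; the severity levels and field names are illustrative choices, not part of the skill itself:

```python
from dataclasses import dataclass

# Severity levels, most critical first (illustrative ordering)
SEVERITY_ORDER = ["critical", "major", "minor", "style"]

@dataclass
class ReviewComment:
    section: str   # where in the manuscript the issue occurs
    severity: str  # one of SEVERITY_ORDER
    note: str      # actionable description of the issue

def prioritize(comments):
    """Sort comments so critical methodological flaws come first."""
    return sorted(comments, key=lambda c: SEVERITY_ORDER.index(c.severity))

comments = [
    ReviewComment("Results", "style", "Inconsistent decimal precision in Table 2."),
    ReviewComment("Methods", "critical", "No control condition described."),
    ReviewComment("Introduction", "minor", "Claim in paragraph 2 needs a citation."),
]

for c in prioritize(comments):
    print(f"[{c.severity.upper()}] {c.section}: {c.note}")
```

Sorting on a fixed severity list keeps the report's ordering stable even as comments are added during later passes.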

## Best Practices

1. **Start with Clear Objectives**
   Define what success looks like before beginning work.

2. **Follow Industry Standards**
   Leverage proven frameworks and best practices in research.

3. **Iterate Based on Feedback**
   Continuously improve based on results and user input.

4. **Document Your Process**
   Keep track of decisions and outcomes for future reference.

5. **Focus on Quality**
   Prioritize excellence over speed, especially in early iterations.

## Checklist

Before considering your work complete:

- [ ] Objectives clearly defined and understood
- [ ] Research and discovery phase completed
- [ ] Strategy or plan documented
- [ ] Implementation matches requirements
- [ ] Quality standards met
- [ ] Stakeholders informed and aligned
- [ ] Results measured against goals
- [ ] Documentation updated
- [ ] Feedback collected
- [ ] Next steps identified

## Common Mistakes

| Mistake | Why It's Bad | Better Approach |
|---------|--------------|-----------------|
| Skipping research | Leads to misaligned solutions | Invest time in understanding context |
| Ignoring best practices | Reinventing the wheel | Study successful examples first |
| No clear metrics | Can't measure success | Define KPIs upfront |

## Integration Points

- **Tools**: Integration with common research platforms and tools
- **Workflows**: Fits into existing analysis and research workflows
- **Team**: Collaborates with research and analytics stakeholders

## Success Metrics

Track these metrics to measure effectiveness:
- Quality of output
- Time to completion
- Stakeholder satisfaction
- Impact on acceptance and revision outcomes
- Reusability of approach

---

*This skill is part of the ID8Labs Skills Marketplace. Last updated: 2026-01-07*

Overview

This skill performs structured, thorough peer reviews of academic and technical documents to improve clarity, rigor, and reproducibility. It focuses on assessing methodology, data interpretation, structure, and adherence to field standards. Use it to produce actionable feedback that authors can implement quickly.

How this skill works

I inspect manuscripts, reports, and technical drafts for conceptual soundness, methodological rigor, and presentation quality. I flag unclear reasoning, methodological gaps, statistical issues, and missing citations, then provide prioritized recommendations and a revision checklist. I can also evaluate against specific standards or journal guidelines when provided.
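The prioritized recommendations and revision checklist mentioned above can take a form like the following. This is an illustrative template, not a fixed output format:

```markdown
## Review Summary
**Recommendation:** Major revision

### Critical
1. Methods, §2.1 — sample size justification missing; add a power analysis.

### Major
1. Results, Table 3 — report effect sizes alongside p-values.

### Minor / Style
1. Abstract — state the main contribution in the first two sentences.
```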

When to use it

  • Preparing a manuscript for submission to a journal or conference
  • Assessing a technical report or white paper before stakeholder review
  • Validating methods and data interpretation in a draft study
  • Improving clarity and structure of complex academic writing
  • Auditing reproducibility and compliance with field standards

Best practices

  • Start by defining review objectives and target audience to tailor feedback
  • Supply concrete examples and suggested rewordings rather than only criticism
  • Cross-check claims against cited sources and request evidence where needed
  • Prioritize issues by severity: critical methodological flaws first, style and formatting later
  • Document review decisions and provide a short, actionable revision plan

Example use cases

  • Pre-submission review to reduce desk-rejection risk and improve acceptance chances
  • Methodology audit to identify experimental design or statistical weaknesses
  • Clarity pass to make complex arguments accessible to interdisciplinary readers
  • Reproducibility check to ensure data, code, and procedures are described sufficiently
  • Stakeholder-ready summary highlighting key strengths, risks, and required revisions

FAQ

What file formats do you accept?

I can review plain text, LaTeX source, Markdown, and typical manuscript formats when the content is pasted or provided as text. Mention any special formatting requirements or journal guidelines up front.

Can you check statistical analyses and code?

Yes. I can evaluate statistical approaches, identify common pitfalls, and suggest alternatives. For code-level checks, include code snippets or a description of the workflow; I will point out reproducibility and correctness concerns.
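One example of the kind of statistical pitfall a review flags: running many hypothesis tests without correction inflates the false-positive rate. The sketch below applies a Bonferroni correction (the simplest such adjustment) to illustrative p-values; the data and function name are assumptions for demonstration only:

```python
def bonferroni_significant(p_values, alpha=0.05):
    """Return which p-values survive a Bonferroni correction.

    With m tests, each p-value is compared against alpha / m
    instead of alpha, controlling the family-wise error rate.
    """
    m = len(p_values)
    return [p < alpha / m for p in p_values]

p_values = [0.001, 0.02, 0.04, 0.3]  # illustrative results from 4 tests
print(bonferroni_significant(p_values))  # [True, False, False, False]
```

With four tests, the threshold drops to 0.05 / 4 = 0.0125, so only the 0.001 result survives; a review would ask the authors either to correct for multiplicity or to justify why it is unnecessary.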