
github-to-skills skill

/github-to-skills

This skill converts a GitHub repository URL into a ready-to-use AI skill with standardized structure and extended metadata for lifecycle management.

npx playbooks add skill kkkkhazix/khazix-skills --skill github-to-skills

Review the files below or copy the command above to add this skill to your agents.

SKILL.md
---
name: github-to-skills
description: Automated factory for converting GitHub repositories into specialized AI skills. Use this skill when the user provides a GitHub URL and wants to "package", "wrap", or "create a skill" from it. It automatically fetches repository details and the latest commit hash, then generates a standardized skill structure with enhanced metadata suitable for lifecycle management.
license: MIT
---

# GitHub to Skills Factory

This skill automates the conversion of GitHub repositories into fully functional AI skills.

## Core Functionality

1. **Analysis**: Fetches repository metadata (description, README, latest commit hash).
2. **Scaffolding**: Creates a standardized skill directory structure.
3. **Metadata Injection**: Generates `SKILL.md` with extended frontmatter (tracking source, version, hash) for future automated management.
4. **Wrapper Generation**: Creates a `scripts/wrapper.py` (or similar) to interface with the tool.
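
For illustration, a generated `scripts/wrapper.py` for a hypothetical CLI tool (using `yt-dlp` purely as an example) might be as thin as the sketch below; the real wrapper depends on how the wrapped tool is invoked:

```python
#!/usr/bin/env python3
"""Thin wrapper around a hypothetical wrapped CLI tool (yt-dlp used only as an example)."""
import subprocess
import sys


def run(args):
    # Forward agent-supplied arguments to the underlying tool and surface its exit code.
    result = subprocess.run(["yt-dlp", *args])
    return result.returncode


if __name__ == "__main__":
    sys.exit(run(sys.argv[1:]))
```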

## Usage

**Trigger**: `/github-to-skills <github_url>` or "Package this repo into a skill: <url>"

### Required Metadata Schema

Every skill created by this factory MUST include the following extended YAML frontmatter in its `SKILL.md`. This is critical for the `skill-manager` to function later.

```yaml
---
name: <kebab-case-repo-name>
description: <concise-description-for-agent-triggering>
# EXTENDED METADATA (MANDATORY)
github_url: <original-repo-url>
github_hash: <latest-commit-hash-at-time-of-creation>
version: <tag-or-0.1.0>
created_at: <ISO-8601-date>
entry_point: scripts/wrapper.py # or main script
dependencies: # List main dependencies if known, e.g., ["yt-dlp", "ffmpeg"]
---
```
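
As a sanity check on this schema, here is a minimal sketch of validating the mandatory fields in a generated `SKILL.md`, assuming the frontmatter is standard YAML between `---` delimiters and that PyYAML is available (the helper name is illustrative):

```python
import yaml  # PyYAML, assumed available

REQUIRED_FIELDS = {
    "name", "description", "github_url", "github_hash",
    "version", "created_at", "entry_point",
}


def missing_metadata(skill_md_text: str) -> set:
    """Return the mandatory frontmatter fields missing from a SKILL.md document."""
    # The frontmatter is the block between the first two '---' delimiters.
    _, frontmatter, _body = skill_md_text.split("---", 2)
    metadata = yaml.safe_load(frontmatter) or {}
    return REQUIRED_FIELDS - set(metadata)
```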

## Workflow

1. **Fetch Info**: The agent first runs `scripts/fetch_github_info.py` to get the raw data from the repo (a sketch of this step appears after this list).
2. **Plan**: The agent analyzes the README to understand how to invoke the tool (CLI args, Python API, etc.).
3. **Generate**: The agent uses the `skill-creator` patterns to write the `SKILL.md` and wrapper scripts, ensuring the **extended metadata** is present.
4. **Verify**: Checks if the commit hash was correctly captured.
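
A rough sketch of the fetch in step 1, assuming `scripts/fetch_github_info.py` talks to the public GitHub REST API using only the standard library (the actual script may differ):

```python
import json
import urllib.request

API = "https://api.github.com/repos"


def fetch_github_info(owner: str, repo: str) -> dict:
    """Fetch the description and latest default-branch commit hash for a repository."""
    def get(url):
        req = urllib.request.Request(url, headers={"Accept": "application/vnd.github+json"})
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

    meta = get(f"{API}/{owner}/{repo}")                        # repository metadata
    commits = get(f"{API}/{owner}/{repo}/commits?per_page=1")  # newest commit first
    return {
        "description": meta.get("description"),
        "default_branch": meta.get("default_branch"),
        "latest_commit_hash": commits[0]["sha"] if commits else None,
    }
```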

## Resources

- `scripts/fetch_github_info.py`: Fetches repo details (README, latest commit hash, tags) via the GitHub API or by scraping.
- `scripts/create_github_skill.py`: Orchestrator to scaffold the folder and write the initial files.
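
A simplified sketch of what `scripts/create_github_skill.py` might do, assuming it writes the extended frontmatter and a wrapper stub (paths and field values are illustrative):

```python
from datetime import datetime, timezone
from pathlib import Path


def scaffold_skill(name: str, github_url: str, github_hash: str, out_dir: Path) -> Path:
    """Create the skill folder, a SKILL.md with extended metadata, and a wrapper stub."""
    skill_dir = out_dir / name
    (skill_dir / "scripts").mkdir(parents=True, exist_ok=True)

    frontmatter = "\n".join([
        "---",
        f"name: {name}",
        f"description: Wrapper skill generated from {github_url}",
        f"github_url: {github_url}",
        f"github_hash: {github_hash}",
        "version: 0.1.0",
        f"created_at: {datetime.now(timezone.utc).date().isoformat()}",
        "entry_point: scripts/wrapper.py",
        "---",
        "",
    ])
    (skill_dir / "SKILL.md").write_text(frontmatter)
    (skill_dir / "scripts" / "wrapper.py").write_text("# TODO: wrap the upstream tool\n")
    return skill_dir
```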

## Best Practices for Generated Skills

- **Isolation**: The generated skill should install its own dependencies (e.g., in a venv or via `uv`/`pip`) if possible, or clearly state them.
- **Progressive Disclosure**: Do not dump the entire repo into the skill. Only include the necessary wrapper code and reference the original repo for deep dives.
- **Idempotency**: The `github_hash` field allows the future `skill-manager` to check `if remote_hash != local_hash` to trigger updates.
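
A minimal sketch of that check, assuming the frontmatter has already been parsed into a dict (see the validation sketch above):

```python
def needs_update(local_metadata: dict, remote_hash: str) -> bool:
    """True when the upstream repo has moved past the commit packaged in the skill."""
    return local_metadata.get("github_hash") != remote_hash


# Hypothetical usage, combining the fetch helper sketched earlier:
# info = fetch_github_info("owner", "repo")
# if needs_update(metadata, info["latest_commit_hash"]):
#     regenerate_skill(...)
```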

Overview

This skill automates converting a GitHub project URL into a packaged AI skill with standardized structure and metadata. It fetches project details and the latest commit hash, then generates a wrapped skill with lifecycle-ready metadata and a lightweight runtime wrapper. The result is a repeatable artifact ready for version checks and automated management.

How this skill works

Given a GitHub URL, the skill fetches core project data (description, README, latest commit hash, and tags) via the GitHub API or a fetch utility. It analyzes the documentation to determine invocation patterns and dependencies, then scaffolds a skill folder, injects the extended metadata into the generated SKILL.md, and creates a small wrapper script that exposes the tool to the agent. Finally, it verifies the captured commit hash and reports creation details for future update checks.

When to use it

  • You have a GitHub project URL and want a ready-to-manage AI skill from it.
  • You need a consistent, metadata-rich wrapper so skills can be tracked and updated automatically.
  • You want to extract just the invocation surface and dependencies rather than importing the full codebase.
  • You need an artifact that supports idempotent updates via commit-hash comparison.
  • You want to standardize many projects into a uniform skill catalog for lifecycle tooling.

Best practices

  • Include only the necessary wrapper code and list key dependencies instead of bundling the entire project.
  • Record the source URL and exact commit hash in the generated metadata to enable reliable update checks.
  • Create an isolated runtime (venv or pinned deps) or clearly declare required packages for the wrapper.
  • Keep the wrapper interface minimal and document expected CLI or API invocation patterns in the metadata.
  • Ensure creation is idempotent so re-running the factory only updates when the source hash changes.

Example use cases

  • Packaging a CLI utility from a public GitHub project into a skill that agents can invoke.
  • Wrapping a Python library with a small script that exposes core functions to the agent ecosystem.
  • Bulk-converting multiple project URLs into cataloged skills with consistent metadata for tracking.
  • Generating a lightweight proxy skill that references the original project for deep dives while keeping the runtime slim.
  • Automating periodic checks to detect when a remote project has changed and trigger skill regeneration.

FAQ

Will this include the full project code in the generated skill?

No. The factory extracts the invocation surface and key dependencies, producing a lightweight wrapper and metadata while referencing the original source for full details.

How does the skill detect updates to the original project?

The generated metadata records the source URL and latest commit hash; a lifecycle manager can compare the stored hash to the remote hash to decide if regeneration is needed.