LLM Code Context MCP server

Streamlines code context sharing with LLMs by implementing smart file selection, code outlining, and multi-language support for efficient code reviews and documentation generation.
Provider: cyberchitta
Release date: Dec 06, 2024
Language: Python
Stats: 12.9K downloads, 280 stars

LLM Context is a tool that streamlines providing context to LLMs, letting you share relevant project files instantly through smart selection and rule-based filtering. It reduces the friction of getting project information into AI chat interfaces, eliminating manual copy/paste and helping you identify which files are most relevant.

Installation

You can install LLM Context using the uv package manager:

uv tool install "llm-context>=0.5.0"

Quick Start

Basic Usage

The basic workflow involves three simple steps:

# One-time setup
cd your-project
lc-init

# Daily usage
lc-select
lc-context

MCP Integration

For a more seamless experience, it's recommended to use MCP integration. Add the following to your MCP configuration:

{
  "mcpServers": {
    "llm-context": {
      "command": "uvx",
      "args": ["--from", "llm-context", "lc-mcp"]
    }
  }
}

With MCP integration, AI assistants can access additional files directly during conversations without requiring manual file uploads.
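Before pasting a server entry into your client's configuration, it can help to sanity-check that the JSON is well-formed. The helper below is a hypothetical convenience (not part of llm-context); it just parses the snippet and checks the fields every MCP server entry needs.

```python
import json

# Hypothetical helper (not part of llm-context): sanity-check an
# mcpServers entry before pasting it into your client's config file.
def check_mcp_entry(config_text: str, name: str) -> dict:
    config = json.loads(config_text)    # raises on malformed JSON
    entry = config["mcpServers"][name]  # raises KeyError if missing
    assert isinstance(entry["command"], str)
    assert isinstance(entry["args"], list)
    return entry

snippet = """
{
  "mcpServers": {
    "llm-context": {
      "command": "uvx",
      "args": ["--from", "llm-context", "lc-mcp"]
    }
  }
}
"""
entry = check_mcp_entry(snippet, "llm-context")
print(entry["command"])  # uvx
```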

Project Customization

You can customize how LLM Context works with your project by creating specific filter rules:

# Create project-specific filters
cat > .llm-context/rules/flt-repo-base.md << 'EOF'
---
compose:
  filters: [lc/flt-base]
gitignores:
  full-files: ["*.md", "/tests", "/node_modules"]
---
EOF

# Customize main development rule
cat > .llm-context/rules/prm-code.md << 'EOF'
---
instructions: [lc/ins-developer, lc/sty-python]
compose:
  filters: [flt-repo-base]
  excerpters: [lc/exc-base]
---
Additional project-specific guidelines and context.
EOF
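The rule files above are Markdown with a YAML frontmatter block between `---` delimiters. As an illustration of that format (this is not llm-context's own parser), a few lines of stdlib Python are enough to split a rule into its frontmatter and body:

```python
# Illustrative sketch (not llm-context's own parser): split a rule file
# into its YAML frontmatter and Markdown body, using only the stdlib.
def split_rule(text: str) -> tuple[str, str]:
    lines = text.strip().splitlines()
    assert lines[0] == "---", "rule must start with frontmatter"
    end = lines[1:].index("---") + 1  # index of the closing delimiter
    frontmatter = "\n".join(lines[1:end])
    body = "\n".join(lines[end + 1:])
    return frontmatter, body

rule = """---
instructions: [lc/ins-developer, lc/sty-python]
compose:
  filters: [flt-repo-base]
---
Additional project-specific guidelines and context.
"""
fm, body = split_rule(rule)
print(body)  # Additional project-specific guidelines and context.
```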

Core Commands

Here are the main commands you'll use with LLM Context:

Command              Purpose
lc-init              Initialize project configuration
lc-select            Select files based on current rule
lc-context           Generate and copy context
lc-context -nt       Generate context for non-MCP environments
lc-set-rule <name>   Switch between rules
lc-missing           Handle file and context requests (non-MCP)

Rule System

LLM Context uses a systematic five-category structure for rules:

  • Prompt Rules (prm-): Generate project contexts
  • Filter Rules (flt-): Control file inclusion
  • Instruction Rules (ins-): Provide guidelines
  • Style Rules (sty-): Enforce coding standards
  • Excerpt Rules (exc-): Configure extractions for context reduction
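The category of a rule is encoded in its filename prefix. The mapping below mirrors the five categories listed above; the lookup function itself is just an illustration, not part of llm-context:

```python
# Sketch of the five-category naming convention. The mapping mirrors the
# prefixes listed above; the function is illustrative only.
RULE_CATEGORIES = {
    "prm": "prompt",       # generate project contexts
    "flt": "filter",       # control file inclusion
    "ins": "instruction",  # provide guidelines
    "sty": "style",        # enforce coding standards
    "exc": "excerpt",      # configure extractions
}

def rule_category(name: str) -> str:
    # Strip an optional namespace such as "lc/" before reading the prefix.
    prefix = name.rsplit("/", 1)[-1].split("-", 1)[0]
    return RULE_CATEGORIES.get(prefix, "unknown")

print(rule_category("lc/flt-base"))  # filter
print(rule_category("prm-code"))     # prompt
```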

Example Rule

Here's an example of a rule focused on debugging authentication issues:

---
description: "Debug API authentication issues"
compose:
  filters: [lc/flt-no-files]
  excerpters: [lc/exc-base]
also-include:
  full-files: ["/src/auth/**", "/tests/auth/**"]
---
Focus on authentication system and related tests.
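The `also-include` patterns above select files glob-style. llm-context's real matcher is gitignore-flavored, but Python's `fnmatch` is close enough to sketch how patterns like `/src/auth/**` pick out files:

```python
from fnmatch import fnmatch

# Illustration only: llm-context's actual matching is gitignore-style,
# but fnmatch is close enough to show how also-include patterns
# select files for full inclusion.
patterns = ["/src/auth/**", "/tests/auth/**"]

def included(path: str) -> bool:
    return any(fnmatch(path, p) for p in patterns)

print(included("/src/auth/jwt.py"))     # True
print(included("/tests/auth/test_login.py"))  # True
print(included("/src/billing/api.py"))  # False
```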

Workflow Patterns

Daily Development

For regular development work:

lc-set-rule lc/prm-developer
lc-select
lc-context
# AI can review changes, access additional files as needed

Focused Tasks

For specific tasks requiring minimal context:

# Let AI help create minimal context
lc-set-rule lc/prm-rule-create
lc-context -nt
# Work with AI to create task-specific rule using tmp-prm- prefix

Key Features

  • Smart File Selection: Rules automatically include/exclude appropriate files
  • Instant Context Generation: Formatted context copied to clipboard in seconds
  • MCP Integration: AI can access additional files without manual intervention
  • Systematic Rule Organization: Five-category system for clear rule composition
  • AI-Assisted Rule Creation: Let AI help create minimal context for specific tasks
  • Code Excerpting: Extractions of significant content to reduce context while preserving structure

How to install this MCP server

For Claude Code

To add this MCP server to Claude Code, run this command in your terminal:

claude mcp add-json "CyberChitta" '{"command":"uvx","args":["--from","llm-context","lc-mcp"]}'

See the official Claude Code MCP documentation for more details.

For Cursor

There are two ways to add an MCP server to Cursor. The most common is to add the server globally in the ~/.cursor/mcp.json file, making it available in all of your projects.

If you only need the server in a single project, add it to that project's .cursor/mcp.json file instead.

Adding an MCP server to Cursor globally

To add a global MCP server, go to Cursor Settings > Tools & Integrations and click "New MCP Server".

When you click that button, the ~/.cursor/mcp.json file will open and you can add your server like this:

{
    "mcpServers": {
        "CyberChitta": {
            "command": "uvx",
            "args": [
                "--from",
                "llm-context",
                "lc-mcp"
            ]
        }
    }
}

Adding an MCP server to a project

To add an MCP server to a project you can create a new .cursor/mcp.json file or add it to the existing one. This will look exactly the same as the global MCP server example above.

How to use the MCP server

Once the server is installed, you may need to head back to Settings > MCP and click the refresh button.

The Cursor agent will then see the tools the MCP server exposes and call them when it needs to.

You can also explicitly ask the agent to use the tool by mentioning the tool name and describing what the function does.

For Claude Desktop

To add this MCP server to Claude Desktop:

1. Find your configuration file:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json
  • Linux: ~/.config/Claude/claude_desktop_config.json

2. Add this to your configuration file:

{
    "mcpServers": {
        "CyberChitta": {
            "command": "uvx",
            "args": [
                "--from",
                "llm-context",
                "lc-mcp"
            ]
        }
    }
}

3. Restart Claude Desktop for the changes to take effect
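If you script this step, it is safer to merge the entry into the existing file than to overwrite it. The helper below is a hypothetical convenience (not part of llm-context or Claude Desktop) that preserves any servers already configured; the demo writes to a throwaway file rather than your real config:

```python
import json
import pathlib
import tempfile

# Hypothetical helper: merge a server entry into an existing
# claude_desktop_config.json without clobbering other servers.
def add_server(config_path, name, command, args):
    path = pathlib.Path(config_path)
    config = json.loads(path.read_text()) if path.exists() else {}
    config.setdefault("mcpServers", {})[name] = {
        "command": command,
        "args": args,
    }
    path.write_text(json.dumps(config, indent=2))
    return config

# Demo against a throwaway file rather than your real config.
demo = pathlib.Path(tempfile.mkdtemp()) / "claude_desktop_config.json"
config = add_server(demo, "CyberChitta", "uvx",
                    ["--from", "llm-context", "lc-mcp"])
print(sorted(config["mcpServers"]))  # ['CyberChitta']
```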
