Bring your project into LLM context - tool and MCP server
Configuration
{
  "mcpServers": {
    "smat-dev-jinni": {
      "command": "uvx",
      "args": ["jinni-server"]
    }
  }
}

Jinni MCP Server helps you bring your project into context for Large Language Models by collecting relevant files and generating a concise, configurable context dump. It integrates with MCP clients to provide a read_context tool that returns a concatenated, headered view of selected project content, making it easy to guide LLMs with precise code and file snippets.
Set up a client that supports MCP (for example, Claude Desktop, Cursor, or Roo) to run the jinni MCP Server. Your model can request the project context through the read_context tool to obtain a single string containing the chosen files and their contents, with each file preceded by a header like `path=src/app.py`.
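The headered concatenation described above can be sketched in a few lines; the helper name and file list here are illustrative, not part of the jinni API:

```python
from pathlib import Path

def concat_with_headers(project_root: str, rel_paths: list[str]) -> str:
    """Concatenate files, each preceded by a `path=` header line,
    mirroring the output shape the read_context tool returns."""
    root = Path(project_root)
    parts = []
    for rel in rel_paths:
        text = (root / rel).read_text(encoding="utf-8")
        parts.append(f"path={rel}\n{text}")
    return "\n".join(parts)
```

Each selected file appears once under a header like `path=src/app.py`, so the model can attribute any snippet back to its source file.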
Configure the server in your client so that the model can invoke read_context with a target project_root, a list of targets, and the rules to apply. You can restrict the read scope to specific modules or paths or allow the model to request the whole project by omitting targets. If the resulting context is too large, you will receive a DetailedContextSizeError that includes the 10 largest contributing files to help you prune context.
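A read_context request from the client might carry arguments like the following; project_root, targets, and rules are the parameter names mentioned above, while the specific pattern values are illustrative:

```json
{
  "project_root": "/absolute/path/to/project",
  "targets": ["src", "README.md"],
  "rules": ["src/**/*.py", "!*.log"]
}
```

Omitting targets lets the model request the whole project, subject to the rules in effect.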
Prerequisites: you need Python and a way to run auxiliary servers (uv/uvx). Install the jinni package via Python’s package manager; the server entry point is jinni-server.
pip install jinni

# Or use uv to install in an environment that supports uvx
uv pip install jinni

The MCP client needs to know how to reach the server; you will typically configure this in your MCP client config as shown in the next section. The jinni server uses a stdio (local) setup: the client starts the server by invoking the uvx tool with the jinni-server entry point.
{
  "mcpServers": {
    "jinni": {
      "command": "uvx",
      "args": ["jinni-server"]
    }
  }
}

If you want to limit the server’s read scope to a specific directory, you can add a root path to constrain filesystem access. For example, add an optional root argument to the command: --root /absolute/path/.
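Constraining the server to a single project directory might then look like this; the path is a placeholder:

```json
{
  "mcpServers": {
    "jinni": {
      "command": "uvx",
      "args": ["jinni-server", "--root", "/absolute/path/to/project"]
    }
  }
}
```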
Jinni uses .gitignore-style patterns to decide which files to include or exclude. It also supports .contextfiles and an overrides mechanism to make custom, project-wide rules take precedence. You can explicitly include or exclude patterns and provide targeting rules to control what content is processed.
Common usage patterns include excluding test data, large binaries, or log files, and opting into including only specific directories or file types. See the examples below for common CLI and MCP configurations.
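A .contextfiles placed in the project root might look like the following; the patterns are illustrative of the .gitignore-style syntax described above, and the exact include/exclude semantics should be checked against the jinni documentation:

```
# Opt in to the source and docs trees
src/**
docs/**
# Keep large or noisy content out of context
!build/
!*.log
!tests/fixtures/
```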
# Exclude all test directories
jinni --not tests
# Exclude multiple keywords
jinni --not tests --not vendor --not deprecated
# Exclude old code only in specific paths
jinni --not-in src/legacy:old,deprecated --not-in lib/v1:legacy
# Exclude specific file patterns
jinni --not-files "*.test.js" --not-files "*_old.*"
# Keep only src and docs, exclude everything else
jinni --keep-only src,docs
# Combine different exclusion types
jinni --not tests --not-in src/experimental:wip --not-files "*.bak"

If the context size exceeds the configured limit, the server returns a DetailedContextSizeError that lists the largest files contributing to the size. Use this information to prune content with .contextfiles rules or overrides to reduce the included scope.
To resolve large context issues, consider increasing the size limit (with --size-limit-mb on the CLI or size_limit_mb in MCP) or refine your inclusion rules to exclude unnecessary files such as large data files, logs, or artifacts.
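To decide what to prune, it can help to reproduce the "largest contributing files" listing locally. This sketch is independent of jinni and uses only the standard library:

```python
import os

def largest_files(root: str, top_n: int = 10) -> list[tuple[str, int]]:
    """Walk `root` and return the top_n files by size in bytes,
    largest first: candidates to exclude via rules or overrides."""
    sizes = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                sizes.append((path, os.path.getsize(path)))
            except OSError:
                continue  # skip files that vanish or are unreadable
    sizes.sort(key=lambda item: item[1], reverse=True)
    return sizes[:top_n]
```

Running this over your project root surfaces the same kind of listing the error message provides, so you can write exclusion rules before retrying.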
If you need to refer to these instructions while troubleshooting, you can use the jinni usage tool to retrieve the current usage guidance.
The server exposes a read_context tool for MCP clients to request a concatenated, headered view of relevant project files. The CLI also provides a way to dump context manually for feeding into LLMs or pasting into other tools.
Key features include efficient context gathering, intelligent filtering with Gitignore-style rules, dynamic rule application, and safe targeting controls to limit scope while preserving relative paths in the output.
Cursor and similar clients can request context via read_context, but may silently drop large context. If you see missing tool calls, reduce the scope of what is read by narrowing targets or adjusting rules.
read_context: exposes a tool for MCP clients to request a concatenated, headered view of relevant project files from a specified project root and targets.
usage: a dedicated tool that returns the README-like guidance for the MCP server when invoked without arguments.
jinni CLI: a utility to dump the project context manually for copy-pasting or feeding into LLMs.