Ollama MCP Guidance MCP Server
Provides a standardized MCP server to interact with Ollama services via a unified API for LLM guidance.
Configuration
{
  "mcpServers": {
    "ollama_mcp_guidance": {
      "command": "python",
      "args": [
        "ollama_mcp_server.py"
      ],
      "env": {
        "OLLAMA_HOST": "http://localhost:11434",
        "OLLAMA_TIMEOUT": "30",
        "OLLAMA_USER_AGENT": "Ollama_MCP_Guidance/1.0"
      }
    }
  }
}

Ollama MCP Guidance provides a standardized MCP server interface for interacting with Ollama services, enabling LLMs to call and reason about Ollama APIs through a unified protocol. This server-oriented approach ensures consistent responses, robust error handling, and clear navigation of the available API features within the MCP ecosystem.
To use this server, connect an MCP client to the endpoints exposed by Ollama_MCP_Guidance. The server is designed to run in Cursor MCP environments and acts as a bridge between your MCP client and the Ollama service, providing a standardized JSON response format, robust error handling, and navigable API documentation. Invoke the Ollama-related actions through the MCP client's standard request flow, and rely on the built-in documentation paths for guidance on the available features.
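As an illustration of that request flow, the following sketch connects to the server over stdio using the official MCP Python SDK (the mcp package) and lists the tools it exposes. The script path and environment values simply mirror the configuration block above; adjust them for your setup.

# Minimal sketch: launch the Ollama_MCP_Guidance server over stdio and
# enumerate its tools. Paths and env values mirror the config above and
# are assumptions about your local layout.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server_params = StdioServerParameters(
    command="python",
    args=["ollama_mcp_server.py"],
    env={
        "OLLAMA_HOST": "http://localhost:11434",
        "OLLAMA_TIMEOUT": "30",
        "OLLAMA_USER_AGENT": "Ollama_MCP_Guidance/1.0",
    },
)

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()          # standard MCP handshake
            tools = await session.list_tools()  # discover available tools
            print([tool.name for tool in tools.tools])

asyncio.run(main())

Stdio is the transport implied by the command/args pair in the configuration: MCP clients such as Cursor launch the process themselves and speak the protocol over its standard streams.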
Installation

# Prerequisites
# - Python 3.10 or higher
# - Ollama installed
# - uv (recommended) for environment management
# Install uv (recommended)
curl -LsSf https://astral.sh/uv/install.sh | sh
# Create and activate a virtual environment
uv venv
source .venv/bin/activate # Linux/macOS
# or
.venv\Scripts\activate # Windows
# Install the project dependencies
uv pip install .

Configuration and environment settings are described in the project's config.json file. The server is intended to run in an MCP-enabled Cursor environment and does not ship with a standalone client. The documentation is bilingual (Chinese and English); English is expected to become the primary language in future versions.
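To make the environment settings concrete, here is a hedged sketch of how OLLAMA_HOST, OLLAMA_TIMEOUT, and OLLAMA_USER_AGENT might be consumed when querying Ollama. The /api/tags endpoint is Ollama's standard model-listing API; the helper function itself is hypothetical, not the server's actual implementation.

# Sketch of how the server's environment variables might drive a request
# to the Ollama REST API; list_local_models is a hypothetical helper.
import json
import os
import urllib.request

OLLAMA_HOST = os.environ.get("OLLAMA_HOST", "http://localhost:11434")
OLLAMA_TIMEOUT = int(os.environ.get("OLLAMA_TIMEOUT", "30"))
OLLAMA_USER_AGENT = os.environ.get("OLLAMA_USER_AGENT", "Ollama_MCP_Guidance/1.0")

def list_local_models() -> list[str]:
    """Return the names of models installed on the Ollama host."""
    request = urllib.request.Request(
        f"{OLLAMA_HOST}/api/tags",
        headers={"User-Agent": OLLAMA_USER_AGENT},
    )
    with urllib.request.urlopen(request, timeout=OLLAMA_TIMEOUT) as response:
        payload = json.load(response)
    return [model["name"] for model in payload.get("models", [])]

print(list_local_models())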
The project is in active development and currently focuses on providing a Cursor MCP Server workflow. Not all Ollama API endpoints may be fully implemented yet. For security, avoid exposing sensitive endpoints publicly and use authenticated MCP clients when integrating with production workloads.
If you are using an AI assistant, ensure it has invoked the get_started_guide tool to obtain comprehensive usage instructions and best practices for leveraging this MCP server effectively.
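Assuming the ClientSession from the connection sketch earlier, a call to that tool might look like the fragment below. The empty argument dictionary is an assumption, since the tool's input schema is not documented here.

# Runs inside the ClientSession block from the connection sketch above.
# The empty arguments dict is an assumption about the tool's input schema.
result = await session.call_tool("get_started_guide", arguments={})
for block in result.content:
    if block.type == "text":  # tools may also return non-text content
        print(block.text)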