TypeScript MCP Agent with Ollama Integration
Connects MCP servers to Ollama for filesystem and web research tooling in interactive chats.
Configuration
{
  "mcpServers": {
    "ausboss-mcp-ollama-agent": {
      "command": "npx",
      "args": [
        "@modelcontextprotocol/server-filesystem",
        "./"
      ]
    }
  }
}

This guide shows how to use the TypeScript MCP Agent with Ollama Integration to connect MCP servers to Ollama, so that tools and models work together through a unified interface. The setup lets you run local tooling for filesystem operations and web research while an Ollama-backed model coordinates interactive conversations and tool usage.
Tools run through MCP servers wired into Ollama: start a chat session, and the agent invokes the appropriate server for tasks such as filesystem operations or web research. Multiple servers can be registered at once, and the agent exposes all of their tools through a single interface, enabling smooth, tool-assisted conversations.
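The routing described above can be sketched as a minimal dispatcher. All names here (ToolCall, ToolHandler, the handler bodies) are illustrative placeholders, not the agent's real API; in the actual agent each handler would forward to a live MCP client session.

```typescript
// Sketch: route a model's tool call to the MCP server it belongs to.
// Types and handler logic are illustrative, not the agent's real implementation.

type ToolCall = { server: string; tool: string; args: Record<string, unknown> };
type ToolHandler = (tool: string, args: Record<string, unknown>) => string;

// Each configured MCP server contributes one handler, keyed by its config name.
const servers: Record<string, ToolHandler> = {
  filesystem: (tool, args) =>
    tool === "list_directory"
      ? `listing ${String(args.path)}`
      : `reading ${String(args.path)}`,
  webresearch: (tool, args) => `searching for ${String(args.query)}`,
};

// The agent looks up the server the model asked for and forwards the call.
function dispatch(call: ToolCall): string {
  const handler = servers[call.server];
  if (!handler) throw new Error(`unknown MCP server: ${call.server}`);
  return handler(call.tool, call.args);
}
```

This is why adding a new MCP server to the config is cheap: the chat loop never needs to know which tools exist, only which server name a call targets.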
Before installing any components, make sure the following prerequisites are in place.
Install Node.js (version 18 or higher) and ensure Ollama is installed and running.
Install the MCP tools globally that you want to use. For filesystem operations, install the filesystem server package. For web research, install the web research server package.
# For filesystem operations
npm install -g @modelcontextprotocol/server-filesystem
# For web research
npm install -g @mzxrai/mcp-webresearch

Next, configure your MCP servers and the Ollama model. Specify each MCP server with its runtime command and arguments, then point Ollama at the model you want to use.
{
"mcpServers": {
"filesystem": {
"command": "npx",
"args": ["@modelcontextprotocol/server-filesystem", "./"]
},
"webresearch": {
"command": "npx",
"args": ["-y", "@mzxrai/mcp-webresearch"]
}
},
"ollama": {
"host": "http://localhost:11434",
"model": "qwen2.5:latest"
}
}

Once configured, you can start the demo in test mode to verify the filesystem and web research tools without a live LLM. Run the demo to exercise the tooling, then start the interactive chat interface to begin conversations with Ollama using the MCP tool servers.
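Before launching, it can help to sanity-check the configuration by parsing it and listing what it declares. The sketch below inlines a config matching the example above; the AgentConfig shape mirrors that example, and any fields beyond it are assumptions, not the agent's actual loader.

```typescript
// Sketch: parse an agent config and report the declared servers and model.
// The config shape mirrors the example in this guide; the loader is hypothetical.

interface AgentConfig {
  mcpServers: Record<string, { command: string; args: string[] }>;
  ollama: { host: string; model: string };
}

const raw = `{
  "mcpServers": {
    "filesystem": { "command": "npx", "args": ["@modelcontextprotocol/server-filesystem", "./"] },
    "webresearch": { "command": "npx", "args": ["-y", "@mzxrai/mcp-webresearch"] }
  },
  "ollama": { "host": "http://localhost:11434", "model": "qwen2.5:latest" }
}`;

const config: AgentConfig = JSON.parse(raw);
const serverNames = Object.keys(config.mcpServers);
console.log(serverNames.join(", ")); // filesystem, webresearch
console.log(config.ollama.model);    // qwen2.5:latest
```

A check like this catches the most common startup failure, a typo in a server name or a missing `command`, before any MCP process is spawned.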
To test locally, run the demo with the TypeScript runtime, then start the chat interface to talk to Ollama. The example usage shows the model using tools to list a directory and read a file, demonstrating how tool calls appear in a conversation.
The setup supports multiple MCP servers (for example, filesystem and web research) and can use either Python-based UVX or Node.js-based NPM/NPX MCP servers. You can customize the system prompt to improve tool usage and adapt the flow to your needs.
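A customized system prompt might look like the sketch below. The wording is illustrative only, not the project's default prompt; the point is to state explicitly when the model should reach for each family of tools.

```typescript
// Illustrative system prompt nudging the model toward tool use.
// This wording is an assumption, not the project's shipped default.
const SYSTEM_PROMPT = [
  "You are an assistant with access to MCP tools.",
  "When a task involves files, call the filesystem tools instead of guessing.",
  "When a task needs current information, call the web research tools.",
  "Summarize tool results for the user in plain language.",
].join("\n");

console.log(SYSTEM_PROMPT.split("\n").length); // 4
```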
List contents of a directory to inspect files and subfolders.
Read and return the contents of a file at a given path.
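In an MCP exchange, a call to one of these tools takes roughly the following shape. The tool name and argument key below follow the filesystem server's conventions, but exact names and fields may vary by package version, so treat this as an illustration rather than a contract.

```json
{
  "name": "read_file",
  "arguments": {
    "path": "./README.md"
  }
}
```

The server replies with the file contents as a tool result, which the agent feeds back to Ollama so the model can use it in its next response.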