A local MCP server that automates browser actions with Ollama-hosted models, using Playwright for browser control and the Model Context Protocol (MCP) for client communication.
Configuration
{
"mcpServers": {
"cam10001110101-mcp-server-browser-use-ollama": {
"command": "/path/to/.venv/bin/python",
"args": [
"/path/to/src/server.py"
],
"env": {
"OLLAMA_HOST": "http://localhost:11434",
"OLLAMA_MODEL": "qwen3"
}
}
}
}

You can automate browser tasks locally by exposing an MCP server that communicates with Ollama-hosted models and controls a Playwright browser. This setup gives you natural-language-driven automation, robust session management, and visual feedback through screenshots, all running securely on your machine.
You interact with the MCP server using an MCP client or an interactive Python client. Start a local server process, connect your client to it, and begin giving natural language instructions like navigating to a page, clicking elements, typing text, and extracting data. The server translates your requests into browser actions via Playwright and returns results or visual feedback after each step so you can adjust your approach in real time.
Common work patterns include starting a long-running interactive session for a task, then issuing a sequence of commands or a single complex instruction that the AI expands into browser actions. You can provide task descriptions from the command line or from a text file, with model-driven planning guiding the actions and a full conversation history retained for context.
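As one concrete pattern, a thin client wrapper can accept the task description either inline or from a text file before handing it to the server. A minimal sketch, assuming hypothetical `--task`/`--task-file` flags for illustration (these are not the project's documented CLI):

```python
# Sketch: accept a task description inline or from a text file.
# The flag names here are illustrative assumptions.
import argparse
from pathlib import Path


def read_task(argv=None):
    parser = argparse.ArgumentParser(description="Run a browser-automation task")
    group = parser.add_mutually_exclusive_group(required=True)
    group.add_argument("--task", help="task description given inline")
    group.add_argument("--task-file", help="path to a text file containing the task")
    args = parser.parse_args(argv)
    if args.task is not None:
        return args.task
    # Read the whole file as one task description.
    return Path(args.task_file).read_text(encoding="utf-8").strip()


# Example: an inline task, passed as an explicit argv list.
print(read_task(["--task", "go to example.com and extract the page title"]))
```

The mutually exclusive group enforces that exactly one source is given, which matches the two input modes described above.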
Prerequisites: install Python 3.8 or newer and ensure Ollama is installed and running on your machine. You should also have a Python package manager such as uv available, since the install steps below use it.
Step 1: Install the package and dependencies locally using uv as the installer, then install Playwright's browser binaries.
# Clone the project repository
git clone https://github.com/Cam10001110101/mcp-server-browser-use-ollama
cd mcp-server-browser-use-ollama
# Install with uv (recommended)
uv pip install -e .
playwright install
# Start Ollama and pull a model in separate terminals
ollama serve
ollama pull qwen3

You can run the MCP server locally by connecting your client to a Python runtime that executes the server script. The following example shows how you wire the server in a configuration file.
{
"mcpServers": {
"browser_use_ollama": {
"command": "/path/to/.venv/bin/python",
"args": ["/path/to/src/server.py"]
}
}
}

The server exposes tools that let you:

Launch a browser instance and navigate to a URL.
Click at specific coordinates within the page.
Click an element identified by a CSS selector.
Type text at the current cursor position.
Scroll the page up or down.
Extract readable text content from the current page.
Retrieve the DOM structure up to a specified depth.
Extract structured data matching a pattern from the page content.
Capture a screenshot of the current browser view.
Close the active browser session.
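Each of these tools is invoked through a standard MCP `tools/call` request over JSON-RPC. A sketch of what such a message might look like on the wire (the tool name `navigate` and its argument shape are illustrative assumptions, not the server's documented schema):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "navigate",
    "arguments": { "url": "https://example.com" }
  }
}
```

Your MCP client builds these requests for you; you normally only see the natural-language layer on top.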