LLMBasedOS MCP server

Secure Arch Linux gateway that bridges LLMs with local system capabilities through specialized servers for file system, email, sync, and agent operations, without exposing sensitive information
Provider
iluxu
Release date
May 17, 2025
Language
Go
Stats
247 stars

This MCP server provides a cognitive operating system that turns your computer into an autonomous partner through the Model Context Protocol (MCP), allowing AI models to interact with system capabilities via a JSON-RPC layer running over UNIX sockets and WebSockets.

Getting Started with llmbasedos

Prerequisites

  • Docker and Docker Compose installed
  • OpenAI, Gemini, or other LLM API keys (depending on which models you plan to use)

Installation

Follow these steps to deploy the MCP server using Docker:

  1. Install Docker and Docker Compose on your system

  2. Clone the repository

  3. Organize the source files in the llmbasedos_src/ directory

  4. Create configuration files:

    • .env file with your API keys and configuration
    • lic.key for license information
    • mail_accounts.yaml for email configuration
    • Add any user files you need
  5. Build the Docker containers:

docker compose build
  6. Start the MCP server:

docker compose up

  7. Connect to the server using luca-shell to begin issuing MCP calls

Using the MCP Server

Core Components

The MCP server architecture includes:

  • Gateway: Routes MCP traffic and exposes LLM abstraction
  • MCP Servers: Microservices exposing files, email, web, and more
  • Shell: REPL interface for exploring and scripting against the MCP system
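Because the gateway speaks JSON-RPC over UNIX sockets and WebSockets, every MCP call ultimately travels as a JSON-RPC 2.0 request. The envelope builder below is a mental model of that wire format, not the actual llmbasedos client code; the field names are standard JSON-RPC 2.0, and the method/params shown are taken from the examples later in this page:

```python
import itertools
import json

# Monotonic request-id counter for JSON-RPC envelopes
_ids = itertools.count(1)

def build_request(method: str, params: list) -> str:
    """Build the JSON-RPC 2.0 request an MCP call would send to the gateway.

    Illustrative sketch of the wire format, not llmbasedos internals.
    """
    return json.dumps({
        "jsonrpc": "2.0",
        "id": next(_ids),
        "method": method,
        "params": params,
    })
```

Each request carries a unique `id` so the caller can match responses to calls when several requests are in flight on the same socket.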

Making MCP Calls

The primary way to interact with the system is through mcp_call() functions in Python scripts. Here's an example:

import json

# mcp_call() is provided by the llmbasedos scripting environment

# Read content from a file
history = json.loads(mcp_call("mcp.fs.read", ["/outreach/contact_history.json"]).get("result", {}).get("content", "[]"))

# Use an LLM to process the data
prompt = f"Find 5 new agencies not in: {json.dumps(history)}"
llm_response = mcp_call("mcp.llm.chat", [[{"role": "user", "content": prompt}], {"model": "gemini-1.5-pro"}])

# Extract the response and write it back to the file
new_prospects = json.loads(llm_response.get("result", {}).get("choices", [{}])[0].get("message", {}).get("content", "[]"))

if new_prospects:
    updated = history + new_prospects
    mcp_call("mcp.fs.write", ["/outreach/contact_history.json", json.dumps(updated, indent=2), "text"])
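The chained .get() calls in the script above guard against missing keys when a call fails or returns an unexpected shape. One way to make that pattern reusable is a pair of small helpers; these are illustrative only, with the response shapes taken from the mcp.fs.read and mcp.llm.chat examples above:

```python
def extract_fs_content(response: dict, default: str = "[]") -> str:
    """Pull the file content out of an mcp.fs.read-style response."""
    return response.get("result", {}).get("content", default)

def extract_chat_text(response: dict, default: str = "") -> str:
    """Pull the assistant message out of an mcp.llm.chat-style response."""
    choices = response.get("result", {}).get("choices", [{}])
    return choices[0].get("message", {}).get("content", default)
```

With these in place, a malformed or error response degrades to the supplied default instead of raising a KeyError mid-script.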

Available MCP Servers

  • fs: File system operations and FAISS semantic search
  • mail: IMAP email parsing and draft handling
  • sync: File synchronization operations via rclone
  • agent: Legacy YAML workflow engine (being deprecated)
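In the gateway, each of these servers is reachable through its method-name prefix (mcp.fs.*, mcp.mail.*, and so on). A toy dispatch table sketches the idea; the table and its lookup logic are illustrative, not gateway internals:

```python
# Hypothetical prefix-to-server routing table, not llmbasedos code.
ROUTES = {
    "mcp.fs.": "fs",
    "mcp.mail.": "mail",
    "mcp.sync.": "sync",
    "mcp.agent.": "agent",
}

def route(method: str) -> str:
    """Return which MCP server should handle a given method name."""
    for prefix, server in ROUTES.items():
        if method.startswith(prefix):
            return server
    raise ValueError(f"no server registered for {method!r}")
```

Unknown prefixes fail loudly rather than silently going to a default server, which keeps misspelled method names easy to spot.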

LLM Integration

The system is LLM-agnostic and can work with:

  • OpenAI models
  • Google Gemini
  • LLaMA.cpp
  • Other local models

The gateway routes all mcp.llm.chat requests through your preferred backend, which you can configure in the settings.
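One way such backend routing can work is to pick a backend from the requested model name and fall back to a configured local default. The model-name prefixes and backend labels below are placeholders to illustrate the idea, not llmbasedos configuration:

```python
# Hypothetical model-prefix-to-backend map; purely illustrative.
BACKENDS = {
    "gpt-": "openai",
    "gemini-": "gemini",
    "llama-": "llama.cpp",
}

def pick_backend(model: str, default: str = "llama.cpp") -> str:
    """Choose an LLM backend for a requested model name."""
    for prefix, backend in BACKENDS.items():
        if model.startswith(prefix):
            return backend
    return default
```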

Security Features

  • Virtual path jail (e.g., /mnt/user_data)
  • License-based tier enforcement
  • Environment-based secrets (no hardcoded keys)
  • Read-only container volumes for configuration

Troubleshooting

If you encounter issues connecting to the MCP server:

  • Ensure Docker containers are running properly
  • Check that your configuration files are correctly formatted
  • Verify that your API keys in the .env file are valid
  • Examine Docker logs for any error messages

How to add this MCP server to Cursor

There are two ways to add an MCP server to Cursor. The most common way is to add the server globally in the ~/.cursor/mcp.json file so that it is available in all of your projects.

If you only need the server in a single project, you can add it to the project instead by creating or adding it to the .cursor/mcp.json file.

Adding an MCP server to Cursor globally

To add a global MCP server, go to Cursor Settings > MCP and click "Add new global MCP server".

When you click that button, the ~/.cursor/mcp.json file opens and you can add your server like this:

{
    "mcpServers": {
        "cursor-rules-mcp": {
            "command": "npx",
            "args": [
                "-y",
                "cursor-rules-mcp"
            ]
        }
    }
}

Adding an MCP server to a project

To add an MCP server to a project you can create a new .cursor/mcp.json file or add it to the existing one. This will look exactly the same as the global MCP server example above.

How to use the MCP server

Once the server is installed, you might need to head back to Settings > MCP and click the refresh button.

The Cursor agent will then see the tools the added MCP server exposes and call them when it needs to.

You can also explicitly ask the agent to use a tool by mentioning the tool name and describing what it does.
