Serena MCP server

Provides intelligent code analysis and manipulation across multiple programming languages through language server protocols, enabling developers to explore, understand, and refactor complex codebases.
Provider
Oraios AI
Release date
Apr 05, 2025
Language
Python
Stats
17.2K stars

Serena is a powerful coding agent toolkit that transforms any LLM into a comprehensive agent capable of working directly with your codebase. It provides semantic code retrieval and editing tools similar to an IDE, allowing LLMs to efficiently find and modify code at the symbol level rather than working with entire files or basic string replacements.

Installation

Prerequisites

Serena requires uv for package management. If you don't already have it, install it first (the one-liner below is the official installer for macOS/Linux; see the uv documentation for Windows and other methods):

# Install uv on macOS/Linux; for Windows and other options see
# https://docs.astral.sh/uv/getting-started/installation/
curl -LsSf https://astral.sh/uv/install.sh | sh

Starting the MCP Server

The quickest way to get Serena running is to start the MCP (Model Context Protocol) server directly from GitHub:

uvx --from git+https://github.com/oraios/serena serena start-mcp-server --help

This command will display all available options for configuring the MCP server.
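
For example, dropping the --help flag starts the server itself with default settings (stop it with Ctrl+C):

# Run the Serena MCP server from GitHub with default settings
uvx --from git+https://github.com/oraios/serena serena start-mcp-server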

Usage

Configuring Your Client

To use Serena with your preferred LLM client, you'll need to configure the client to connect to the Serena MCP server. The exact configuration depends on which client you're using; the sections below cover the most common setups.

Compatible Clients

Serena integrates with a wide range of LLM clients through the Model Context Protocol (MCP):

  • AI Assistants: Claude Code, Claude Desktop
  • Terminal-based clients: Codex, Gemini-CLI, Qwen3-Coder, rovodev, OpenHands CLI
  • IDEs: VSCode, Cursor, IntelliJ
  • Extensions: Cline, Roo Code
  • Local clients: OpenWebUI, Jan, Agno, and others

Alternative Integration Methods

If your preferred client doesn't support MCP directly, you can:

  • Use mcpo to connect Serena to ChatGPT or other clients that support OpenAPI-based tool calling (see the sketch below)
  • Incorporate Serena's tools into your custom agent framework
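
As a rough sketch of the mcpo route, assuming mcpo's usual "uvx mcpo --port <port> -- <server command>" invocation (check the mcpo documentation for the exact syntax), the bridge would look something like this:

# Expose Serena's MCP tools as an OpenAPI service on port 8000 via mcpo
uvx mcpo --port 8000 -- uvx --from git+https://github.com/oraios/serena serena start-mcp-server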

Programming Language Support

Serena leverages language servers through the Language Server Protocol (LSP) to provide semantic code analysis for over 30 programming languages, including:

  • Python, JavaScript, TypeScript
  • C/C++, C#, Java
  • Go, Rust, Swift
  • Ruby, PHP, Kotlin
  • And many others

Some languages may require additional dependencies to be installed for full functionality.

Project-Based Workflow

Serena operates best with a project-based approach (a command-line sketch follows the steps):

  1. Navigate to your project directory
  2. Start the Serena MCP server
  3. Connect your LLM client to the server
  4. Let the LLM use Serena's tools to navigate and edit your codebase
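
A command-line sketch of this flow, assuming the --project option advertised by the server's --help output (the path is purely illustrative):

# 1. Navigate to your project directory (illustrative path)
cd ~/code/my-project

# 2. Start the Serena MCP server for this project
uvx --from git+https://github.com/oraios/serena serena start-mcp-server --project "$(pwd)"

# 3./4. Connect your LLM client to the running server (see the client setup
#       sections below) and let it call Serena's tools against this project.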

Configuration Options

Serena can be customized to your needs through various configuration options. The most important areas include:

  • Project-specific settings
  • Language server configurations
  • Tool availability and behavior

For detailed information about all available configuration options, refer to the user guide at https://oraios.github.io/serena/02-usage/050_configuration.html.

Best Practices

For optimal results with Serena:

  • Initialize Serena in the root directory of your project
  • Use a client that supports the MCP protocol for the best experience
  • Leverage project-based workflows as described in the documentation
  • Familiarize yourself with the available tools to understand Serena's capabilities

Serena is particularly valuable for navigating and modifying complex, large codebases where its semantic understanding provides significant advantages over simpler file-based approaches.

How to install this MCP server

For Claude Code

To add this MCP server to Claude Code, run this command in your terminal:

claude mcp add-json "serena" '{"command":"uvx","args":["--from","git+https://github.com/oraios/serena","serena-mcp-server"]}'

See the official Claude Code MCP documentation for more details.
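
To confirm the registration, you can list the MCP servers Claude Code knows about:

# The "serena" entry should appear among the configured servers
claude mcp list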

For Cursor

There are two ways to add an MCP server to Cursor. The most common way is to add the server globally in the ~/.cursor/mcp.json file so that it is available in all of your projects.

If you only need the server in a single project, you can instead add it to that project's .cursor/mcp.json file (creating the file if it doesn't already exist).

Adding an MCP server to Cursor globally

To add a global MCP server go to Cursor Settings > Tools & Integrations and click "New MCP Server".

When you click that button the ~/.cursor/mcp.json file will be opened and you can add your server like this:

{
    "mcpServers": {
        "serena": {
            "command": "uvx",
            "args": [
                "--from",
                "git+https://github.com/oraios/serena",
                "serena-mcp-server"
            ]
        }
    }
}

Adding an MCP server to a project

To add an MCP server to a single project, create a new .cursor/mcp.json file in the project root (or add the entry to an existing one). The contents look exactly the same as the global example above.

How to use the MCP server

Once the server is installed, you might need to head back to Settings > MCP and click the refresh button.

The Cursor agent will then be able to see the tools the added MCP server provides and will call them when it needs to.

You can also explicitly ask the agent to use a specific tool by mentioning it by name and describing what you want it to do.

For Claude Desktop

To add this MCP server to Claude Desktop:

1. Find your configuration file:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json
  • Linux: ~/.config/Claude/claude_desktop_config.json

2. Add this to your configuration file:

{
    "mcpServers": {
        "serena": {
            "command": "uvx",
            "args": [
                "--from",
                "git+https://github.com/oraios/serena",
                "serena-mcp-server"
            ]
        }
    }
}

3. Restart Claude Desktop for the changes to take effect
