
AAS LanceDB MCP Server

Provides a database-like interface over LanceDB with automatic BGE-M3 embeddings for semantic search

Installation
Add the following to your MCP client configuration file.

Configuration

{
  "mcpServers": {
    "applied-ai-systems-aas-lancedb-mcp": {
      "command": "aas-lancedb-mcp",
      "args": [
        "--db-uri",
        "~/my_database"
      ],
      "env": {
        "LANCEDB_URI": "./my_database",
        "EMBEDDING_MODEL": "BAAI/bge-m3",
        "EMBEDDING_DEVICE": "cpu"
      }
    }
  }
}

This MCP server exposes a database-like interface over LanceDB, with automatic multilingual embeddings via BGE-M3. AI agents can create tables, perform CRUD operations, run semantic searches, inspect schemas and data, and safely migrate schemas, all through natural-language and programmatic tooling.

How to use

You interact with the MCP server through a client that calls its predefined tools to manage and query your LanceDB-backed data. Use the 10 database tools to create and modify tables, insert data (embeddings are generated automatically), run semantic searches, and safely migrate schemas. You can inspect dynamic resources for real-time state, and rely on the built-in AI prompts to design schemas, optimize queries, troubleshoot performance, and plan migrations.
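
As a sketch of what a client actually sends, the following builds a minimal JSON-RPC `tools/call` request for the `list_tables` tool. The envelope follows the standard MCP convention; an MCP client would write this to the server's stdin over the stdio transport and read the response back.

```shell
# Build a minimal JSON-RPC request an MCP client would send to invoke the
# list_tables tool, then pretty-print it to confirm it is valid JSON.
cat > /tmp/list_tables_request.json <<'EOF'
{"jsonrpc": "2.0", "id": 1, "method": "tools/call",
 "params": {"name": "list_tables", "arguments": {}}}
EOF
python3 -m json.tool /tmp/list_tables_request.json
```

In practice your MCP client handles this framing for you; the example only illustrates the wire format behind a tool call.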

How to install

Before installing, you need a runtime environment for the MCP server and an MCP client. The setup below uses the uv toolchain to install and run the server.

Step 1. Install the server as a uv tool (recommended), then verify the install:

uv tool install aas-lancedb-mcp
aas-lancedb-mcp --version

Step 2. Run the MCP server locally for development, passing the database URI you want to use (the command is shown in Step 5). The server runs in the current environment and respects the embedding model configuration.

Step 3. If you prefer to try it without installing, run it directly with uvx and check the usage text:

uvx aas-lancedb-mcp --help

Step 4. To install from source instead, clone the repository and install it as a uv tool:

git clone https://github.com/applied-ai-systems/aas-lancedb-mcp.git
cd aas-lancedb-mcp
uv tool install .

Step 5. Start the server, pointing it at the database URI you want to use:

aas-lancedb-mcp --db-uri ~/my_database

Configuration and runtime environment

Environment variables customize the server at runtime. The following settings point the server at your LanceDB location and select the embedding model and device:

export LANCEDB_URI="./my_database"
export EMBEDDING_MODEL="BAAI/bge-m3"
export EMBEDDING_DEVICE="cpu"

Available tools

create_table

Create a table from a schema of column definitions and optional metadata; columns marked searchable have embeddings generated automatically for their text content.
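
A sketch of what the tool's arguments might look like. The field names here (`columns`, `searchable`) are illustrative assumptions, not the server's documented schema; compare against `describe_table` output for your version.

```shell
# Hypothetical create_table arguments: a table name plus column definitions,
# with searchable text columns flagged for automatic embedding.
# Field names are assumptions for illustration only.
cat > /tmp/create_table_args.json <<'EOF'
{
  "name": "documents",
  "columns": [
    {"name": "title", "type": "string", "searchable": true},
    {"name": "body",  "type": "string", "searchable": true},
    {"name": "year",  "type": "int"}
  ]
}
EOF
python3 -m json.tool /tmp/create_table_args.json
```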

list_tables

Return an overview of all tables in the LanceDB-backed MCP data store.

describe_table

Provide detailed schema information for a specific table, including column types and constraints.

drop_table

Remove a table and all its data from the MCP store.

insert

Add new rows to a table; text fields are automatically embedded for semantic search.
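
A hedged sketch of insert arguments, assuming the table created above: a table name plus a list of rows, where the server would embed the searchable text fields. The argument names are assumptions, not documented API.

```shell
# Hypothetical insert arguments: one row for the assumed "documents" table.
# Text fields would be embedded server-side; field names are assumptions.
cat > /tmp/insert_args.json <<'EOF'
{
  "table": "documents",
  "rows": [
    {"title": "LanceDB overview",
     "body": "An embedded vector database for AI applications.",
     "year": 2024}
  ]
}
EOF
python3 -m json.tool /tmp/insert_args.json
```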

select

Query data with filters and sorting to retrieve matching rows.

update

Modify existing rows; updated text fields trigger new embeddings as needed.

delete

Delete rows that meet specified conditions.

search

Perform semantic text search across searchable fields using vector embeddings and similarity.
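
A sketch of search arguments under the same assumptions as the examples above: a free-text query that the server would embed with BGE-M3 and match against searchable fields by vector similarity. Argument names are illustrative, not documented.

```shell
# Hypothetical search arguments: natural-language query plus a result limit.
# The server embeds the query and ranks rows by similarity; names are assumptions.
cat > /tmp/search_args.json <<'EOF'
{"table": "documents", "query": "embedded vector databases", "limit": 5}
EOF
python3 -m json.tool /tmp/search_args.json
```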

migrate_table

Apply safe schema changes with validation and optional backups.