AAS LanceDB MCP Server
Provides a database-like interface over LanceDB with automatic BGE-M3 embeddings for semantic search
Configuration
{
"mcpServers": {
"applied-ai-systems-aas-lancedb-mcp": {
"command": "aas-lancedb-mcp",
"args": [
"--db-uri",
"~/my_database"
],
"env": {
"LANCEDB_URI": "./my_database",
"EMBEDDING_MODEL": "BAAI/bge-m3",
"EMBEDDING_DEVICE": "cpu"
}
}
}
}

You deploy a dedicated MCP server that exposes a database-like interface over LanceDB, with automatic multilingual embeddings via BGE-M3. This lets AI agents create tables, perform CRUD operations, run semantic searches, inspect schemas and data, and safely migrate schemas, all through natural-language and programmatic tooling.
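The client configuration above is plain JSON, so it can be loaded and inspected programmatically. A minimal sketch using only the standard library, with the exact values from the example:

```python
import json

# Parse the MCP client configuration shown above and pull out the fields a
# client needs to launch the server: the command, its args, and the
# environment overrides.
config_text = """
{
  "mcpServers": {
    "applied-ai-systems-aas-lancedb-mcp": {
      "command": "aas-lancedb-mcp",
      "args": ["--db-uri", "~/my_database"],
      "env": {
        "LANCEDB_URI": "./my_database",
        "EMBEDDING_MODEL": "BAAI/bge-m3",
        "EMBEDDING_DEVICE": "cpu"
      }
    }
  }
}
"""

server = json.loads(config_text)["mcpServers"]["applied-ai-systems-aas-lancedb-mcp"]
print(server["command"])                 # aas-lancedb-mcp
print(server["env"]["EMBEDDING_MODEL"])  # BAAI/bge-m3
```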
You interact with the MCP server through a client that issues predefined tools to manage and query your LanceDB-powered data. Use the 10 database tools to create and modify tables, insert data (embeddings are generated automatically), perform semantic searches, and safely migrate schemas. You can inspect dynamic resources for real-time state, and you can rely on AI prompts to design schemas, optimize queries, troubleshoot performance, and plan migrations.
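Under the MCP specification, a client invokes a server tool by sending a JSON-RPC 2.0 request with the `tools/call` method. The tool name `create_table` and its argument shape below are illustrative assumptions, not the server's documented schema; query the server's tool listing for the real names:

```python
import json

# Sketch of the JSON-RPC 2.0 message an MCP client sends to invoke a tool.
# "create_table" and its arguments are hypothetical -- check the server's
# tool listing for the actual tool names and input schemas.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "create_table",
        "arguments": {
            "table": "documents",
            "columns": [
                {"name": "title", "type": "text", "searchable": True},
                {"name": "body", "type": "text", "searchable": True},
            ],
        },
    },
}

# Serialize for the transport (stdio or HTTP, depending on the client).
wire = json.dumps(request)
print(wire)
```

In practice an MCP client library builds and sends these messages for you; the sketch only shows the shape of what crosses the wire.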
Before installing, you need a runtime environment for the MCP server and the MCP client tooling. The setup uses the uv toolchain to install and run the server.
Step 1. Install the MCP client tooling (recommended):
uv tool install aas-lancedb-mcp
aas-lancedb-mcp --version

Step 2. Run the MCP server locally (for development) and pass the database URI you want to use. The server runs in the current environment and respects embedding model configuration.
Step 3. If you prefer to start without installation, you can run directly with the help option to verify usage:
uvx aas-lancedb-mcp --help

Step 4. If you are installing from source, clone the repository and install dependencies, then start the MCP server using the provided runtime command:
git clone https://github.com/applied-ai-systems/aas-lancedb-mcp.git
cd aas-lancedb-mcp
uv tool install .

Step 5. Start the server with a database URI and embedding model setting (the example shows a common runtime command):
aas-lancedb-mcp --db-uri ~/my_database

Environment variables and runtime options help customize the server to your needs. The following configuration is commonly used to point to your LanceDB location and select the embedding model.
export LANCEDB_URI="./my_database"
export EMBEDDING_MODEL="BAAI/bge-m3"
export EMBEDDING_DEVICE="cpu"

The server's 10 database tools cover the following operations. Create tables with a schema that includes column definitions and optional metadata, marking searchable fields so that embeddings are generated automatically for text columns.
Return an overview of all tables in the LanceDB-backed MCP data store.
Provide detailed schema information for a specific table, including column types and constraints.
Remove a table and all its data from the MCP store.
Add new rows to a table; text fields are automatically embedded for semantic search.
Query data with filters and sorting to retrieve matching rows.
Modify existing rows; updated text fields trigger new embeddings as needed.
Delete rows that meet specified conditions.
Perform semantic text search across searchable fields using vector embeddings and similarity.
Apply safe schema changes with validation and optional backups.
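The environment variables shown earlier can be resolved with fallbacks matching the documented values. This launcher helper is a hypothetical sketch, not part of aas-lancedb-mcp; only the variable names and defaults come from the configuration above:

```python
import os

# Hypothetical helper that resolves the server's configuration from the
# environment, falling back to the values shown in the documented env block.
def resolve_config(environ=None):
    environ = os.environ if environ is None else environ
    return {
        "db_uri": environ.get("LANCEDB_URI", "./my_database"),
        "embedding_model": environ.get("EMBEDDING_MODEL", "BAAI/bge-m3"),
        "embedding_device": environ.get("EMBEDDING_DEVICE", "cpu"),
    }

config = resolve_config({})              # no overrides: documented defaults
print(config["embedding_model"])         # BAAI/bge-m3

override = resolve_config({"EMBEDDING_DEVICE": "cuda"})
print(override["embedding_device"])      # cuda
```

Setting `EMBEDDING_DEVICE` to a GPU device is worthwhile when embedding large batches; `cpu` is the safe default shown in the configuration example.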