AINative ZeroDB MCP Server
Configuration
{
  "mcpServers": {
    "ainative-studio-ainative-zerodb-mcp-server": {
      "command": "npx",
      "args": [
        "ainative-zerodb-mcp-server"
      ],
      "env": {
        "ZERODB_API_URL": "https://api.ainative.studio",
        "ZERODB_PASSWORD": "YourPassword123!",
        "ZERODB_USERNAME": "[email protected]",
        "ZERODB_API_TOKEN": "<ZERODB_API_TOKEN>",
        "ZERODB_PROJECT_ID": "your-project-id",
        "MCP_CONTEXT_WINDOW": "8192",
        "MCP_RETENTION_DAYS": "30"
      }
    }
  }
}

You run an Enterprise MCP Server that exposes ZeroDB vector search, SQL and NoSQL operations, file storage, and agent memory features to your applications. With a single MCP endpoint, you connect clients to perform memory, vector, table, file, event, and admin operations in a scalable, secure way, all under a unified workflow.
You interact with the MCP server through an MCP client by configuring a local or remote MCP connection that points to the server. The server provides tools to store and search agent memory, upload and manage files, execute vector operations, query NoSQL tables, run PostgreSQL queries, and perform enterprise tasks like project management and RLHF data collection. Use the client to invoke tools such as zerodb_store_memory, zerodb_search_vectors, zerodb_store_vector, zerodb_create_table, zerodb_postgres_query, zerodb_rlhf_start, zerodb_admin_health, and many more. Your applications authenticate with a JWT-based flow, refresh tokens automatically, and you can renew tokens manually if needed.
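As a sketch of what a tool invocation looks like on the wire, the snippet below builds the JSON-RPC 2.0 `tools/call` envelope an MCP client would send for `zerodb_store_memory`. The argument names (`content`, `role`, `session_id`) are illustrative assumptions, not the server's documented schema; use `tools/list` to discover the real parameter names.

```typescript
// Sketch: the JSON-RPC 2.0 envelope an MCP client sends to invoke a tool.
// The argument names below are assumptions for illustration only; consult
// the server's tool schema (via tools/list) for the actual contract.
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

function buildToolCall(
  name: string,
  args: Record<string, unknown>,
  id = 1
): ToolCallRequest {
  return { jsonrpc: "2.0", id, method: "tools/call", params: { name, arguments: args } };
}

const req = buildToolCall("zerodb_store_memory", {
  content: "User prefers concise answers", // hypothetical argument name
  role: "assistant",                        // hypothetical argument name
  session_id: "session-123",                // hypothetical argument name
});

console.log(JSON.stringify(req, null, 2));
```

The same envelope shape applies to every tool in the list below; only `params.name` and `params.arguments` change per call.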
# Prerequisites
# - Node.js >= 18.0.0
# - npm >= 9.0.0
# Global installation (preferred for quick start)
npm install -g ainative-zerodb-mcp-server
# Or run directly without installation
npx ainative-zerodb-mcp-server

Configure the MCP connection to point to your ZeroDB project and credentials. The server expects environment variables for API access and project context. In your client configuration, provide the project ID, user credentials, and API URL you use to access ZeroDB.
{
  "mcpServers": {
    "zerodb": {
      "command": "npx",
      "args": ["ainative-zerodb-mcp-server"],
      "env": {
        "ZERODB_API_URL": "https://api.ainative.studio",
        "ZERODB_PROJECT_ID": "your-project-id-here",
        "ZERODB_USERNAME": "[email protected]",
        "ZERODB_PASSWORD": "YourPassword123!",
        "MCP_CONTEXT_WINDOW": "8192",
        "MCP_RETENTION_DAYS": "30"
      }
    }
  }
}

The MCP server uses JWT authentication with automatic token renewal. Tokens are renewed on a schedule and can be renewed manually if needed. Protect credentials by keeping environment variables secure and using project-scoped access where possible. Rotate passwords periodically and monitor usage via project statistics.
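To build intuition for scheduled renewal, the sketch below reads a JWT's `exp` claim and computes how long to wait before refreshing, with a safety margin. This is an illustration of the idea, not the server's internal implementation; the renewal margin is an arbitrary assumption.

```typescript
// Sketch: decide when to renew a JWT by decoding its `exp` claim.
// The server handles renewal internally; this only illustrates the timing logic.
function jwtExpiryMs(token: string): number {
  const payload = token.split(".")[1];
  const claims = JSON.parse(Buffer.from(payload, "base64url").toString("utf8"));
  return claims.exp * 1000; // `exp` is seconds since the epoch
}

// Renew `marginMs` before expiry, never returning a negative delay.
function msUntilRenewal(token: string, nowMs: number, marginMs = 60_000): number {
  return Math.max(0, jwtExpiryMs(token) - marginMs - nowMs);
}

// Build a throwaway unsigned token expiring 10 minutes from "now" to exercise the logic.
const now = 1_700_000_000_000;
const payload = Buffer.from(JSON.stringify({ exp: (now + 600_000) / 1000 })).toString("base64url");
const token = `header.${payload}.signature`;
console.log(msUntilRenewal(token, now)); // 540000: renew 9 minutes from now
```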
Here are practical usage scenarios you can adapt in your client code. Store a memory entry for persistent context, search memories semantically, manage vectors for a knowledge base, and perform SQL-backed queries against PostgreSQL.
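To see what semantic memory search does conceptually, here is a minimal cosine-similarity ranking over toy embeddings. The server performs this at scale over your stored 1536-dimensional vectors; the 3-dimensional vectors and memory IDs below are stand-ins for illustration only.

```typescript
// Sketch: rank stored embeddings by cosine similarity to a query embedding.
// Toy 3-dimensional vectors stand in for ZeroDB's 1536-dimensional ones.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

const memories = [
  { id: "m1", embedding: [1, 0, 0] },
  { id: "m2", embedding: [0.9, 0.1, 0] },
  { id: "m3", embedding: [0, 0, 1] },
];

const query = [1, 0.05, 0];
const ranked = memories
  .map((m) => ({ id: m.id, score: cosine(query, m.embedding) }))
  .sort((x, y) => y.score - x.score);

console.log(ranked[0].id); // m1 ranks highest
```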
Common issues include authentication failures, misconfigured project IDs, vector dimension mismatches, expired tokens, and rate limits. Ensure credentials are correct, the project ID is set, vectors use the required 1536 dimensions, tokens renew automatically, and batch requests where possible to avoid rate limits.
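Two of the failure modes above, dimension mismatches and rate limits, can be guarded against client-side before any request is sent. The helper below is a hedged sketch: it rejects vectors that are not 1536-dimensional and chunks a large upsert into batches; the batch size of 100 is an arbitrary assumption, not a documented server limit.

```typescript
// Sketch: client-side guards against the common failure modes noted above.
const EXPECTED_DIM = 1536; // ZeroDB requires 1536-dimensional embeddings

function validateVector(v: number[]): void {
  if (v.length !== EXPECTED_DIM) {
    throw new Error(`expected ${EXPECTED_DIM} dimensions, got ${v.length}`);
  }
}

// Split a large list of vectors into batches, e.g. to feed a batch-upsert
// tool instead of issuing one request per vector.
function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) out.push(items.slice(i, i + size));
  return out;
}

const vectors = Array.from({ length: 250 }, () => new Array(EXPECTED_DIM).fill(0.1));
vectors.forEach(validateVector);
console.log(chunk(vectors, 100).length); // 3 batches: 100 + 100 + 50
```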
This MCP server ships with a comprehensive test suite and security features. Leverage the admin tools to monitor health, review usage, and optimize performance. When upgrading, verify that your environment variables and project contexts remain consistent to avoid interruptions.
When upgrading from earlier versions, you typically provide the same project context and credentials, plus any new options introduced by the release. Validate token renewal behavior and confirm that the tools you rely on remain backward compatible.

Available tools
Memory
Store agent memory in ZeroDB for persistent context across sessions.
Search agent memory using semantic similarity.
Get agent context window for the current session, optimized for token limits.

Vectors
Store a 1536-dimensional vector embedding with metadata.
Batch upsert multiple vectors for improved performance.
Search vectors by semantic similarity.
Delete a specific vector by its ID.
Retrieve a specific vector by its ID.
List vectors with pagination and filtering.
Get statistics about vector storage.
Create a vector search index for improved performance.
Optimize vector storage and indexes.
Export vectors to an external format.

Quantum vector operations
Compress a vector using quantum algorithms.
Decompress quantum-compressed vector data.
Perform hybrid search using quantum-enhanced similarity.
Optimize vector space with quantum techniques.
Generate a quantum feature map for a vector.
Compute quantum kernel similarity between vectors.

NoSQL tables
Create a new NoSQL table with schema.
List all NoSQL tables in the project.
Get details of a NoSQL table.
Delete a NoSQL table and its data.
Insert rows into a NoSQL table.
Query rows in a NoSQL table with filters.
Update rows in a NoSQL table.
Delete rows from a NoSQL table.

File storage
Upload a file to ZeroDB storage.
Download a file from ZeroDB storage.
List files stored in ZeroDB.
Delete a file from storage.
Get file metadata without downloading.
Generate a temporary presigned download URL for a file.

Events
Create a new event in the event system.
List events with filtering and pagination.
Get a specific event by ID.
Subscribe to an event stream over WebSocket.
Get event statistics and analytics.

Projects
Create a new ZeroDB project.
Get details of a project.
List all accessible projects.
Update project settings.
Delete a project and its data.
Get project usage statistics.
Enable database features for a project.

RLHF
Collect user interaction data for RLHF training.
Collect agent performance feedback.
Collect RLHF workflow feedback.
Collect error reports for model improvement.
Get RLHF collection status.
Get RLHF analytics summary.
Start RLHF data collection for a session.
Stop RLHF data collection for a session.
Get all RLHF interactions for a session.
Broadcast RLHF events to subscribers.

Admin
Get system-wide statistics (admin).
List all projects in the system (admin).
Get user usage statistics (admin).
Get system health status (admin).
Optimize database and storage (admin).

PostgreSQL
Execute SQL queries on the provisioned PostgreSQL instance.
Get detailed PostgreSQL schema information.
Create a new PostgreSQL table with complete schema.
Trigger a PostgreSQL backup job.
Restore PostgreSQL from a backup.
Get PostgreSQL database statistics.
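As an illustration of the SQL query tool, the snippet below assembles the kind of arguments a client might pass when executing a query. The argument names (`query`, `params`) and the `events` table are assumptions for illustration, not the tool's documented schema; parameterized placeholders are used so user input never lands in the SQL text.

```typescript
// Sketch: arguments a client might send to the SQL query tool.
// `query` and `params` are assumed argument names, and the `events` table
// is hypothetical; check the tool's schema via tools/list for the real contract.
interface SqlToolArgs {
  query: string;
  params: Array<string | number>;
}

function selectRecentEvents(agentId: string, limit: number): SqlToolArgs {
  return {
    // $1/$2 placeholders keep user input out of the SQL text itself.
    query:
      "SELECT id, created_at FROM events WHERE agent_id = $1 " +
      "ORDER BY created_at DESC LIMIT $2",
    params: [agentId, limit],
  };
}

const args = selectRecentEvents("agent-42", 10);
console.log(args.params.length); // 2 bound parameters
```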