Intelligent AI model search and discovery with zero-install simplicity
Configuration
```json
{
  "mcpServers": {
    "adawalli-nexus": {
      "command": "bunx",
      "args": [
        "nexus-mcp"
      ],
      "env": {
        "OPENROUTER_API_KEY": "your-api-key-here"
      }
    }
  }
}
```

Nexus MCP Server provides AI-powered search capabilities through the MCP protocol, enabling you to discover and query AI model knowledge with zero-install deployment and robust, production-ready features.
You use Nexus MCP Server by connecting an MCP-compatible client to its stdio interface. Once connected, you can run queries against multiple AI model families, including real-time web search and training-data knowledge sources: choose a model that matches your needs, adjust parameters such as temperature and max tokens, and receive structured results with citations.
Before starting, you need a modern JavaScript runtime and a way to launch MCP servers. You can run Nexus MCP Server with Bun (via BunX) or with Node.js (via NPX). Follow these steps to get up and running quickly.
```shell
# Set your OpenRouter API key
export OPENROUTER_API_KEY=your-api-key-here

# Quick start using BunX (recommended)
bunx nexus-mcp

# Alternatively, run with NPX
npx nexus-mcp
```

If you prefer local development or customization, you can build from source and run a local server. This lets you modify the codebase and tailor the MCP server to your needs.
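Because a missing key is the most common startup failure, it can help to verify the environment before launching. A minimal sketch of such a check (a hypothetical helper, not part of Nexus itself):

```python
import os
import sys


def require_api_key(name="OPENROUTER_API_KEY"):
    """Return the API key from the environment, or exit with a hint."""
    key = os.environ.get(name, "").strip()
    if not key:
        sys.exit(f"{name} is not set; export it before starting the server")
    return key
```

Running this before invoking `bunx nexus-mcp` surfaces a clear error message instead of an opaque authentication failure from the server.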
```shell
# Clone the repository
git clone https://github.com/adawalli/nexus.git
cd nexus

# Install dependencies
bun install

# Build the server (if a build step exists in your version)
bun run build

# Copy the example environment file and set your API key
cp .env.example .env
# Edit .env to set OPENROUTER_API_KEY

# Start the server
bun run start
```

Configure the server with your API key and optional timeout or retry settings. Ensure your OpenRouter API key is valid and kept secure in your environment. The server supports configurable timeouts, retries, and a dedicated OpenRouter base URL if needed.
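The retry behavior mentioned above can be sketched as a generic exponential-backoff wrapper; the attempt count and delays here are illustrative, not Nexus defaults:

```python
import time


def with_retries(call, attempts=3, base_delay=0.5):
    """Invoke call(), retrying on exception with exponential backoff."""
    last_error = None
    for attempt in range(attempts):
        try:
            return call()
        except Exception as exc:  # treat any failure as transient
            last_error = exc
            if attempt < attempts - 1:
                # 0.5s, 1s, 2s, ... between successive attempts
                time.sleep(base_delay * (2 ** attempt))
    raise last_error
```

Wrapping an outbound OpenRouter request in a helper like this smooths over transient network hiccups without changing the caller's interface.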
If you encounter connection or authentication issues, confirm your API key is set in the environment and that the MCP client is pointing to the correct command and arguments. Check startup logs for any initialization errors and verify network access to OpenRouter.
Developers can run the server with hot reload, execute tests, and lint/format code to maintain quality during local development.
Use the search tool to query AI models. You can choose among multiple models, from fast web-enabled search to training-data knowledge sources. Each model has its own timeout and capabilities, and you can customize max tokens and temperature for responses.
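As a sketch of assembling arguments for such a tool call (the model ID is a placeholder, and the clamping bounds are illustrative assumptions, not documented limits):

```python
def build_search_arguments(query, model="perplexity/sonar",
                           max_tokens=1024, temperature=0.7):
    """Assemble an arguments dict for a search tool call.

    Rejects empty queries and clamps temperature to [0, 2].
    """
    if not query.strip():
        raise ValueError("query must be non-empty")
    temperature = max(0.0, min(2.0, temperature))
    return {
        "query": query,
        "model": model,          # placeholder model ID
        "maxTokens": max_tokens,
        "temperature": temperature,
    }
```

An MCP client would pass a dict like this as the tool-call arguments; validating locally gives faster feedback than a round trip to the server.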
Key environment variables you will use include OPENROUTER_API_KEY for authentication, plus optional variables to tune timeouts and retries.
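A loader along these lines could gather the key plus tuning knobs from the environment; note that only OPENROUTER_API_KEY is documented, so the timeout and retry variable names below are hypothetical placeholders:

```python
import os


def load_config(env=os.environ):
    """Read server settings from environment variables with defaults.

    SEARCH_TIMEOUT_MS and SEARCH_MAX_RETRIES are illustrative names,
    not documented Nexus variables.
    """
    return {
        "api_key": env.get("OPENROUTER_API_KEY", ""),
        "timeout_ms": int(env.get("SEARCH_TIMEOUT_MS", "30000")),
        "max_retries": int(env.get("SEARCH_MAX_RETRIES", "3")),
    }
```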
The MCP interface exposes a primary search tool and a configuration status endpoint that helps you verify server health and current settings.
The search tool is the main entry point: it queries AI models and returns results with citations. Configure model, maxTokens, temperature, and timeout to tailor responses.
The configuration status resource reports server health, configuration details (with the API key masked), available tools, uptime, and version.
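Masking a key for a status readout might look like the following sketch; the exact masking format Nexus uses is not specified here:

```python
def mask_api_key(key, visible=4):
    """Mask all but the last few characters of an API key."""
    if len(key) <= visible:
        return "*" * len(key)
    return "*" * (len(key) - visible) + key[-visible:]
```

Keeping a short visible suffix lets you confirm which key is loaded without ever logging the full secret.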