Cocktails MCP Server
Delivers cocktail recommendations using a RAG pipeline with local FAISS indexing and Groq-powered LLMs.
You can run a dedicated MCP server that uses Retrieval-Augmented Generation to recommend cocktails. This server exposes an MCP endpoint you can connect to with your MCP client, handles data through a local vector store, and uses Groq's API for language modeling. It's designed to help you get quick, accurate cocktail suggestions built from a local dataset with efficient retrieval and ranking.
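The retrieve-then-rerank-then-generate flow described above can be illustrated with a minimal, self-contained sketch. Everything here is illustrative: simple token-overlap scoring stands in for the server's actual FAISS similarity search, reranker, and Groq-hosted LLM, and the recipe data is invented.

```python
# Toy sketch of a retrieve -> rerank -> generate RAG flow.
# The real server uses FAISS embeddings, a reranker, and a Groq LLM;
# here simple token overlap stands in for all three stages.

RECIPES = {
    "Margarita": "tequila lime juice triple sec salt",
    "Mojito": "white rum lime juice mint sugar soda",
    "Negroni": "gin campari sweet vermouth orange",
}

def score(query: str, doc: str) -> float:
    q, d = set(query.lower().split()), set(doc.lower().split())
    return len(q & d) / (len(q) or 1)

def retrieve(query: str, k: int = 2) -> list[str]:
    # Stage 1: cheap similarity search over the whole corpus (FAISS's role).
    ranked = sorted(RECIPES, key=lambda name: score(query, RECIPES[name]), reverse=True)
    return ranked[:k]

def rerank(query: str, candidates: list[str]) -> list[str]:
    # Stage 2: a second-pass scorer reorders the shortlist (trivial here).
    return sorted(candidates, key=lambda name: score(query, RECIPES[name]), reverse=True)

def recommend(query: str) -> str:
    # Stage 3: the LLM would turn the top hit into a natural-language answer.
    best = rerank(query, retrieve(query))[0]
    return f"Try a {best}: {RECIPES[best]}"

print(recommend("something with rum and mint"))  # -> Try a Mojito: ...
```

The two-stage design (broad retrieval, then a more careful rerank of a short candidate list) is what keeps the pipeline both fast and accurate.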
To use the Cocktails RAG MCP Server, connect your MCP client to the local server instance. Run the server in stdio mode to start a local process that listens for requests from your client and returns cocktail recommendations powered by the RAG pipeline.
Prerequisites you need on your machine before installation: a recent Python installation, the uv package manager, and a Groq API key.
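A quick way to sanity-check the basics (Python and the uv package manager) before starting; these commands assume a POSIX shell, and `python3` may need adjusting to your interpreter name:

```shell
# Confirm a Python interpreter is available
python3 --version

# Confirm uv is installed; print a hint if it is not
command -v uv >/dev/null 2>&1 && echo "uv found" || echo "uv missing - install it first"
```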
Step-by-step commands to set up and run the server locally:
# Optional: clone the project repository
# git clone https://github.com/00200200/cocktails-rag-mcp.git
# cd cocktails-rag-mcp
# Copy environment template and edit GROQ key
cp .env.example .env
# Edit .env to add GROQ_API_KEY
nano .env
# Install dependencies via UV
uv sync
# Pre-download models (embeddings and reranker)
uv run python -c "from src.rag.rag import RAG; RAG(); print('Models downloaded!')"
# Start the MCP server locally in stdio mode
uv run python src/mcp/server.py
If you want to run the server in a pre-configured environment for Claude Desktop or a similar setup, you can use the provided MCP JSON configuration to register the server with your client. The following configuration demonstrates a stdio MCP setup using uv and FastMCP, including the required environment variable for Groq.
{
"mcpServers": {
"cocktails": {
"command": "uv",
"args": [
"run",
"--with","faiss-cpu",
"--with","fastmcp",
"--with","jq",
"--with","langchain",
"--with","langchain-community",
"--with","langchain-groq",
"--with","langchain-huggingface",
"--with","pandas",
"--with","python-dotenv",
"--with","sentence-transformers",
"fastmcp",
"run",
"/ABSOLUTE/PATH/TO/src/mcp/server.py:mcp"
],
"env": {
"GROQ_API_KEY": "your_groq_api_key_here"
}
}
}
}
The server is organized around three core components:
- Orchestrates retrieval from a FAISS vector store, reranking of candidate results, and generation of natural-language cocktail recommendations.
- Loads local embeddings for cocktail data to enable semantic search within the RAG workflow.
- Manages the local FAISS index for fast vector similarity search against cocktail data.
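Conceptually, the index component does what a flat FAISS index does: store vectors and return the nearest ones to a query. A minimal pure-Python stand-in (illustrative only, not the server's actual code; the class and vectors are invented for the example) looks like this:

```python
import math

class FlatIndex:
    """Tiny stand-in for a flat vector index: stores vectors, ranks by cosine similarity."""

    def __init__(self) -> None:
        self.vectors: list[tuple[str, list[float]]] = []

    def add(self, key: str, vec: list[float]) -> None:
        # Store a labeled vector (in FAISS, labels live in a side lookup table).
        self.vectors.append((key, vec))

    @staticmethod
    def _cosine(a: list[float], b: list[float]) -> float:
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb) if na and nb else 0.0

    def search(self, query: list[float], k: int = 1) -> list[str]:
        # Exhaustive scan: exact nearest neighbors, linear in corpus size.
        ranked = sorted(self.vectors, key=lambda kv: self._cosine(query, kv[1]), reverse=True)
        return [key for key, _ in ranked[:k]]

index = FlatIndex()
index.add("Daiquiri", [1.0, 0.0, 0.2])
index.add("Old Fashioned", [0.0, 1.0, 0.1])
print(index.search([0.9, 0.1, 0.2]))  # -> ['Daiquiri']
```

In the real server, a FAISS index and sentence-transformers embeddings presumably play these roles, trading this exhaustive scan for optimized similarity search over the cocktail dataset.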