Open AI21 MCP Server
Provides access to the Open AI21 API via an MCP interface, with zero-config options and multiple runtime methods.
Configuration
```
{
  "mcpServers": {
    "bach-ai-tools-bachai-open-ai21": {
      "command": "python",
      "args": [
        "server.py"
      ],
      "env": {
        "API_KEY": "YOUR_API_KEY_HERE"
      }
    }
  }
}
```

This MCP server exposes access to the Open AI21 API. It uses a standard MCP transport and lets your applications send requests to the Open AI21 service through a simple, configurable interface.
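As a quick sanity check before pointing a client at the server, a configuration like the one above can be validated programmatically. This is a minimal sketch using only the standard library; the embedded config simply mirrors the snippet above and is not the only valid shape.

```python
import json

# Example mcpServers configuration (mirrors the snippet above).
CONFIG = """
{
  "mcpServers": {
    "bach-ai-tools-bachai-open-ai21": {
      "command": "python",
      "args": ["server.py"],
      "env": {"API_KEY": "YOUR_API_KEY_HERE"}
    }
  }
}
"""

def validate_mcp_config(text: str) -> list:
    """Return the names of server entries that have the required fields."""
    data = json.loads(text)
    valid = []
    for name, entry in data.get("mcpServers", {}).items():
        if "command" in entry and isinstance(entry.get("args"), list):
            valid.append(name)
    return valid

print(validate_mcp_config(CONFIG))  # ['bach-ai-tools-bachai-open-ai21']
```

A check like this is useful when configs are templated or generated, since a missing `command` or malformed `args` otherwise surfaces only as a client-side startup failure.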
You can use an MCP client to connect to this server by referencing its standard MCP endpoint configuration. The server runs locally or remotely as a stdio MCP endpoint, so you can start it from your development environment and point your MCP client to it. When you start the server, provide your API key so requests are authenticated.
In practical terms you will: set up an MCP client to load the bach-open_ai21 configuration, ensure your API key is available as an environment variable, and then issue requests through the client to the Open AI21 endpoints exposed by the server.
Prerequisites: you need Python with a working package environment, or you can run the server with the recommended MCP runner.
Install the package:

```
pip install bach-open_ai21
```

Or install an editable copy from a local checkout:

```
pip install -e .
```

Run the server in one of the following ways.
```
# Preferred: use uvx to run without a local install
uvx --from bach-open_ai21 bach_open_ai21

# Or specify a version explicitly
uvx --from bach-open_ai21@latest bach_open_ai21
```

Alternative development mode (not recommended for production): start the Python module directly.
```
python server.py
```

If you install the package and want to run the command directly after installation, use the command name that mirrors the package name, with an underscore.
```
bach_open_ai21
```

API authentication is required. Provide your API key through an environment variable so the server can authenticate requests.
The server reads the API_KEY environment variable and uses it to authenticate requests to the Open AI21 API.
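The pattern of reading the key from the environment can be sketched as follows; `get_api_key` is an illustrative helper, not a function shipped by this package.

```python
import os

def get_api_key() -> str:
    """Read the Open AI21 API key from the environment, failing loudly if absent."""
    key = os.environ.get("API_KEY")
    if not key:
        raise RuntimeError("API_KEY is not set; export it before starting the server.")
    return key

# Normally the key is set by your shell or by the MCP client's env block.
os.environ["API_KEY"] = "demo-key"
print(get_api_key())  # demo-key
```

Failing at startup with a clear message is preferable to letting an unauthenticated request reach the upstream API and surface a less obvious error.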
Common configuration snippets for clients (Cursor and Claude Desktop) are shown below to illustrate how to connect your MCP client to this server.
Cursor:

```
{
  "mcpServers": {
    "bach-open_ai21": {
      "command": "uvx",
      "args": ["--from", "bach-open_ai21", "bach_open_ai21"],
      "env": {
        "API_KEY": "your_api_key_here"
      }
    }
  }
}
```

Claude Desktop:

```
{
  "mcpServers": {
    "bach-open_ai21": {
      "command": "uvx",
      "args": ["--from", "bach-open_ai21", "bach_open_ai21"],
      "env": {
        "API_KEY": "your_api_key_here"
      }
    }
  }
}
```

Treat your API key as a secret. Do not commit it to version control. Use per-environment keys and rotate them as needed.
If you are running multiple MCP servers, centralize your API keys and credentials in your MCP platform’s secret store and reference them in each server’s environment configuration.
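The centralization idea above can be sketched as a small resolution step: server configs reference a secret by name, and a script expands those references into concrete env blocks before handing the config to the client. The store layout and the `secret` key are hypothetical, for illustration only.

```python
# Hypothetical central secret store, keyed by secret name.
SECRETS = {"open_ai21_api_key": "sk-example"}

# Server configs reference secrets by name instead of embedding values.
SERVERS = {
    "bach-open_ai21": {
        "command": "uvx",
        "args": ["--from", "bach-open_ai21", "bach_open_ai21"],
        "secret": "open_ai21_api_key",
    },
}

def resolve_env(servers: dict, secrets: dict) -> dict:
    """Expand each server's 'secret' reference into a concrete env block."""
    resolved = {}
    for name, cfg in servers.items():
        entry = {k: v for k, v in cfg.items() if k != "secret"}
        entry["env"] = {"API_KEY": secrets[cfg["secret"]]}
        resolved[name] = entry
    return resolved

print(resolve_env(SERVERS, SECRETS)["bach-open_ai21"]["env"])  # {'API_KEY': 'sk-example'}
```

Keeping references rather than values in the checked-in config means rotating a key touches only the secret store, not every server entry.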
Endpoints

- POST /conversationllama: talk to the Llama 3 model via the conversation endpoint.
- POST /getimgurl: obtain image URLs.
- POST /bgremover: remove image backgrounds.
- GET /: check service responsiveness.
- POST /claude3: interface with the Claude 3 model.
- POST /conversationgpt35: interact with the MATA G AI chatbot.
- POST /getbotdetails: retrieve bot details.
- POST /chatbotapi: chat with a selected bot.
- POST /texttoimage2: generate high-quality images from text.
- POST /chatgpt: access the ChatGPT API.
- POST /texttospeech: convert text to speech.
- POST /qa: answer questions based on provided context.
- POST /summary: summarize long documents.
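Calling one of the POST endpoints above can be sketched with the standard library. The base URL, the JSON payload shape, and the bearer-token header are all assumptions here; the source does not specify how the service expects authentication, so adapt this to your deployment. The sketch builds the request without sending it.

```python
import json
import urllib.request

# Hypothetical base URL for a locally running instance; adjust to your deployment.
BASE_URL = "http://localhost:8000"

def build_request(endpoint: str, payload: dict, api_key: str) -> urllib.request.Request:
    """Construct (but do not send) an authenticated POST request for an endpoint."""
    return urllib.request.Request(
        url=BASE_URL + endpoint,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

req = build_request("/conversationllama", {"message": "Hello"}, "demo-key")
print(req.full_url, req.get_method())  # http://localhost:8000/conversationllama POST
```

To actually send the request you would pass `req` to `urllib.request.urlopen`; keeping construction separate makes the payload and headers easy to inspect and test.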