
AI_SYNC MCP Server

Serves as the MCP bridge between OpenAI GPT-4 and a local Merchant API for querying and updating Tnc_Store data.

Installation
Add the following to your MCP client configuration file.

Configuration

```
{
  "mcpServers": {
    "baonpnexle-mcp_ai_sync": {
      "url": "http://localhost:4001",
      "headers": {
        "OPENAI_API_KEY": "sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
      }
    }
  }
}
```

You can use MCP with OpenAI to query and update data in a merchant store via a local backend API. This setup lets GPT-4 interact with the Tnc_Store class through defined endpoints, while running all components on your machine for a private, offline-ready workflow.

How to use

Start your local MCP workflow by preparing the backend API and the MCP client. First, ensure you have your OpenAI API key available in a secure place. Then run the local Merchant API server and launch the MCP tooling client so you can ask questions and perform actions against the merchant data.

- The backend API for MCP runs locally at http://localhost:4001. It exposes endpoints to find and manage stores under the Tnc_Store merchant class. You will interact with endpoints like /MerchantStore/findAllStores, /MerchantStore/findStore, and /MerchantStore/addNewStore through the MCP tooling layer.

- Launch the MCP tooling system to connect your client to the tool server. The command you use is: uv run client.py server.py. This starts the client that talks to OpenAI and the tool server that exposes MCP endpoints for the AI to route queries to the backend.
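As an illustration, the MerchantStore endpoints listed above can also be queried directly over HTTP. This is a hedged sketch: the endpoint paths come from this page, but the JSON response shape and the lack of authentication on the local API are assumptions.

```python
# Illustrative helpers for the local Merchant API. The endpoint names are
# documented above; the response format (JSON) is an assumption.
import json
import urllib.request

BASE_URL = "http://localhost:4001"

def merchant_url(endpoint: str) -> str:
    """Build the full URL for a MerchantStore endpoint."""
    return f"{BASE_URL}/MerchantStore/{endpoint}"

def find_all_stores():
    """Fetch every store under the Tnc_Store merchant class (assumes JSON)."""
    with urllib.request.urlopen(merchant_url("findAllStores")) as resp:
        return json.loads(resp.read())
```

In the normal workflow these calls go through the MCP tooling layer rather than being made by hand; the sketch only shows what the tooling is routing to.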

How to install

Prerequisites: You need Python installed and the uv tool available on your system.

1) Clone the project repository.

```
git clone https://github.com/your-org/ai_sync
cd ai_sync
```
2) Create a .env file in the project root containing your OpenAI API key.

```
OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
```
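For reference, here is a minimal sketch of how client code might read that key from the .env file using only the standard library. This loader is illustrative; the project may well use a library such as python-dotenv instead.

```python
# Minimal .env parser (illustrative; not necessarily what client.py does).
def load_env(path: str = ".env") -> dict:
    """Parse KEY=VALUE lines from a .env file, skipping blanks and comments."""
    env = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            env[key.strip()] = value.strip()
    return env
```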

3) Set up the Python environment with uv. If you don't have uv, install it with the following script.

```
curl -Ls https://astral.sh/uv/install.sh | sh
```

Then create and activate a virtual environment. When you run the project with uv, dependencies are resolved automatically from the uv.lock file.

```
uv venv .venv
source .venv/bin/activate
```

4) Start the Merchant API server locally so the MCP tooling can communicate with it.

```
# The API runs on http://localhost:4001
```
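Before launching the tooling, you may want to confirm that something is listening on that port. This is an illustrative helper, not part of the project; it simply probes the base URL and treats any HTTP response (even an error status) as "up".

```python
# Quick reachability check for the local Merchant API (illustrative helper).
import urllib.error
import urllib.request

def api_is_up(base_url: str = "http://localhost:4001", timeout: float = 1.0) -> bool:
    """Return True if anything answers HTTP at base_url, False otherwise."""
    try:
        urllib.request.urlopen(base_url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        # The server responded with an error status, so it is running.
        return True
    except (urllib.error.URLError, OSError):
        return False
```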

5) Run the tooling system to connect the client to the tool server.

```
uv run client.py server.py
```

Additional notes

- The local MCP setup is designed to work with the Tnc_Store merchant class. All interactions target data under this class.

- The OpenAI API key is required to enable GPT-4 access through the MCP tooling layer. Keep this key secure and do not expose it in public environments.

- All components run locally, enabling private data handling and offline-ready workflows when needed.

Available tools

findAllStores

Lists all stores associated with the Tnc_Store merchant class, enabling you to view the full store catalog.

findStore

Performs a natural language search to locate stores that match your query and criteria.

addNewStore

Adds a new store with a full description and associated tags to the Tnc_Store catalog.
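To make the addNewStore tool concrete, here is a hedged sketch of how a request body for it might be assembled. The field names (name, description, tags) are assumptions chosen to match the tool description above; check the Merchant API's actual schema before relying on them.

```python
# Hypothetical request body for /MerchantStore/addNewStore. The field
# names are assumptions for illustration, not a documented contract.
import json

def build_add_store_payload(name: str, description: str, tags: list) -> str:
    """Serialize a new-store record as a JSON request body."""
    return json.dumps({
        "name": name,
        "description": description,
        "tags": tags,
    })
```

In practice GPT-4 invokes this tool through the MCP layer, so the payload is constructed for you; the sketch only shows the kind of data the tool carries.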