Intercom MCP Server
Queries Intercom data: conversations and tickets with server-side filtering for AI assistants.
Configuration
{
  "mcpServers": {
    "raoulbia-ai-mcp-server-for-intercom": {
      "command": "intercom-mcp",
      "args": [],
      "env": {
        "INTERCOM_ACCESS_TOKEN": "YOUR_TOKEN_HERE"
      }
    }
  }
}
This MCP server lets AI assistants query and analyze Intercom data, including conversations and tickets, with server-side filtering. It provides scripted access to customer support data for automated workflows and intelligent assistants.
Use the Intercom MCP server with your MCP client to run structured queries against your Intercom data. You can search conversations and tickets; filter by date range, customer identifier, status, and keywords; and search conversation content by email even when no matching contact exists.
From your MCP client, choose one of the supported endpoints to fetch data. For example, you can pull all conversations within a date window with content filters, look up conversations by a specific customer, or retrieve tickets by status or by customer. Each operation returns relevant Intercom data that your assistant can analyze, summarize, or act upon in your workflow.
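Under the hood, an MCP client invokes these operations by sending JSON-RPC 2.0 `tools/call` requests to the server over stdio. The sketch below builds such a request envelope; the tool name and argument keys shown are hypothetical, so query the server's `tools/list` method for the exact schema it exposes.

```python
import json


def make_tool_call(tool_name, arguments, request_id=1):
    """Build a JSON-RPC 2.0 'tools/call' request as used by the MCP protocol."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }


# Hypothetical tool and argument names -- check the server's actual
# tool listing (via 'tools/list') before relying on these.
request = make_tool_call(
    "search_conversations_by_customer",
    {"customerIdentifier": "user@example.com", "keywords": ["refund"]},
)
print(json.dumps(request, indent=2))
```

The same envelope shape works for any of the operations listed later on this page; only the `name` and `arguments` change.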
Before installing, make sure you meet the following prerequisites:
Node.js 18.0.0 or higher is required to run the MCP server.
An Intercom account with API access and your Intercom API token (found in your Intercom account settings).
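Because the server reads the token from the `INTERCOM_ACCESS_TOKEN` environment variable, a pre-flight check in your own tooling can fail fast with a clear message instead of letting the server start misconfigured. A minimal sketch:

```python
import os
import sys


def require_intercom_token():
    """Fail fast if the Intercom API token is not configured."""
    token = os.environ.get("INTERCOM_ACCESS_TOKEN")
    if not token:
        sys.exit("INTERCOM_ACCESS_TOKEN is not set; export it before starting the server.")
    return token
```

Run this (or an equivalent shell check) before launching `intercom-mcp` in scripts and CI.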
# Install the package globally
npm install -g mcp-server-for-intercom
# Set your Intercom API token
export INTERCOM_ACCESS_TOKEN="your_token_here"
# Run the server
intercom-mcp
The Docker setup is optimized for Glama compatibility and can be run with your Intercom API token.
# Build the image
docker build -t mcp-intercom .
# Run the container with your API token and port mappings
docker run --rm -it -p 3000:3000 -p 8080:8080 -e INTERCOM_ACCESS_TOKEN="your_token_here" mcp-intercom:latest
If you prefer a lighter version without Glama-specific dependencies, use the standard Docker image.
# Build the standard image
docker build -t mcp-intercom-standard -f Dockerfile.standard .
# Run the standard container
docker run --rm -it -p 3000:3000 -p 8080:8080 -e INTERCOM_ACCESS_TOKEN="your_token_here" mcp-intercom-standard:latest
The default setup uses containerized deployment with a token-based authentication flow. You can validate the server status and MCP endpoint as part of your integration checks.
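A simple HTTP probe against the container's published ports can serve as that integration check. The sketch below assumes the status endpoint lives at the base URL on port 3000 or 8080; substitute whatever path your deployment actually exposes.

```python
import urllib.error
import urllib.request


def check_health(base_url, timeout=5):
    """Return True if the server answers with HTTP 200 at base_url.

    The URL to probe is an assumption -- point this at whichever status
    endpoint your deployment exposes on port 3000 or 8080.
    """
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```

For example, `check_health("http://localhost:3000")` after `docker run` tells your CI whether the container came up.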
{
  "mcpServers": {
    "intercom-mcp": {
      "command": "intercom-mcp",
      "args": [],
      "env": {
        "INTERCOM_ACCESS_TOKEN": "your_intercom_api_token"
      }
    }
  }
}
If you are developing against this MCP server, you can run build and development commands to test locally, and run tests to ensure functionality after changes.
This MCP server is designed to be used by MCP-compatible AI assistants to access and analyze Intercom data. Ensure you protect your Intercom API token and follow security best practices when deploying to production.
The server exposes the following operations:
Retrieves all conversations within a date range with content filtering, using server-side Intercom search filtering to optimize performance.
Finds conversations for a specific customer by their identifier (email or Intercom ID), with optional date range and keywords, enabling content-based search even when no contact exists.
Retrieves tickets filtered by status such as open, pending, or resolved, with optional date range.
Finds tickets associated with a specific customer, with optional date range.
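The server-side filtering described above maps onto Intercom's Conversations Search API, which accepts nested query clauses combined with logical operators. The sketch below builds such a request body; `created_at` takes Unix timestamps, while the `source.body` field used for keyword matching is an assumption here, so consult Intercom's search documentation for the fields your data supports.

```python
from datetime import datetime, timezone


def build_search_query(start, end, keyword=None):
    """Build a query body in the shape Intercom's conversation search expects.

    'created_at' is filtered as a Unix timestamp; the 'source.body' keyword
    clause is an assumption -- verify the searchable fields against
    Intercom's API docs before using this in production.
    """
    clauses = [
        {"field": "created_at", "operator": ">", "value": int(start.timestamp())},
        {"field": "created_at", "operator": "<", "value": int(end.timestamp())},
    ]
    if keyword:
        clauses.append({"field": "source.body", "operator": "~", "value": keyword})
    return {"query": {"operator": "AND", "value": clauses}}


query = build_search_query(
    datetime(2024, 1, 1, tzinfo=timezone.utc),
    datetime(2024, 1, 31, tzinfo=timezone.utc),
    keyword="refund",
)
```

Pushing the date-range and keyword clauses into the search request like this is what lets the server filter on Intercom's side instead of paging everything back and filtering locally.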