Please visit the Contextual AI MCP Server README on GitHub for more information.
Overview
An MCP server acts as a bridge between AI interfaces (Cursor IDE or Claude Desktop) and a specialized Contextual AI agent. It enables:
- Query Processing: Direct your domain-specific questions to a dedicated Contextual AI agent
- Intelligent Retrieval: Searches through comprehensive information in your knowledge base
- Context-Aware Responses: Generates answers that are grounded in source documentation, include citations and attributions, and maintain conversation context
Integration Flow
This guide walks through integration with both the Cursor IDE and Claude Desktop.
Prerequisites
- Python 3.10 or higher
- Cursor IDE and/or Claude Desktop
- Contextual AI API key
- MCP-compatible environment
Installation
Clone the repository from GitHub (see the README linked above for the repository URL).
Configuration
Configure MCP Server
The server requires some modification of its settings before use. For example, the single_agent server should be customized with an appropriate docstring for your RAG agent. The docstring for your query tool is critical: it helps the MCP client understand when to route questions to your RAG agent, so make it specific to your knowledge domain. For example:
- "A research tool focused on financial data on the largest US firms"
- "A research tool focused on technical documents for Omaha semiconductors"
The server also requires the following settings from your RAG agent:
- API_KEY: Your Contextual AI API key
- AGENT_ID: Your Contextual AI agent ID
If you'd like to store these values in a .env file, you can specify them like so:
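A minimal sketch of such a .env file (the variable names match the settings above; the values are placeholders to replace with your own credentials):

```
# .env — placeholder values; replace with your Contextual AI credentials
API_KEY=your_contextual_ai_api_key
AGENT_ID=your_contextual_ai_agent_id
```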
Next, register the server with your MCP client.
For Cursor:
- Project-specific: .cursor/mcp.json in your project directory
- Global: ~/.cursor/mcp.json for system-wide access
For Claude Desktop:
- Use the same configuration file format in the appropriate Claude Desktop configuration directory
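For reference, a configuration entry could look roughly like the sketch below. The server name, directory path, and launch command are assumptions to adapt to your local checkout; the env block is one way to supply the API_KEY and AGENT_ID settings described above.

```json
{
  "mcpServers": {
    "ContextualAI-RAG": {
      "command": "uv",
      "args": ["--directory", "/path/to/contextual-mcp-server", "run", "server.py"],
      "env": {
        "API_KEY": "your_contextual_ai_api_key",
        "AGENT_ID": "your_contextual_ai_agent_id"
      }
    }
  }
}
```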
Environment Setup
This project uses uv for dependency management, which provides faster and more reliable Python package installation.
Usage
The server provides Contextual AI RAG capabilities using the Python SDK, making a variety of commands available to MCP clients such as Cursor IDE and Claude Desktop. The current server focuses on the query command from the Contextual AI Python SDK; however, you could extend it to support other features such as listing all agents, updating retrieval settings, updating prompts, extracting retrievals, or downloading metrics.
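As an illustration, a query tool along these lines could be exposed through the MCP Python SDK's FastMCP helper. This is a minimal sketch rather than the shipped server: it assumes the contextual-client package (ContextualAI client with agents.query.create) and the API_KEY and AGENT_ID settings described above, so verify the call shape against the SDK reference.

```python
import os

from mcp.server.fastmcp import FastMCP
from contextual import ContextualAI  # assumed import from the contextual-client package

mcp = FastMCP("ContextualAI-RAG")

# Credentials come from the environment (or a .env file loaded beforehand).
client = ContextualAI(api_key=os.environ["API_KEY"])
AGENT_ID = os.environ["AGENT_ID"]


@mcp.tool()
def query(prompt: str) -> str:
    """A research tool focused on technical documents for Omaha semiconductors."""
    # Route the question to the Contextual AI agent; the exact SDK call shape is an
    # assumption — check the Contextual AI Python SDK reference for the current API.
    response = client.agents.query.create(
        agent_id=AGENT_ID,
        messages=[{"role": "user", "content": prompt}],
    )
    return response.message.content


if __name__ == "__main__":
    # stdio is the only transport the server currently supports (see Limitations).
    mcp.run(transport="stdio")
```

Adjust the docstring to your own knowledge domain, since the docstring is what the MCP client uses to decide when to call the tool.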
Example Usage
In Cursor, you might ask:
“Show me the code for initiating the RF345 microchip?”
The MCP client will:
- Determine if this should be routed to the MCP Server
Then the MCP server will:
- Route the query to the Contextual AI agent
- Retrieve relevant documentation
- Generate a response with specific citations
- Return the formatted answer to Cursor
Key Benefits
- Accurate Responses: All answers are grounded in your documentation
- Source Attribution: Every response includes references to source documents
- Context Awareness: The system maintains conversation context for follow-up questions
- Real-time Updates: Responses reflect the latest documentation in your datastore
Development
Modifying the Server
To add new capabilities, define additional tools in the server that call other Contextual AI SDK commands, such as the extensions listed under Usage above; a minimal sketch follows.
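For example, a tool for listing agents might look roughly like this. It is a sketch under the assumption that the Contextual AI Python SDK exposes a client.agents.list() method returning agent objects with a name attribute; confirm the method and response fields against the SDK reference before relying on it.

```python
import os

from mcp.server.fastmcp import FastMCP
from contextual import ContextualAI

mcp = FastMCP("ContextualAI-RAG")
client = ContextualAI(api_key=os.environ["API_KEY"])


@mcp.tool()
def list_agents() -> str:
    """List the Contextual AI agents available under the configured API key."""
    # Assumed SDK call and fields; check the Contextual AI Python SDK docs.
    agents = client.agents.list()
    return "\n".join(agent.name for agent in agents)
```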
Limitations
- The server runs locally and may not work in remote development environments
- Tool responses are subject to Contextual AI API limits and quotas
- Currently only supports stdio transport mode