# any-chat-completions-mcp MCP Server
Integrate Claude with any OpenAI SDK compatible Chat Completions API - OpenAI, Perplexity, Groq, xAI, PyroPrompts and more.

This implements a Model Context Protocol (MCP) server. Learn more: https://modelcontextprotocol.io
This is a TypeScript-based MCP server that integrates with any OpenAI SDK compatible Chat Completions API. It exposes a single tool, `chat`, which relays a question to the configured AI chat provider.
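Under the hood, the `chat` tool boils down to a plain Chat Completions call against whatever base URL you configure. Here is a rough sketch using the official `openai` npm package; the function name and wrapper are illustrative assumptions, not the project's actual source:

```typescript
import OpenAI from "openai";

// Illustrative relay: forward a question to the provider configured via env vars.
// The env variable names match the config documented below; relayChat itself is hypothetical.
async function relayChat(question: string): Promise<string> {
  const client = new OpenAI({
    apiKey: process.env.AI_CHAT_KEY,
    baseURL: process.env.AI_CHAT_BASE_URL, // any OpenAI SDK compatible endpoint
  });

  const completion = await client.chat.completions.create({
    model: process.env.AI_CHAT_MODEL ?? "gpt-4o",
    messages: [{ role: "user", content: question }],
  });

  return completion.choices[0]?.message?.content ?? "";
}
```

The MCP server wraps a function like this in a tool handler, so Claude can call the tool with a question and receive the provider's answer as the tool result.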
## Development
Install dependencies:
```bash
npm install
```
Build the server:
```bash
npm run build
```
For development with auto-rebuild:
```bash
npm run watch
```
## Installation
To add OpenAI to Claude Desktop, add the server config:
- On MacOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- On Windows: `%APPDATA%/Claude/claude_desktop_config.json`
1 2{ 3 "mcpServers": { 4 "chat-openai": { 5 "command": "node", 6 "args": [ 7 "/path/to/any-chat-completions-mcp/build/index.js" 8 ], 9 "env": { 10 "AI_CHAT_KEY": "OPENAI_KEY", 11 "AI_CHAT_NAME": "OpenAI", 12 "AI_CHAT_MODEL": "gpt-4o", 13 "AI_CHAT_BASE_URL": "https://api.openai.com/v1" 14 } 15 } 16 } 17}
You can add multiple providers by referencing the same MCP server multiple times, but with different env arguments:
1 2{ 3 "mcpServers": { 4 "chat-pyroprompts": { 5 "command": "node", 6 "args": [ 7 "/path/to/any-chat-completions-mcp/build/index.js" 8 ], 9 "env": { 10 "AI_CHAT_KEY": "PYROPROMPTS_KEY", 11 "AI_CHAT_NAME": "PyroPrompts", 12 "AI_CHAT_MODEL": "ash", 13 "AI_CHAT_BASE_URL": "https://api.pyroprompts.com/openaiv1" 14 } 15 }, 16 "chat-perplexity": { 17 "command": "node", 18 "args": [ 19 "/path/to/any-chat-completions-mcp/build/index.js" 20 ], 21 "env": { 22 "AI_CHAT_KEY": "PERPLEXITY_KEY", 23 "AI_CHAT_NAME": "Perplexity", 24 "AI_CHAT_MODEL": "llama-3.1-sonar-small-128k-online", 25 "AI_CHAT_BASE_URL": "https://api.perplexity.ai" 26 } 27 }, 28 "chat-openai": { 29 "command": "node", 30 "args": [ 31 "/path/to/any-chat-completions-mcp/build/index.js" 32 ], 33 "env": { 34 "AI_CHAT_KEY": "OPENAI_KEY", 35 "AI_CHAT_NAME": "OpenAI", 36 "AI_CHAT_MODEL": "gpt-4o", 37 "AI_CHAT_BASE_URL": "https://api.openai.com/v1" 38 } 39 } 40 } 41}
With these three configured, you'll see a tool for each provider in the Claude Desktop Home. You can then chat with the other LLMs, and their responses show up right in the conversation.
## Debugging
Since MCP servers communicate over stdio, debugging can be challenging. We recommend using the MCP Inspector, which is available as a package script:
```bash
npm run inspector
```
The Inspector will provide a URL to access debugging tools in your browser.
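One general stdio caveat while debugging (a convention of stdio-based MCP servers, not something specific to this project): stdout carries the protocol messages, so any ad-hoc logging you add should go to stderr instead.

```typescript
// Example: stdout is reserved for MCP JSON-RPC traffic, so send diagnostics to stderr.
console.error(`[any-chat-completions-mcp] relaying request to ${process.env.AI_CHAT_NAME}`);
```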
## Acknowledgements
- Obviously the modelcontextprotocol and Anthropic teams for the MCP specification and its integration into Claude Desktop. https://modelcontextprotocol.io/introduction
- PyroPrompts for sponsoring this project. Use code `CLAUDEANYCHAT` for 20 free automation credits on PyroPrompts.