# Project Introduction
A demo project showing how LLMs call MCP (Model Context Protocol) servers.
Run:
```bash
pip install uv
uv sync
uv run xxx.py
```
## 1. Calling MCP Based on the python-openai Module

## 2. Calling MCP Based on Langchain

# Calling Principle
## 1. Constructing LLM Tool Calls to MCP
1. First, pass the question and all tools into the LLM, allowing the LLM to choose a tool.
2. After the LLM selects a tool, pass the tool's name and parameters to MCP for execution.
3. MCP calls the tool based on the tool's name and parameters and returns the result.
4. Then, pass the question and the result returned by MCP into the LLM, allowing the LLM to provide the final response.
> Note: not all models support native tool calls; this approach requires a model that does.
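The four steps above can be sketched as a small loop. This is a minimal illustration, not the repo's actual code: `llm` and `mcp_call` are hypothetical callables standing in for an OpenAI-style chat client and an MCP session, and the message shapes follow the OpenAI tool-calling convention.

```python
# Sketch of the tool-call loop. `llm(messages, tools)` stands in for a
# chat-completion call that may return tool_calls; `mcp_call(name, args)`
# stands in for an MCP session executing one tool. Both are assumptions.
import json

def run_tool_call_loop(llm, mcp_call, question, tools):
    messages = [{"role": "user", "content": question}]
    # Step 1: pass the question and all tools to the LLM so it can pick one.
    msg = llm(messages, tools)
    messages.append(msg)
    for call in msg.get("tool_calls", []):
        # Steps 2-3: forward the chosen tool's name and arguments to MCP,
        # which executes the tool and returns the result.
        result = mcp_call(call["function"]["name"],
                          json.loads(call["function"]["arguments"]))
        messages.append({"role": "tool",
                         "tool_call_id": call["id"],
                         "content": str(result)})
    # Step 4: feed the question plus the MCP result back to the LLM
    # so it can produce the final response.
    return llm(messages, tools)
```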
## 2. Calling MCP Based on Prompts
1. Write all MCP tools and parameters into the system prompt.
2. The LLM replies with the name and parameters of the MCP tool to call.
3. Call the MCP tool and collect its result.
4. Pass the user question + the result returned by MCP into the LLM, allowing the LLM to provide the final response.
> This method is compatible with all models and does not require model support for tool calls.
`openai_prompt_invoke.py` implements this approach to call MCP.
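The two key pieces of this approach are building the system prompt that lists the tools (step 1) and parsing the model's structured reply (step 2). A minimal sketch, with a hypothetical tool-description format and an assumed JSON reply convention:

```python
# Sketch of the prompt-based approach: describe every MCP tool in the
# system prompt and ask the model to answer with a JSON object naming
# the tool to call. The tool dict shape and reply format are assumptions.
import json

def build_system_prompt(tools):
    """Render each tool's name, parameters, and description into the prompt."""
    lines = ['You can call these tools. Reply ONLY with JSON like '
             '{"tool": "<name>", "arguments": {...}}.']
    for t in tools:
        lines.append(f"- {t['name']}({', '.join(t['params'])}): {t['description']}")
    return "\n".join(lines)

def parse_tool_choice(reply):
    """Extract the tool name and arguments from the model's JSON reply."""
    data = json.loads(reply)
    return data["tool"], data.get("arguments", {})
```

Because the tool list travels in plain text and the reply is plain JSON, this works with any model that can follow instructions, which is why it needs no native tool-call support.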