# nativeMCP
## Overview
This is an MCP (Model Context Protocol) system written in C++, comprising the host, client, and server of the MCP core architecture. The host itself is essentially an AI agent, although currently it only has a command-line interface (but I don't think a fancy interface is necessary).
Reference: https://modelcontextprotocol.io/introduction
*The server can be used with other software that supports MCP, such as Cursor.*
*The client also supports the official MCP servers, and can be configured the same way as Cursor.*
## Basic Components
- MCPServer: The base class for MCP servers, implementing the MCP protocol; it is effectively an SDK. Subclasses only need to inherit from it and add whatever methods they require. Currently, stdio is the only supported communication method.
- servers: Concrete servers implemented by inheriting from MCPServer; they can be used directly, and more tools will be added in future branches.
- host: The host application that connects to the LLM (Large Language Model) and loads the MCPClient based on the configuration.
- MCPClient: Maintains a 1:1 connection with the server within the host application, calling the tools provided by MCPServer when needed by the LLM.
- ModelAdapter: The adaptation layer that connects to the LLM, currently implemented using cpp-httplib to make requests to Ollama.
## Dependency Libraries & External Programs
- **Qt6.8**: For string and JSON processing, and the meta-object mechanism (used to invoke arbitrary functions dynamically).
- **cpp-httplib**: Used to make requests to Ollama or other online LLM APIs.
- **Ollama**: The backend program for running large models locally.
- **LLM**: Any large language model for natural-language question answering, such as DeepSeek or Qwen.
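As a reference for the ModelAdapter layer: Ollama exposes a REST API, and a chat request is a POST to `/api/chat` with a JSON body like the following (the model name matches the example log below; whether ModelAdapter sets exactly these fields is not documented here):

```json
{
  "model": "qwen2.5:7b",
  "messages": [
    { "role": "user", "content": "Get the current time" }
  ],
  "stream": false
}
```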
## Build Environment
### Build Toolchain
- CMake >= 3.30
- Visual Studio 2022 (check `Desktop development with C++`)
### vcpkg
Used to install the third-party libraries cpp-httplib and spdlog.
```cmd
git clone https://github.com/microsoft/vcpkg.git
cd vcpkg && bootstrap-vcpkg.bat
```
Then add the directory containing vcpkg.exe to the PATH environment variable.
### Qt6.8.2
You can use the online installer directly, but building from source yourself is recommended. Currently, only the qtbase module needs to be compiled.
#### References
https://doc.qt.io/qt-6/getting-sources-from-git.html
https://doc.qt.io/qt-6/windows-building.html
https://doc.qt.io/qt-6/configure-options.html
#### Download Source Code
`git clone --branch v6.8.2 git://code.qt.io/qt/qt5.git .\src`
#### Build
```cmd
mkdir build
cd build
..\src\configure.bat -init-submodules -submodules qtbase
..\src\configure.bat -debug-and-release -prefix <path-to-qt>
cmake --build . --parallel
cmake --install .
cmake --install . --config Debug
```
*The above steps include two install commands: the first installs only the release build by default, so a second debug install is required.*
Finally, add `<path-to-qt>\bin` to the system environment variable PATH.
## Build
Clone the source code and navigate to the source directory.
```cmd
mkdir build
cd build
cmake .. -G "Visual Studio 17 2022"
```
Finally, open `build/nativeMCP.sln` to compile and debug (set `host` as the startup project, or, if you want to debug a server, set the corresponding server project as the startup project).
## Configuration
Modify [config.json](./host/config.json) to configure the MCP host.
- api
  - url: The complete URL of the API.
  - model: The model used to generate conversations; DeepSeek or Qwen is recommended.
  - api_key: Leave empty if the model is deployed locally.
- mcpServers
  - Server name (must not contain colons; the code uses `::` as a separator, e.g. `cpp-time::getCurrentTime`).
    - command: Must be `cmd` on Windows.
    - args: The first element must be `"/c"`. The second depends on the server type: for an executable, the path to the exe; for a Python program, `python`; for a Node.js program, `npx`. Subsequent arguments follow the server's own documentation.
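Putting these fields together, a complete config.json might look like the following sketch. The server entries and paths are hypothetical (`cpp-time` borrows its name from the example log below), and the `url` here mirrors the local Ollama base URL shown in that log; adjust it to your API's actual endpoint:

```json
{
  "api": {
    "url": "http://localhost:11434",
    "model": "qwen2.5:7b",
    "api_key": ""
  },
  "mcpServers": {
    "cpp-time": {
      "command": "cmd",
      "args": ["/c", "D:\\tools\\cpp-time.exe"]
    },
    "weather-py": {
      "command": "cmd",
      "args": ["/c", "python", "D:\\tools\\weather_server.py"]
    }
  }
}
```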
## Example
```text
[2025-03-26 12:27:30.296][info][Host.cpp::18] MCP Host initialized
[2025-03-26 12:27:30.299][info][ModelAdapter.cpp::22] ModelAdapter initialized: http://localhost:11434, qwen2.5:7b
[2025-03-26 12:27:30.397][info][Host.cpp::67]
Available tool list:
cpp-time:
getCurrentTime Get the current time
waitTime Wait for a specified time
server-test:
getAvailableIP Get the list of available IPs
sendToIP Send content to the specified IP address
testMultiParams Test multi-parameter tool calls
>>> Get the current time and send it to all available IPs
[2025-03-26 12:29:08.349][info][Host.cpp::123] Calling MCP tool [cpp-time::getCurrentTime]: {
"datetime": "2025-03-26T12:29:08.346",
"timezone": "Asia/Shanghai"
}
[2025-03-26 12:29:20.160][info][Host.cpp::123] Calling MCP tool [server-test::getAvailableIP]: {
"ip_list": [
"192.168.1.201",
"192.168.1.202",
"192.168.1.203"
]
}
[2025-03-26 12:30:07.784][info][Host.cpp::123] Calling MCP tool [server-test::sendToIP]: {
"content": "The current time is: 2025-03-26T12:29:08.346 (Asia/Shanghai)",
"ip": "192.168.1.201",
"status": "Sent successfully"
}
[2025-03-26 12:30:07.788][info][Host.cpp::123] Calling MCP tool [server-test::sendToIP]: {
"content": "The current time is: 2025-03-26T12:29:08.346 (Asia/Shanghai)",
"ip": "192.168.1.202",
"status": "Sent successfully"
}
[2025-03-26 12:30:07.789][info][Host.cpp::123] Calling MCP tool [server-test::sendToIP]: {
"content": "The current time is: 2025-03-26T12:29:08.346 (Asia/Shanghai)",
"ip": "192.168.1.203",
"status": "Sent successfully"
}
The current time is: 2025-03-26T12:29:08.346 (Asia/Shanghai)
The current time has been sent to all available IP addresses, and the sending status is successful.
```