# nativeMCP
## Overview
This is an MCP system written in C++ comprising the host, client, and server of the core MCP architecture. The host is essentially an AI agent, although it currently only has a command-line interface (a fancier interface isn't really necessary).
Reference: https://modelcontextprotocol.io/introduction
*The server can be used with other MCP-supporting software, such as Cursor.*
*The client also supports official MCP servers, using the same configuration format as Cursor.*
## Basic Components
- MCPServer: The MCP server base class, equivalent to an SDK. Subclasses only need to inherit it and add the required methods; currently only stdio communication is supported.
- servers: Concrete servers implemented by inheriting MCPServer; they can be used directly, and more tools will be added in future branches.
- host: The host application; it connects to the LLM and loads MCPClient instances according to the configuration.
- MCPClient: Maintains a 1:1 connection with a server inside the host application and calls the tools provided by MCPServer when the LLM needs them.
- ModelAdapter: The adaptation layer for connecting to the LLM. The current implementation uses cpp-httplib to send requests to Ollama.
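Under the hood, MCP's stdio transport exchanges newline-delimited JSON-RPC 2.0 messages between client and server. The sketch below builds a `tools/call` request line by hand; the helper name is illustrative, not this repo's API (the `getCurrentTime` tool does appear in the example later in this README):

```cpp
#include <sstream>
#include <string>

// Build one JSON-RPC 2.0 request as MCP's stdio transport expects:
// a single JSON object, written to the server's stdin as one line.
// The helper name is hypothetical; argument escaping is omitted.
std::string makeToolCallRequest(int id, const std::string& tool,
                                const std::string& argsJson) {
    std::ostringstream out;
    out << R"({"jsonrpc":"2.0","id":)" << id
        << R"(,"method":"tools/call","params":{"name":")" << tool
        << R"(","arguments":)" << argsJson << "}}";
    return out.str();
}
```

A call like `makeToolCallRequest(1, "getCurrentTime", "{}")` yields the one-line request the client writes to the server's stdin.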
## Dependencies & External Programs
- **Qt6.8**: Strings, JSON processing, and the meta-object mechanism (for dynamically calling arbitrary functions)
- **cpp-httplib**: Used to send requests to Ollama or other online large-model APIs
- **Ollama**: Backend program for deploying large models locally
- **LLM**: Any large language model for natural-language question answering, such as DeepSeek, Qwen, etc.
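As a sketch of what the ModelAdapter sends: Ollama's `/api/chat` endpoint accepts a JSON body with a model name and a message list. The helper below only builds that body with the standard library (the model name is an example and prompt escaping is omitted); the commented-out lines show how cpp-httplib would post it:

```cpp
#include <sstream>
#include <string>

// Build the JSON body for Ollama's /api/chat endpoint (non-streaming).
// The helper is illustrative, not this repo's ModelAdapter API.
std::string buildChatRequest(const std::string& model,
                             const std::string& prompt) {
    std::ostringstream out;
    out << R"({"model":")" << model
        << R"(","messages":[{"role":"user","content":")" << prompt
        << R"("}],"stream":false})";
    return out.str();
}

// With cpp-httplib, the body would be sent roughly like this:
//   httplib::Client cli("http://localhost:11434");
//   auto res = cli.Post("/api/chat",
//                       buildChatRequest("qwen2.5:7b", "hello"),
//                       "application/json");
```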
## Compilation Environment
### Compilation Toolchain
- CMake >= 3.30
- Visual Studio 2022 (check `Desktop development with C++`)
### vcpkg
Used to import the third-party libraries cpp-httplib and spdlog:
```cmd
git clone https://github.com/microsoft/vcpkg.git
cd vcpkg && bootstrap-vcpkg.bat
```
Then add the path of vcpkg.exe to the system environment variable PATH.
### Qt6.8.2
You can use the online installer directly, but building from source is still recommended. Currently only the qtbase module needs to be compiled.
#### Reference
https://doc.qt.io/qt-6/getting-sources-from-git.html
https://doc.qt.io/qt-6/windows-building.html
https://doc.qt.io/qt-6/configure-options.html
#### Download Source Code
`git clone --branch v6.8.2 git://code.qt.io/qt/qt5.git .\src`
#### Generation
```cmd
mkdir build
cd build
..\src\configure.bat -init-submodules -submodules qtbase
..\src\configure.bat -debug-and-release -prefix <path-to-qt>
cmake --build . --parallel
cmake --install .
cmake --install . --config debug
```
*The steps above run the install twice; by default the first install only installs the release build.*
Finally, add `<path-to-qt>\bin` to the system environment variable PATH.
## Build
Clone the repository and enter the source directory:
```cmd
mkdir build
cd build
cmake .. -G "Visual Studio 17 2022"
```
Finally, open build/nativeMCP.sln to compile and debug. Note that you must first set host as the startup project (or, to debug a server, set the corresponding server as the startup project).
## Configuration
Edit [config.json](./host/config.json) to configure the basic behavior of the MCP Host:
- api
  - url: The complete URL of the API
  - model: The model used to generate conversations; deepseek or qwen is recommended
  - api_key: Leave blank for a locally deployed model
- mcpServers
  - server name (must not contain a colon, because the code checks for colons)
    - command: Must be `cmd` on Windows
    - args: The first element must be `/c`; the second depends on the server type: for an exe, the path to the exe; for a Python program, `python`; for Node.js, `npx`. Subsequent elements follow the server's own instructions.
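Putting the keys above together, a minimal config.json might look like the following. The URL matches the Ollama default shown in the example log, while the server path is a placeholder you must replace with your own build output:

```json
{
    "api": {
        "url": "http://localhost:11434",
        "model": "qwen2.5:7b",
        "api_key": ""
    },
    "mcpServers": {
        "cpp-time": {
            "command": "cmd",
            "args": ["/c", "C:/path/to/cpp-time.exe"]
        }
    }
}
```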
## Example
```text
[2025-03-26 12:27:30.296][info][Host.cpp::18] MCP Host initialization
[2025-03-26 12:27:30.299][info][ModelAdapter.cpp::22] ModelAdapter initialization:http://localhost:11434, qwen2.5:7b
[2025-03-26 12:27:30.397][info][Host.cpp::67]
Available tools list:
cpp-time:
getCurrentTime Get current time
waitTime Wait for the specified time
server-test:
getAvailableIP Get available IP list
sendToIP Send content to the specified ip address
testMultiParams Test multi-parameter tool call
>>> Get the current time and send it to all available ips
[2025-03-26 12:29:08.349][info][Host.cpp::123] Calling MCP tool[cpp-time::getCurrentTime]: {
"datetime": "2025-03-26T12:29:08.346",
"timezone": "Asia/Shanghai"
}
[2025-03-26 12:29:20.160][info][Host.cpp::123] Calling MCP tool[server-test::getAvailableIP]: {
"ip_list": [
"192.168.1.201",
"192.168.1.202",
"192.168.1.203"
]
}
[2025-03-26 12:30:07.784][info][Host.cpp::123] Calling MCP tool[server-test::sendToIP]: {
"content": "The current time is: 2025-03-26T12:29:08.346 (Asia/Shanghai)",
"ip": "192.168.1.201",
"status": "sent successfully"
}
[2025-03-26 12:30:07.788][info][Host.cpp::123] Calling MCP tool[server-test::sendToIP]: {
"content": "The current time is: 2025-03-26T12:29:08.346 (Asia/Shanghai)",
"ip": "192.168.1.202",
"status": "sent successfully"
}
[2025-03-26 12:30:07.789][info][Host.cpp::123] Calling MCP tool[server-test::sendToIP]: {
"content": "The current time is: 2025-03-26T12:29:08.346 (Asia/Shanghai)",
"ip": "192.168.1.203",
"status": "sent successfully"
}
The current time is: 2025-03-26T12:29:08.346 (Asia/Shanghai)
The current time has been sent to all available IP addresses; every send succeeded.
```