# Mini NanoBot
**Mini NanoBot** is a lightweight **LLM Agent framework** implemented in Python for exploring and practicing the core architectural design of modern AI Agents.
The project implements a complete Agent system, including:
* LLM invocation
* Tool calling
* Skill plugin system
* MCP (Model Context Protocol) tool access
* Memory
* Multi-channel input (CLI / Feishu)
The goal of this project is to **implement a clear, extensible Agent architecture with minimal code** to help understand the core working mechanisms of AI Agents.
---
# Project Features
**Modular Architecture**
The Agent is decomposed into:
* Provider (model provider)
* Tool (tool system)
* Skill (plugin capability)
* MCP (remote tool)
* Memory (memory system)
* Channel (interaction channel)
Each module is designed independently for easy expansion and maintenance.
---
**Skill Plugin System**
Add a `skill.yaml` + `run.py` pair to quickly extend Agent capabilities without modifying the core code.
---
**MCP Support**
Supports access to remote tool servers via **MCP (Model Context Protocol)**, enabling:
* Dynamic tool loading
* Tool-service decoupling
* Remote capability extension
---
**Multi-Channel Support**
Currently supported:
* CLI command line chat
* Feishu Robot
Easily extensible to:
* Web
* Slack
* Telegram
---
# System Architecture
The overall system architecture is as follows:
```
User
│
▼
Channel Layer
(CLI / Feishu Bot)
│
▼
Agent
│
┌──────────┼──────────┐
│          │          │
▼          ▼          ▼
Provider Tools Memory
(LLM) (tool system) (memory system)
│
│
▼
Skill Plugins
(YAML + Python)
│
▼
MCP Manager
│
▼
MCP Server (remote tools)
```
The Agent, as the core scheduling module, is responsible for:
* Managing conversations
* Calling the model
* Triggering tools
* Processing tool results
* Integrating memory
---
# Agent Workflow
A complete run of the Agent is as follows:
```
User input
│
▼
Channel receives the message
│
▼
Agent builds messages
(system prompt + memory + history)
│
▼
Call the LLM Provider
│
▼
Model returns a response
│
├── Plain text → returned directly to the user
│
└── Tool Call
│
▼
ToolRegistry looks up the tool
│
▼
Execute Tool.run()
│
▼
Write the tool result into the session
│
▼
Call the LLM again
```
This loop continues until:
* The model returns the final answer
* Or the maximum number of iterations is reached
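The loop above can be sketched in a few lines. This is a minimal illustration, assuming a `Provider.chat()` method and `Tool.run()` tool objects; Mini NanoBot's actual class interfaces may differ.

```python
# Minimal sketch of the Agent loop described above. The Provider and Tool
# interfaces here are illustrative stand-ins, not Mini NanoBot's exact API.

def run_agent(provider, tools, messages, max_iterations=5):
    """Call the LLM, execute any tool calls, and repeat until a final answer."""
    for _ in range(max_iterations):
        response = provider.chat(messages)           # call the LLM provider
        if not response.get("tool_calls"):           # plain text -> return to user
            return response["content"]
        for call in response["tool_calls"]:          # ToolRegistry lookup + run
            result = tools[call["name"]].run(**call["args"])
            messages.append({"role": "tool",         # write result into session
                             "name": call["name"],
                             "content": str(result)})
    return "Maximum number of iterations reached."
```

The `max_iterations` guard is what prevents a model that keeps requesting tools from looping forever.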
---
# MCP Tool Call Flow
MCP (Model Context Protocol) is used to access remote tool servers.
Workflow:
```
Agent
│
│ request the tool list
▼
MCP Client
│
▼
MCP Server
│
│ return tool schemas
▼
Agent ToolRegistry
│
│ register MCPTool
▼
LLM triggers a tool call
│
▼
MCP Client
│
▼
Remote MCP Server
│
▼
Tool execution result returned
│
▼
Agent continues reasoning
```
This design allows:
* Tool-service decoupling
* Tool sharing across multiple Agents
* Remote capability extension
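The registration half of the flow above can be sketched as follows. The `list_tools()` / `call_tool()` client methods are assumptions for illustration, not Mini NanoBot's documented `MCP/client.py` API.

```python
# Illustrative sketch of the MCP registration flow: fetch remote tool schemas,
# wrap each as a local tool object, and register it. The client interface
# (list_tools / call_tool) is assumed, not Mini NanoBot's exact API.

class MCPTool:
    """Wraps a remote MCP tool so it looks like a local Tool to the registry."""
    def __init__(self, client, schema):
        self.client = client
        self.name = schema["name"]
        self.schema = schema

    def run(self, **kwargs):
        # Forward the call to the remote MCP server and return its result.
        return self.client.call_tool(self.name, kwargs)

def register_mcp_tools(client, registry):
    """Fetch tool schemas from an MCP server and register them locally."""
    for schema in client.list_tools():
        registry[schema["name"]] = MCPTool(client, schema)
```

Because an `MCPTool` exposes the same `run()` surface as a local tool, the Agent's loop needs no special casing for remote tools.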
---
# Project Directory Structure
```
mini_nanobot/
│
├── core/
│ ├── agent.py # core Agent logic
│ ├── response.py # LLM response wrapper
│ ├── prompt_loader.py # prompt template loading
│ └── message.py # Message / Session
│
├── channels/
│ ├── base.py # Channel interface
│ ├── cli.py # CLI chat
│ └── feishu.py # Feishu bot
│
├── providers/
│ ├── base.py # Provider abstract class
│ └── zhipu.py # Zhipu AI implementation
│
├── tools/
│ ├── base.py # Tool abstract class
│ ├── function_tool.py # function-based tool
│ ├── registry.py # tool registry
│ └── builtin/ # built-in tools
│
├── skills/
│ ├── loader.py # Skill loader
│ ├── gold/ # example Skill
│ └── weather/ # weather Skill
│ ├── skill.yaml
│ └── run.py
│
├── MCP/
│ ├── client.py # MCP Client
│ ├── tool.py # MCP Tool wrapper
│ └── manager.py # MCP manager
│
├── memory/
│ ├── base.py # Memory abstraction
│ └── store.py # simple Memory Store
│
├── templates/
│ └── mcp_server.py # example MCP Server
│
└── main.py # project entry point
```
---
# Quick Start
## 1 Install Dependencies
```
pip install -r requirements.txt
```
---
## 2 Configure API Key
For example, using Zhipu:
```
export ZHIPU_API_KEY=your_key
```
---
## 3 Start CLI
```
python main.py
```
Example:
```
User: What's the weather like in Beijing?
Agent: Let me check that for you...
[calls the weather tool]
Agent: It's sunny in Beijing today, 26°C.
```
---
# Extend Capabilities
## Add a New Tool
1. Inherit `Tool`
2. Implement `run()`
3. Register it with the `ToolRegistry`
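The three steps above might look like this. A hedged sketch: the `Tool` base-class signature shown here is an assumption, so check `tools/base.py` and `tools/registry.py` for the real interface.

```python
# Hedged sketch of the three steps above. The Tool base-class signature and the
# dict-based registry are assumptions standing in for tools/base.py and
# tools/registry.py.
from abc import ABC, abstractmethod

class Tool(ABC):                      # stand-in for the class in tools/base.py
    name: str
    description: str

    @abstractmethod
    def run(self, **kwargs) -> str:
        ...

class EchoTool(Tool):                 # 1. inherit Tool
    name = "echo"
    description = "Return the input text unchanged."

    def run(self, text: str) -> str:  # 2. implement run()
        return text

registry = {}                         # 3. register (dict stands in for ToolRegistry)
registry[EchoTool.name] = EchoTool()
```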
---
## Add a New Skill
Create a directory:
```
skills/my_skill
```
Add:
```
skill.yaml
run.py
```
and it will be loaded automatically.
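A hypothetical `run.py` for such a skill is sketched below. The entry-point name `run` and the `skill.yaml` fields in the comment are assumptions, not a documented contract; check `skills/loader.py` and the bundled `weather/` skill for the real convention.

```python
# Hypothetical contents of skills/my_skill/run.py. The entry-point name and the
# skill.yaml fields sketched below are assumptions, not a documented contract.
#
# skills/my_skill/skill.yaml (sketch):
#   name: my_skill
#   description: Greet a user by name.

def run(name: str = "world") -> str:
    """Entry point the Skill loader is assumed to invoke."""
    return f"Hello, {name}!"
```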
---
## Add a New Provider
Subclass the abstract class in:
```
providers/base.py
```
to support new model services.
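A minimal sketch of a new Provider, assuming the abstract interface resembles `providers/base.py` and omitting the real HTTP calls a model service would need:

```python
# Sketch of a new Provider. The abstract chat() interface is an assumption
# standing in for providers/base.py; real HTTP calls to a model service
# (as in zhipu.py) are omitted.
from abc import ABC, abstractmethod

class Provider(ABC):                  # stand-in for providers/base.py
    @abstractmethod
    def chat(self, messages: list) -> dict:
        ...

class EchoProvider(Provider):
    """Toy provider that echoes the last user message; useful for
    testing the Agent without an API key."""
    def chat(self, messages: list) -> dict:
        last = messages[-1]["content"]
        return {"content": f"echo: {last}", "tool_calls": None}
```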
---
## Access the MCP Tool Server
Add the server address to the `MCP Manager` to load remote tools dynamically.
---
# Design Goals
Mini NanoBot is mainly used to showcase the **core architectural design of AI Agents**:
* Module decoupling
* Plug-in capabilities
* Tool ecosystem
* Scalability
The code is kept as **simple, clear, and easy to understand** as possible, to make it easy to learn from and extend.
---
# License
MIT License
---