# mcp-chatbot
A modular, async research assistant that combines **Anthropic Claude 3** with the **Model Context Protocol (MCP)**, delivering on-demand literature search and summarisation for academics and engineers.
---
## 1. Project Structure
```bash
mcp-chatbot/
├── Dockerfile
├── pyproject.toml
├── uv.lock
├── README.md
├── server_config.json
├── research_server.py
├── papers/               # Cached paper metadata by topic
├── mcp_chatbot/
│   ├── __init__.py
│   ├── cli.py            # Typer-based CLI
│   └── core.py           # Main chatbot engine
└── tests/
    └── test_core.py
```
## 2. Quick Start
### 2.1. Clone the Repository
```bash
git clone https://github.com/mctrinh/mcp-chatbot.git
cd mcp-chatbot
```
### 2.2. Install Dependencies
#### Install `uv` (recommended)
```bash
# Git Bash or WSL on Windows (the install script does not work in Command Prompt or PowerShell)
curl -LsSf https://astral.sh/uv/install.sh | sh
# Scoop (Windows)
scoop install uv
# Chocolatey (Windows - Administrator Command Prompt - Recommended)
choco install uv
uv --version
```
#### Install the Python packages listed under `project.dependencies` in `pyproject.toml`
```bash
pip install -e .
```
## 3. Build and Run with Docker
```bash
# Build image
docker build -t mcp-chatbot:0.1 .
# Run server and CLI (ports 8001 and 8000)
docker run --rm -it -p 8001:8001 -p 8000:8000 mcp-chatbot:0.1
```
## 4. Run Without Docker (Local Dev)
```bash
# Install dependencies
uv pip install -e .[dev]
# Start the research server (MCP tool)
python research_server.py
# In a new terminal, launch the chatbot CLI
mcp-chatbot run
```
## 5. Try the Chatbot
### 5.1. REPL Mode
```bash
python -m mcp_chatbot.cli run
```
Or using the installed script:
```bash
mcp-chatbot run
```
Once inside the REPL (Read-Eval-Print Loop), you can interact with the chatbot directly by typing commands or queries. Example commands:
```bash
/prompts        # list available Claude prompts
@folders        # list downloaded paper topics
AI alignment    # ask anything – Claude decides whether to invoke tools
```
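The three input forms above could be routed with a dispatcher along these lines. This is a minimal sketch, not the project's actual API; the function name and return labels are assumptions:

```python
def classify(line: str) -> str:
    """Route REPL input: /-prefixed lines are prompt commands,
    @-prefixed lines are resource lookups, everything else is a
    free-form query handed to Claude (which may invoke MCP tools)."""
    line = line.strip()
    if line.startswith("/"):
        return "prompt"
    if line.startswith("@"):
        return "resource"
    return "query"
```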
### 5.2. One-shot Query
```bash
mcp-chatbot once "What are the latest trends in diffusion models?"
```
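Under the hood, a one-shot query amounts to a single Anthropic Messages API call with the research server's tools attached. The sketch below assumes the `anthropic` SDK and an illustrative schema for `search_papers` (the real schema lives in `research_server.py`):

```python
import os

def build_tools():
    """Describe the research server's tools in Anthropic tool-use format.
    The schema here is an illustrative assumption, not the project's exact one."""
    return [
        {
            "name": "search_papers",
            "description": "Search arXiv for papers on a topic.",
            "input_schema": {
                "type": "object",
                "properties": {"topic": {"type": "string"}},
                "required": ["topic"],
            },
        }
    ]

def ask_once(question: str) -> str:
    """Send one question to Claude with tools attached and return the text reply."""
    import anthropic  # pip install anthropic; requires ANTHROPIC_API_KEY
    client = anthropic.Anthropic(api_key=os.environ["ANTHROPIC_API_KEY"])
    response = client.messages.create(
        model=os.environ.get("ANTHROPIC_MODEL", "claude-3-opus-20240229"),
        max_tokens=1024,
        tools=build_tools(),
        messages=[{"role": "user", "content": question}],
    )
    # Concatenate only the text blocks; tool_use blocks would need a follow-up turn.
    return "".join(b.text for b in response.content if b.type == "text")
```

A full client would also loop on `tool_use` responses, execute the requested tool over MCP, and feed the result back; that loop is omitted here for brevity.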
## 6. Configuration (Optional)
Environment variables and `server_config.json` control model and ports:
```bash
export ANTHROPIC_MODEL="claude-3-opus-20240229"
export RESEARCH_PORT=8001
export PAPER_DIR=./papers
```
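For reference, `server_config.json` might look like the fragment below, assuming the standard MCP client configuration shape (`mcpServers` keyed by server name); the exact keys this project uses may differ:

```json
{
  "mcpServers": {
    "research": {
      "command": "python",
      "args": ["research_server.py"],
      "env": {
        "RESEARCH_PORT": "8001",
        "PAPER_DIR": "./papers"
      }
    }
  }
}
```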
## 7. Testing
```bash
# Installs pytest, coverage, etc.
uv pip install -e .[dev]
# Run unit tests
pytest -q
# With coverage (optional)
pytest --cov=mcp_chatbot
```
## 8. Road map
- Research MCP server with `search_papers` and `extract_info` (done)
- Tool usage via Claude 3 (done)
- Prompt orchestration (done)
- Vector search over stored papers (Faiss / Chroma)
- Web UI using FastAPI + React
- GitHub Actions for CI/CD
## 9. License
MIT License. Copyright © 2025.
## 10. Current Issues
The following issues occur when running `mcp-chatbot run`:
- <span style="color:red;">⚠ Could not connect to server 'fetch': Method not found</span>
- <span style="color:red;">⚠ Could not connect to server 'filesystem': Method not found</span>