# MCP Aggregator
A universal aggregator that combines multiple MCP (Model Context Protocol) servers into a single endpoint, giving AI assistants one connection through which to reach many tools and capabilities.
## Features
- 🔍 **Auto-Discovery** - Automatically finds installed MCP servers
- 🚀 **Zero Configuration** - Works out of the box with sensible defaults
- 🔧 **Highly Configurable** - Customize every aspect when needed
- 🏗️ **Multi-Server Support** - Aggregate multiple MCP servers
- 🔄 **Hot Reload** - Add/remove servers without restart
- 📊 **Health Monitoring** - Automatic health checks and recovery
- 🔒 **Secure by Default** - No hardcoded secrets or paths
## Quick Start
```bash
# Install globally
npm install -g @mcp/aggregator
# Run setup
mcp-aggregator setup
# Start the aggregator
mcp-aggregator start
```
## Installation
### From npm
```bash
npm install -g @mcp/aggregator
```
### From Source
```bash
git clone https://github.com/dwillitzer/mcp-aggregator.git
cd mcp-aggregator
npm install
npm run build
npm link
```
### Using Docker
```bash
docker pull ghcr.io/dwillitzer/mcp-aggregator:latest
docker run -d -p 3000:3000 ghcr.io/dwillitzer/mcp-aggregator:latest
```
## Configuration
The aggregator resolves settings from several layers, listed highest precedence first:
1. **Environment Variables** - Override any setting
2. **Config File** - `~/.mcp/aggregator/config.json`
3. **Auto-Discovery** - Finds servers automatically
4. **Defaults** - Sensible defaults for everything
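The precedence above can be sketched as a small resolver. This is an illustration only: the `MCP_AGGREGATOR_PORT` variable name and the `3000` fallback are assumptions, not documented settings of the aggregator.

```typescript
// Hypothetical sketch of layered config resolution:
// environment variable > config file > built-in default.
// MCP_AGGREGATOR_PORT is an assumed name for illustration.
function resolvePort(
  env: Record<string, string | undefined>,
  fileConfig: { port?: number },
): number {
  const fromEnv = env["MCP_AGGREGATOR_PORT"];
  if (fromEnv !== undefined && !Number.isNaN(Number(fromEnv))) {
    return Number(fromEnv); // environment variable wins
  }
  return fileConfig.port ?? 3000; // then config file, then default
}
```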
### Basic Configuration
```json
{
  "aggregator": {
    "port": 3000,
    "host": "localhost"
  },
  "servers": {
    "filesystem": {
      "enabled": true,
      "command": "mcp-server-filesystem"
    },
    "shell": {
      "enabled": true,
      "command": "mcp-server-shell"
    }
  }
}
```
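Presumably the aggregator launches only the servers whose `enabled` flag is `true`. A minimal sketch of that filtering step, with types mirroring the `servers` section above (the shape is an assumption, not the aggregator's actual internals):

```typescript
// Sketch: collect launch commands for enabled servers from a config
// shaped like the "servers" section above (types are assumptions).
interface ServerConfig {
  enabled: boolean;
  command: string;
}

function enabledCommands(servers: Record<string, ServerConfig>): string[] {
  return Object.entries(servers)
    .filter(([, server]) => server.enabled)
    .map(([, server]) => server.command);
}
```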
## Available MCP Servers
The aggregator can work with any MCP-compliant server. Common servers include:
- **filesystem** - File operations (read, write, list)
- **shell** - Command execution
- **git** - Version control operations
- **browser** - Web automation
- **slack** - Slack integration
- **github** - GitHub operations
- **1password** - Secret management
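Any of these can be registered in the config file. For example, enabling a git server might look like the fragment below; the `mcp-server-git` command name is an assumption based on the naming pattern of the servers above, so substitute the actual command your server provides:

```json
{
  "servers": {
    "git": {
      "enabled": true,
      "command": "mcp-server-git"
    }
  }
}
```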
## Usage with Claude
### Claude Desktop
Add to your Claude Desktop configuration:
```json
{
  "mcpServers": {
    "aggregator": {
      "command": "mcp-aggregator",
      "args": ["start", "--stdio"]
    }
  }
}
```
### Claude Guard
The aggregator integrates seamlessly with Claude Guard:
```bash
# Auto-detected if installed
claude-guard --mcp-status
# Or specify custom path
claude-guard --mcp-aggregator /path/to/aggregator
```
## Documentation
- [Getting Started](docs/getting-started.md)
- [Configuration Guide](docs/configuration-guide.md)
- [Server Integration](docs/server-integration.md)
- [API Reference](docs/api-reference.md)
- [Deployment Guide](docs/deployment-guide.md)
- [Troubleshooting](docs/troubleshooting.md)
## Development
```bash
# Clone repository
git clone https://github.com/dwillitzer/mcp-aggregator.git
cd mcp-aggregator
# Install dependencies
npm install
# Run in development mode
npm run dev
# Run tests
npm test
# Build for production
npm run build
```
## Contributing
Contributions are welcome! Please read our [Contributing Guide](CONTRIBUTING.md) for details on our code of conduct and the process for submitting pull requests.
## License
This software is proprietary to Daniel Willitzer with specific usage permissions.
**Key points:**
- ✅ Free to use, modify, and distribute
- ✅ Commercial use allowed
- ⚠️ Derivative works MUST declare: "This is a derivative work based on MCP Aggregator by Daniel Willitzer"
- ⚠️ Must indicate modifications made
See [LICENSE](LICENSE) for full details.
## Security
For security issues, please email daniel@willitzer.com instead of using the issue tracker.