# 🛍️ E-commerce Recommendation Engine FastMCP Server
A comprehensive e-commerce recommendation system built with **FastMCP**, a Python framework for the **Model Context Protocol (MCP)**, featuring AI-powered product search, personalized recommendations, and real-time analytics. The project combines **PostgreSQL**, **Elasticsearch**, **FAISS** vector search, and the **Google Gemini** LLM to deliver intelligent e-commerce experiences.
## 🌟 Features
### 🔍 **Advanced Search Capabilities**
- **Hybrid Search**: Combines keyword search (Elasticsearch) with semantic search (FAISS)
- **Real-time Autocomplete**: Smart product suggestions as you type
- **Faceted Filtering**: Filter by category, brand, price, rating, and availability
- **Semantic Understanding**: Natural language queries with context understanding
### 🎯 **AI-Powered Recommendations**
- **Personalized Recommendations**: Based on user preferences and behavior
- **Collaborative Filtering**: "Users like you also bought" recommendations
- **Content-Based Filtering**: Similar products using vector embeddings
- **Hybrid Recommendations**: Combines multiple recommendation strategies
- **Explainable AI**: Clear explanations for why products are recommended
### 📊 **Real-time Analytics**
- **User Behavior Tracking**: Views, clicks, purchases, and interactions
- **Product Performance**: Sales metrics, conversion rates, and popularity
- **Recommendation Effectiveness**: Click-through rates and conversion tracking
- **Dashboard Visualizations**: Interactive charts and metrics
### 🔧 **FastMCP Integration**
- **Standardized Tools**: Search, recommendation, and user management tools
- **Resource Management**: Access to product catalogs, user profiles, and analytics
- **Prompt Templates**: Pre-built prompts for common e-commerce scenarios
- **AI Model Integration**: Seamless integration with language models
### 🎨 **Modern UI**
- **Streamlit Dashboard**: Interactive web interface for management and testing
- **Product Search Interface**: Advanced search with filters and sorting
- **User Management**: Create, update, and analyze user profiles
- **Recommendation Testing**: Test different recommendation algorithms
- **Analytics Visualizations**: Real-time charts and performance metrics
## 🏗️ Architecture
```mermaid
graph TB
    UI[Streamlit UI] --> API[FastAPI Server]
    API --> MCP[MCP Server]
    API --> PG[(PostgreSQL)]
    API --> ES[(Elasticsearch)]
    API --> FAISS[(FAISS Vector DB)]
    MCP --> TOOLS[MCP Tools]
    TOOLS --> SEARCH[Search Tools]
    TOOLS --> REC[Recommendation Tools]
    TOOLS --> USER[User Tools]
    EMB[Embedding Service] --> FAISS
    EMB --> MXBAI[MixedBread AI Model]
    GEMINI[Google Gemini] --> MCP
```
### **Technology Stack**
- **Backend**: FastAPI, Python 3.9+
- **Databases**: PostgreSQL, Elasticsearch, FAISS
- **AI/ML**: Google Gemini LLM, MixedBread AI Embeddings
- **Frontend**: Streamlit
- **Search**: Elasticsearch with custom analyzers
- **Vector Search**: FAISS with cosine similarity
- **API**: RESTful APIs with OpenAPI documentation
- **MCP**: Model Context Protocol for AI integration
## 📋 Prerequisites
### **System Requirements**
- Python 3.9 or higher
- PostgreSQL 12+ (running locally or accessible)
- Elasticsearch 8.0+ (running locally or accessible)
- At least 4GB RAM (8GB recommended)
- 2GB free disk space
### **API Keys Required**
- **Google Gemini API Key**: For LLM integration
  - Get your key from [Google AI Studio](https://makersuite.google.com/app/apikey)
### **Services Setup**
#### **PostgreSQL Installation**
```bash
# Ubuntu/Debian
sudo apt update
sudo apt install postgresql postgresql-contrib
# macOS with Homebrew
brew install postgresql
brew services start postgresql
# Windows
# Download from https://www.postgresql.org/download/windows/
```
#### **Elasticsearch Installation**
```bash
# Using Docker (Recommended)
docker run -d \
  --name elasticsearch \
  -p 9200:9200 \
  -p 9300:9300 \
  -e "discovery.type=single-node" \
  -e "xpack.security.enabled=false" \
  elasticsearch:8.11.1
# Or download from https://www.elastic.co/downloads/elasticsearch
```
## 🚀 Quick Start
### **1. Clone the Repository**
```bash
git clone <repository-url>
cd ecommerce-mcp-server
```
### **2. Create Virtual Environment**
```bash
python -m venv venv
# Windows
venv\Scripts\activate
# macOS/Linux
source venv/bin/activate
```
### **3. Install Dependencies**
```bash
pip install -r requirements.txt
```
### **4. Configure Environment**
```bash
# Copy environment template
cp .env.example .env
# Edit .env with your configurations
# Required: GEMINI_API_KEY, database connections
```
### **5. Setup Database**
```bash
# Initialize database with sample data
python run.py setup-db
```
### **6. Start All Services**
```bash
# Start API server, UI, and MCP server
python run.py all
```
### **7. Access the Applications**
- **Streamlit UI**: http://localhost:8501
- **API Documentation**: http://localhost:8000/docs
- **API Health Check**: http://localhost:8000/health
## 📖 Detailed Setup
### **Environment Configuration**
Create a `.env` file with the following configuration:
```bash
# Database Configuration
POSTGRES_HOST=localhost
POSTGRES_PORT=5432
POSTGRES_DB=ecommerce_mcp
POSTGRES_USER=postgres
POSTGRES_PASSWORD=your_password
# Elasticsearch Configuration
ELASTICSEARCH_HOST=localhost
ELASTICSEARCH_PORT=9200
# FAISS Configuration
FAISS_INDEX_PATH=./data/faiss_indexes/
FAISS_DIMENSION=384
# AI Configuration
GEMINI_API_KEY=your_gemini_api_key_here
EMBEDDING_MODEL=mixedbread-ai/mxbai-embed-large-v1
# API Configuration
API_HOST=0.0.0.0
API_PORT=8000
DEBUG=true
# Streamlit Configuration
STREAMLIT_PORT=8501
# Logging
LOG_LEVEL=INFO
LOG_FILE=./logs/app.log
```
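A minimal sketch of how these variables might be loaded at startup. The variable names mirror the `.env` keys above, but the `Settings` class itself is illustrative, not the project's actual configuration module:

```python
import os
from dataclasses import dataclass, field


@dataclass
class Settings:
    """Illustrative settings loader; fields mirror the .env keys above."""
    postgres_host: str = field(default_factory=lambda: os.getenv("POSTGRES_HOST", "localhost"))
    postgres_port: int = field(default_factory=lambda: int(os.getenv("POSTGRES_PORT", "5432")))
    faiss_dimension: int = field(default_factory=lambda: int(os.getenv("FAISS_DIMENSION", "384")))
    debug: bool = field(default_factory=lambda: os.getenv("DEBUG", "false").lower() == "true")


settings = Settings()
print(settings.postgres_host, settings.faiss_dimension)
```

Because each field reads the environment at instantiation time, the same class picks up changes between test runs without module reloads.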
### **Database Setup Options**
```bash
# Setup database with sample data
python run.py setup-db
# Reset database (WARNING: Deletes all data)
python run.py setup-db --reset-db
# Check database status
python run.py status
```
### **Running Individual Services**
```bash
# Run only the API server
python run.py api
# Run only the Streamlit UI
python run.py ui
# Run only the MCP server
python run.py mcp
# Run all services together
python run.py all
# Check system health
python run.py health
```
## 🔧 Usage Guide
### **MCP Tools**
The system provides several MCP tools for AI integration:
#### **Search Tools**
```python
# Search products with natural language
search_products(query="smartphones with good cameras", limit=10)
# Find similar products
search_similar_products(product_id=123, limit=5)
# Get autocomplete suggestions
get_product_suggestions(query="iphone", limit=10)
```
#### **Recommendation Tools**
```python
# Get personalized recommendations
get_personalized_recommendations(user_id=123, num_recommendations=10)
# Get collaborative recommendations
get_collaborative_recommendations(user_id=123, num_recommendations=5)
# Get trending products
get_trending_products(num_recommendations=10)
# Explain why a product is recommended
explain_recommendation(user_id=123, product_id=456)
```
#### **User Tools**
```python
# Get user profile
get_user_profile(user_id=123)
# Analyze user preferences
analyze_user_preferences(user_id=123)
# Get user purchase history
get_user_purchase_history(user_id=123, limit=20)
# Update user preferences
update_user_preferences(user_id=123, preferences={...})
```
### **API Endpoints**
#### **Products**
- `GET /products/` - List products with filtering
- `POST /products/` - Create new product
- `GET /products/{id}` - Get specific product
- `PUT /products/{id}` - Update product
- `POST /products/search` - Advanced product search
- `GET /products/{id}/similar` - Get similar products
#### **Users**
- `GET /users/` - List users
- `POST /users/` - Create new user
- `GET /users/{id}` - Get user profile
- `PUT /users/{id}` - Update user
- `GET /users/{id}/analytics` - User analytics
- `POST /users/{id}/purchases` - Add purchase
- `GET /users/{id}/interactions` - User interactions
#### **Recommendations**
- `POST /recommendations/` - Get recommendations
- `GET /recommendations/personalized/{user_id}` - Personalized recommendations
- `GET /recommendations/similar/{product_id}` - Similar products
- `GET /recommendations/trending` - Trending products
- `GET /recommendations/explain/{user_id}/{product_id}` - Explanation
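The endpoints above work with any HTTP client. A hedged standard-library sketch for the personalized-recommendations route (the host and port match the `.env` defaults in this README; `fetch_recommendations` requires the API server to be running):

```python
import json
import urllib.parse
import urllib.request

BASE_URL = "http://localhost:8000"  # default API_HOST/API_PORT from .env


def personalized_url(user_id: int, num_recommendations: int = 5) -> str:
    """Build the URL for GET /recommendations/personalized/{user_id}."""
    query = urllib.parse.urlencode({"num_recommendations": num_recommendations})
    return f"{BASE_URL}/recommendations/personalized/{user_id}?{query}"


def fetch_recommendations(user_id: int, num_recommendations: int = 5):
    """Call the endpoint and decode the JSON body; needs a running server."""
    with urllib.request.urlopen(personalized_url(user_id, num_recommendations)) as resp:
        return json.load(resp)


print(personalized_url(1))
```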
### **Streamlit UI Features**
#### **Dashboard**
- System metrics and KPIs
- Sales trends and analytics
- Recent activity feed
- Service health status
#### **Product Search**
- Advanced search with filters
- Hybrid search (keyword + semantic)
- Product comparison
- Search analytics
#### **User Management**
- User profiles and preferences
- Purchase history analysis
- User behavior analytics
- Create and update users
#### **Recommendations**
- Test different recommendation algorithms
- View recommendation explanations
- Compare recommendation performance
- A/B testing interface
#### **Analytics**
- Sales and revenue analytics
- User behavior patterns
- Product performance metrics
- Recommendation effectiveness
## 🧪 Testing
### **Run Tests**
```bash
# Run all tests
pytest
# Run with coverage
pytest --cov=.
# Run specific test category
pytest tests/test_api.py
pytest tests/test_mcp.py
pytest tests/test_recommendations.py
```
### **Manual Testing**
#### **API Testing**
```bash
# Test health endpoint
curl http://localhost:8000/health
# Test product search
curl -X POST http://localhost:8000/products/search \
  -H "Content-Type: application/json" \
  -d '{"query": "smartphone", "limit": 5}'
# Test recommendations
curl "http://localhost:8000/recommendations/personalized/1?num_recommendations=5"
```
#### **MCP Testing**
Use the Streamlit UI's MCP Tools section to test individual MCP tools and their responses.
## 📊 Data Models
### **Product Model**
```python
{
    "id": 1,
    "name": "iPhone 14 Pro",
    "description": "Latest Apple smartphone...",
    "price": 999.99,
    "category": "Smartphones",
    "brand": "Apple",
    "rating": 4.8,
    "attributes": {"color": "Space Black", "storage": "128GB"},
    "tags": ["smartphone", "apple", "camera"]
}
```
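One way to carry the payload above through application code is a small dataclass. The field names follow the JSON; the class and its `from_dict` helper are illustrative, not part of the project's actual models:

```python
from dataclasses import dataclass, field


@dataclass
class Product:
    """Illustrative model mirroring the product JSON payload above."""
    id: int
    name: str
    description: str
    price: float
    category: str
    brand: str
    rating: float
    attributes: dict = field(default_factory=dict)
    tags: list = field(default_factory=list)

    @classmethod
    def from_dict(cls, data: dict) -> "Product":
        return cls(**data)


p = Product.from_dict({
    "id": 1, "name": "iPhone 14 Pro", "description": "Latest Apple smartphone...",
    "price": 999.99, "category": "Smartphones", "brand": "Apple", "rating": 4.8,
    "attributes": {"color": "Space Black", "storage": "128GB"},
    "tags": ["smartphone", "apple", "camera"],
})
print(p.name, p.price)
```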
### **User Model**
```python
{
    "id": 1,
    "email": "user@example.com",
    "name": "John Doe",
    "age": 28,
    "preferences": {
        "preferred_categories": ["Electronics", "Books"],
        "price_range": "mid-range"
    }
}
```
### **Recommendation Model**
```python
{
    "product_id": 123,
    "name": "Product Name",
    "price": 99.99,
    "recommendation_score": 0.85,
    "reason": "Based on your purchase history"
}
```
## 🔧 Configuration
### **Embedding Configuration**
- **Model**: mixedbread-ai/mxbai-embed-large-v1 (384 dimensions)
- **Batch Size**: 32 texts per batch
- **Normalization**: L2 normalization for cosine similarity
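The L2 normalization point is why an inner-product index can serve cosine similarity: once vectors have unit length, their inner product equals their cosine. A pure-Python sketch of the idea (FAISS's `IndexFlatIP` computes the same inner product at scale):

```python
import math


def l2_normalize(vec):
    """Scale a vector to unit length so inner product equals cosine similarity."""
    norm = math.sqrt(sum(x * x for x in vec))
    return [x / norm for x in vec] if norm else list(vec)


def inner_product(a, b):
    return sum(x * y for x, y in zip(a, b))


a = l2_normalize([3.0, 4.0])
b = l2_normalize([4.0, 3.0])
print(round(inner_product(a, b), 6))  # → 0.96
```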
### **Search Configuration**
- **Elasticsearch**: Custom analyzers for product search
- **FAISS**: IndexFlatIP for cosine similarity
- **Hybrid Search**: 60% semantic, 40% keyword by default
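The 60/40 default amounts to a weighted blend of per-backend scores. A simplified sketch (a real implementation would first normalize each backend's scores to a comparable range before blending):

```python
SEMANTIC_WEIGHT = 0.6  # FAISS similarity
KEYWORD_WEIGHT = 0.4   # Elasticsearch relevance


def hybrid_scores(semantic: dict, keyword: dict) -> dict:
    """Blend per-product scores from both backends; a missing score counts as 0."""
    ids = set(semantic) | set(keyword)
    return {
        pid: SEMANTIC_WEIGHT * semantic.get(pid, 0.0) + KEYWORD_WEIGHT * keyword.get(pid, 0.0)
        for pid in ids
    }


semantic = {101: 0.9, 102: 0.4}
keyword = {102: 0.8, 103: 0.7}
ranked = sorted(hybrid_scores(semantic, keyword).items(), key=lambda kv: kv[1], reverse=True)
print(ranked)
```

Note how product 102, middling in each backend alone, can outrank a product that only one backend found.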
### **Recommendation Configuration**
- **Personalized**: User preferences + purchase history + interactions
- **Collaborative**: User-item matrix with similarity scoring
- **Content-based**: Product feature vectors with FAISS
- **Hybrid**: Weighted combination of multiple approaches
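A toy illustration of the collaborative step: represent each user's purchases as a set, score user similarity by overlap, and surface items that similar users bought but the target user hasn't. The real service works on a scored user-item matrix, so treat this as a sketch of the shape of the computation:

```python
def jaccard(a: set, b: set) -> float:
    """Overlap-based similarity between two users' purchased-item sets."""
    return len(a & b) / len(a | b) if a | b else 0.0


def collaborative_candidates(target: set, others: dict) -> list:
    """Rank items bought by similar users that the target user hasn't bought."""
    scores = {}
    for items in others.values():
        sim = jaccard(target, items)
        for item in items - target:
            scores[item] = scores.get(item, 0.0) + sim
    return sorted(scores, key=scores.get, reverse=True)


target = {1, 2, 3}
others = {"u2": {2, 3, 4}, "u3": {1, 5}, "u4": {9}}
print(collaborative_candidates(target, others))  # → [4, 5, 9]
```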
## 🚀 Deployment
### **Production Deployment**
#### **Environment Setup**
```bash
# Set production environment variables
export DEBUG=false
export LOG_LEVEL=WARNING
export API_HOST=0.0.0.0
export API_PORT=8000
```
#### **Database Migration**
```bash
# Run database migrations
python run.py setup-db
# Create database backup
pg_dump ecommerce_mcp > backup.sql
```
#### **Service Deployment**
```bash
# Start production services
python run.py all
# Or use process manager like supervisor
```
### **Scaling Considerations**
- **Database**: Use read replicas for heavy read workloads
- **Elasticsearch**: Set up cluster for high availability
- **FAISS**: Consider distributed FAISS for large catalogs
- **API**: Use load balancer for multiple API instances
- **Caching**: Implement Redis for frequent queries
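For the caching point, a minimal in-process TTL cache sketch. A production setup would back this with Redis (`GET`/`SETEX`), which also shares the cache across multiple API instances:

```python
import time


class TTLCache:
    """Tiny time-based cache; illustrative stand-in for a Redis GET/SETEX pattern."""

    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self._store = {}

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self._store[key]  # evict stale entry
            return None
        return value

    def set(self, key, value):
        self._store[key] = (value, time.monotonic() + self.ttl)


cache = TTLCache(ttl_seconds=60.0)
cache.set("recs:user:1", [101, 102, 103])
print(cache.get("recs:user:1"))
```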
## 🤝 Contributing
### **Development Setup**
```bash
# Install development dependencies
pip install -r requirements.txt
# Install pre-commit hooks
pre-commit install
# Run code formatting
black .
flake8 .
```
### **Adding New Features**
#### **Adding New MCP Tools**
1. Create tool in `mcp_server/tools/`
2. Add tool handler in `mcp_server/handlers/`
3. Register tool in `mcp_server/server.py`
4. Add tests in `tests/`
#### **Adding New API Endpoints**
1. Create router in `api/routers/`
2. Add schemas in `api/models/`
3. Include router in `api/main.py`
4. Update documentation
#### **Adding New Recommendation Algorithms**
1. Implement in `services/recommendation_service.py`
2. Add MCP tool wrapper
3. Add UI testing interface
4. Add performance tests
## 📝 API Documentation
### **OpenAPI Documentation**
- **Swagger UI**: http://localhost:8000/docs
- **ReDoc**: http://localhost:8000/redoc
- **OpenAPI JSON**: http://localhost:8000/openapi.json
### **MCP Documentation**
- **Available Tools**: Use `/mcp/tools` endpoint
- **Server Info**: Use `/mcp/info` endpoint
- **Resource List**: Use MCP client to list resources
## 🐛 Troubleshooting
### **Common Issues**
#### **Database Connection Issues**
```bash
# Check PostgreSQL service
sudo systemctl status postgresql
# Test connection
psql -h localhost -U postgres -d ecommerce_mcp
```
#### **Elasticsearch Issues**
```bash
# Check Elasticsearch status
curl http://localhost:9200/_cluster/health
# Check indices
curl http://localhost:9200/_cat/indices
```
#### **FAISS Index Issues**
```bash
# Check FAISS index files
ls -la ./data/faiss_indexes/
# Rebuild indices
python -c "from services.search_service import SearchService; import asyncio; asyncio.run(SearchService().reindex_products())"
```
#### **Memory Issues**
- Reduce `EMBEDDING_BATCH_SIZE` in .env
- Increase system RAM or swap
- Use smaller embedding model
#### **Performance Issues**
- Check database query performance
- Monitor Elasticsearch query times
- Profile FAISS search operations
- Enable caching for frequent queries
### **Logs and Debugging**
```bash
# View application logs
tail -f ./logs/app.log
# Enable debug mode
export DEBUG=true
export LOG_LEVEL=DEBUG
# Check service health
python run.py health
```
## 📄 License
This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
## 🙏 Acknowledgments
- **Model Context Protocol (MCP)** - For the standardized AI integration framework
- **Anthropic** - For MCP development and best practices
- **Google Gemini** - For the powerful language model integration
- **MixedBread AI** - For the excellent embedding model
- **Elasticsearch** - For full-text search capabilities
- **FAISS** - For efficient vector similarity search
- **FastAPI** - For the modern, fast web framework
- **Streamlit** - For the interactive UI framework
## 📞 Support
For questions, issues, or contributions:
1. **GitHub Issues**: Open an issue for bugs or feature requests
2. **Documentation**: Check this README and inline documentation
3. **Community**: Join discussions in the repository
4. **API Docs**: Use the built-in API documentation at `/docs`
---
**Happy coding! 🚀**
## 🔗 You Might Also Like
- **daydreams** - Daydreams is a set of tools for building agents for commerce
- **mcp-server-airbnb** - Search Airbnb using your AI Agent
- **nuwax** - Nuwax AI - Easily build and deploy your private Agentic AI solutions. A...
- **cupertino** - A local Apple Documentation crawler and MCP server. Written in Swift.
- **easy-code-reader** - A powerful MCP (Model Context Protocol) server for intelligently reading...
- **strudel-mcp-server** - A Model Context Protocol (MCP) server that gives Claude direct control over...