# Dingtalk Agent Development SDK
Integrates with various MCP servers to quickly build Dingtalk AI assistants

## Overview
This project provides a foundation for building AI assistants on Dingtalk, built on the following technologies:
- The OpenAI Agent SDK supplies AI capabilities and reasoning
- The Dingtalk streaming API enables real-time message processing
- MCP servers provide access to Dingtalk organizational data and can be extended with additional capabilities
- An employee information query Agent example demonstrates the platform's capabilities
## Key Features
- **OpenAI Agent Integration**: Seamless integration with OpenAI's Agent framework
- **Dingtalk Streaming Client**: Implements robust connection retry and health monitoring mechanisms
- **Message Processing Pipeline**: Well-structured message receiving and processing system
- **MCP Tool Integration**: Implements an employee query tool through the Dingtalk API
- **Extensible Architecture**: Easy to add new Agents or features
## Architecture
The system consists of the following key components:
- **Streaming Client Manager**: Manages WebSocket connections to the Dingtalk streaming API
- **Message Processor**: Processes incoming messages and routes them to the appropriate Agent
- **Agent Manager**: Manages OpenAI Agents and tool integrations
- **Employee Agent**: An example Agent that can query employee information
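The components above can be sketched as a simple routing flow. This is an illustrative sketch only: the class and method names (`AgentManager`, `MessageProcessor`, `dispatch`, and so on) are assumptions for demonstration, not the project's actual API.

```python
# Minimal sketch of the message-routing flow described above.
# All names here are illustrative assumptions, not the real API.
from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class IncomingMessage:
    sender_id: str
    text: str


class AgentManager:
    """Keeps a registry of agents keyed by the intent they handle."""

    def __init__(self) -> None:
        self._agents: Dict[str, Callable[[IncomingMessage], str]] = {}

    def register(self, intent: str, handler: Callable[[IncomingMessage], str]) -> None:
        self._agents[intent] = handler

    def dispatch(self, intent: str, message: IncomingMessage) -> str:
        handler = self._agents.get(intent)
        if handler is None:
            return "Sorry, no agent can handle that request."
        return handler(message)


class MessageProcessor:
    """Routes each incoming message to the appropriate agent."""

    def __init__(self, manager: AgentManager) -> None:
        self.manager = manager

    def process(self, message: IncomingMessage) -> str:
        # A real implementation would classify intent with the LLM;
        # a trivial keyword check stands in for it here.
        intent = "employee" if "employee" in message.text.lower() else "chat"
        return self.manager.dispatch(intent, message)
```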
## Prerequisites
- Python 3.10+
- A Dingtalk developer account with the corresponding permissions
- The Client_ID and Client_Secret of the AI assistant
- Call permission enabled for the basic interfaces
- An API key for LLM calls (defaults to the qwen-max service on BaiLian; configurable)
## Installation
1. Clone the repository:
```bash
git clone git@github.com:darrenyao/dingtalk-agent-client.git
cd dingtalk-agent-client
```
2. Install dependencies:
```bash
pip install -r requirements.txt
```
3. Create a `.env` file containing the configuration:
```
# Dingtalk API Configuration
DINGTALK_CLIENT_ID=your_client_id
DINGTALK_CLIENT_SECRET=your_client_secret
# LLM API Configuration
LLM_API_KEY=your_llm_api_key
LLM_API_BASE_URL=https://dashscope.aliyuncs.com/compatible-mode/v1 # Or your custom endpoint
LLM_API_MODEL=qwen-max # Or your preferred model
```
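At startup, the application needs to read these values. Below is one hedged way to do it: it treats `python-dotenv` as an optional helper (install it separately if the project does not already depend on it), and the `load_settings` function name is an illustrative assumption rather than the project's actual code.

```python
# Sketch: load .env configuration at startup. If python-dotenv is
# installed, `load_dotenv()` can populate os.environ from the .env
# file first; this function then validates the required keys.
import os


def load_settings(environ=os.environ) -> dict:
    """Read required and optional settings, failing early with a
    clear error if any required key is missing."""
    required = ["DINGTALK_CLIENT_ID", "DINGTALK_CLIENT_SECRET", "LLM_API_KEY"]
    missing = [k for k in required if k not in environ]
    if missing:
        raise RuntimeError(f"Missing required settings: {', '.join(missing)}")
    return {
        **{k: environ[k] for k in required},
        # Optional settings fall back to the defaults from the .env example
        "LLM_API_BASE_URL": environ.get(
            "LLM_API_BASE_URL", "https://dashscope.aliyuncs.com/compatible-mode/v1"
        ),
        "LLM_API_MODEL": environ.get("LLM_API_MODEL", "qwen-max"),
    }
```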
## Running the Application
Use the following command to start the application:
```bash
python main.py
```
The system will:
1. Initialize the Agent with the necessary tools
2. Connect to Dingtalk's streaming API
3. Listen for incoming messages
4. Process messages through the appropriate Agent
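Step 2 relies on the robust connection retries mentioned under Key Features. A minimal backoff sketch of that idea is below; the `connect_with_retry` helper and its parameters are assumptions for illustration, not the project's actual implementation.

```python
# Sketch: reconnect to the streaming API with exponential backoff.
import time


def connect_with_retry(connect, max_attempts=5, base_delay=1.0):
    """Call `connect` until it succeeds, doubling the wait after each
    ConnectionError, and re-raise after the final failed attempt."""
    for attempt in range(max_attempts):
        try:
            return connect()
        except ConnectionError:
            if attempt == max_attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))
```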
## Employee Agent Example
The included `employee_agent.py` demonstrates how to:
1. Create an Agent integrated with the Dingtalk organization API
2. Use MCP-tool to query employee information
3. Build dynamic instructions based on user context
4. Process and return results in a conversational manner
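The pattern behind points 2 and 3 can be sketched as follows. This is not the actual `employee_agent.py` code: the function names, the context fields, and the in-memory directory standing in for the Dingtalk organization API are all illustrative assumptions.

```python
# Sketch: dynamic instructions from user context plus an MCP-style
# lookup tool, mirroring the employee agent pattern described above.
def build_instructions(user_context: dict) -> str:
    """Compose per-conversation instructions from the caller's context."""
    return (
        "You are an employee-information assistant for Dingtalk. "
        f"The current user is {user_context.get('name', 'unknown')} "
        f"in department {user_context.get('department', 'unknown')}. "
        "Answer conversationally and use the query tool for lookups."
    )


def query_employee(directory: dict, name: str) -> str:
    """Stand-in for the MCP tool that queries the Dingtalk org API."""
    record = directory.get(name)
    if record is None:
        return f"No employee named {name} was found."
    return f"{name}: {record['title']}, {record['department']}"
```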
## Docker Deployment
You can also run the application using Docker:
```bash
docker-compose up -d
```
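If the repository does not already ship a compose file, a minimal `docker-compose.yml` for this kind of service might look like the sketch below; the service name and options are assumptions, not the project's actual configuration.

```yaml
services:
  dingtalk-agent:
    build: .
    env_file:
      - .env            # the same configuration file created above
    restart: unless-stopped
```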
## Extending the Framework
Creating a new Agent:
1. Create a new Agent file in the `app/agent` directory
2. Define the Agent's instructions and tools
3. Register the Agent with the AgentManager
4. Update message processing to route appropriate requests to your Agent
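The four steps above might look like the following skeleton for a hypothetical FAQ agent. The `register_agent` signature and the module layout are assumptions about the framework's API, shown only to make the steps concrete.

```python
# app/agent/faq_agent.py -- hypothetical new agent following the
# four steps above. The AgentManager API shown is an assumption.

# Step 2: define the agent's instructions and a simple tool.
FAQ_INSTRUCTIONS = "You answer frequently asked questions about the company."


def answer_faq(question: str, faq: dict) -> str:
    """Tool: look up a canned answer, with a polite fallback."""
    return faq.get(question.strip().lower(), "I don't have an answer for that yet.")


def register(manager) -> None:
    """Step 3: register this agent so messages can be routed to it."""
    manager.register_agent(
        name="faq",
        instructions=FAQ_INSTRUCTIONS,
        tools=[answer_faq],
    )
```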