🚀 An open-source AI-native operating system to define, organize, and run intelligent agent teams
English | 简体中文
```mermaid
graph TB
    subgraph Access["Entry Layer"]
        direction TB
        Web["🌐 Web"]
        IM["💬 IM Tools"]
        API["🔌 API"]
    end
    subgraph Features["Feature Layer"]
        direction TB
        Chat["💬 Chat"]
        Code["💻 Coding"]
        Feed["📡 Feed"]
        Knowledge["📚 Knowledge"]
    end
    subgraph Agents["Agent Layer"]
        direction TB
        ChatShell["🗣️ Wegent Chat"]
        ClaudeCode["🧠 Claude Code"]
        Agno["🤝 Agno"]
        Dify["✨ Dify"]
    end
    subgraph Execution["Execution Environment"]
        direction TB
        Docker["🐳 Agent Sandbox"]
        Cloud["☁️ Cloud Device"]
        Local["💻 Local Device"]
    end
    Access --> Features
    Features --> Agents
    Agents --> Execution
```
A fully open-source chat agent with powerful capabilities:
- Multi-Model Support: Compatible with Claude, OpenAI, Gemini, DeepSeek, GLM and other mainstream models
- Conversation History: Create new conversations, multi-turn dialogues, save and share chat history
- Group Chat: AI group chats where agents respond based on conversation context and @mentions
- Attachment Parsing: Send txt, pdf, ppt, doc, images and other file formats in single/group chats
- Follow-up Mode: AI asks clarifying questions to help refine your requirements
- Error Correction Mode: Multiple AI models automatically detect and correct response errors
- Long-term Memory: Supports mem0 integration for conversation memory persistence
- Sandbox Execution: Execute commands or modify files via sandbox, E2B protocol compatible
- Extensions: Customize prompts, MCP tools and Skills (includes chart drawing skill)
A cloud-based Claude Code execution engine:
- Multi-Model Configuration: Configure various Claude-compatible models
- Concurrent Execution: Run multiple coding tasks simultaneously in the cloud
- Requirement Clarification: AI analyzes code and asks questions to generate specification documents
- Git Integration: Integrate with GitHub/GitLab/Gitea/Gerrit to clone, modify and create PRs
- MCP/Skill Support: Configure MCP tools and Skills for agents
- Multi-turn Conversations: Continue conversations with follow-up questions
A cloud-based AI task trigger system:
- Full Capability Access: Tasks can use all Chat and Code mode capabilities
- Scheduled/Event Triggers: Set up cron schedules or event-based AI task execution
- Information Feed: Display AI-generated content as an information stream
- Event Filtering: Set filter conditions such as "only notify me if it will rain tomorrow"
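
As an illustration, a scheduled trigger with a filter could be expressed with a standard five-field cron expression. The YAML keys below are hypothetical, shown only to make the idea concrete, and are not Wegent's actual schema:

```yaml
# Hypothetical feed-task sketch: field names are illustrative only.
trigger:
  type: cron
  schedule: "0 8 * * *"   # minute hour day-of-month month day-of-week → every day at 08:00
task:
  prompt: "Check tomorrow's weather forecast for my city."
  filter: "Only notify me if it will rain tomorrow."
```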
A cloud-based AI document repository:
- Document Management: Upload and manage txt/doc/ppt/xls and other document formats
- Web Import: Import web pages and DingTalk multi-dimensional tables
- NotebookLM Mode: Select documents directly in notebooks for Q&A
- Online Editing: Edit text files directly in notebook mode
- Chat Integration: Reference knowledge bases in single/group chats for AI responses
Run AI tasks on your local machine with full control:
- Local Executor: Install and run the Wegent executor on your own device
- Multi-Device Management: Register and manage multiple local devices
- Default Device: Set a preferred device for quick task execution
- Secure Connection: Connect to Wegent backend via authenticated WebSocket
Integrate AI agents into your favorite IM tools:
- DingTalk Bot: Deploy agents as DingTalk bots for team collaboration
- Telegram Bot: Connect agents to Telegram for personal or group chats
All features above are fully customizable:
- Custom Agents: Create custom agents in the web UI, configure prompts, MCP, Skills and multi-agent collaboration
- Agent Creation Wizard: 4-step creation: Describe requirements → AI asks questions → Real-time fine-tuning → One-click create
- Organization Management: Create and join groups, share agents, models, Skills within groups
- Collaboration Modes: 4 out-of-the-box multi-agent collaboration modes (Sequential / Parallel / Router / Loop)
- Skill Support: Dynamically load skill packages to improve token efficiency
- MCP Tools: Model Context Protocol for calling external tools and services
- Execution Engines: sandboxed isolation for Claude Code / Agno, API proxy for Dify, direct mode for Chat
- YAML Config: Kubernetes-style CRD for defining Ghost / Bot / Team / Skill
- API: OpenAI-compatible interface for easy integration with other systems
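
Since the YAML config follows Kubernetes-style CRDs, a Team definition might look roughly like the sketch below. The `apiVersion`, field names, and values are assumptions for illustration, not Wegent's exact schema:

```yaml
# Hypothetical CRD sketch: apiVersion, kind, and field names are illustrative.
apiVersion: wegent.example/v1
kind: Team
metadata:
  name: dev-team
spec:
  mode: sequential          # one of the four collaboration modes
  members:
    - bot: coder            # references a separately defined Bot
    - bot: reviewer
```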
```bash
curl -fsSL https://raw.githubusercontent.com/wecode-ai/Wegent/main/install.sh | bash
```

Then open http://localhost:3000 in your browser.
Optional: Enable RAG features with:

```bash
docker compose --profile rag up -d
```
If you have cloned the source code, you can run the install script directly. It will automatically detect the source environment and build images from local source:
```bash
git clone https://github.com/wecode-ai/Wegent.git
cd Wegent
./install.sh
```

Or explicitly use a local build in the source directory:
```bash
docker compose -f docker-compose.yml -f docker-compose.build.yml up -d
```

Common commands (source mode):
```bash
# View logs
docker compose -f docker-compose.yml -f docker-compose.build.yml logs -f

# Stop services
docker compose -f docker-compose.yml -f docker-compose.build.yml down

# Start services
docker compose -f docker-compose.yml -f docker-compose.build.yml up -d

# Rebuild images
docker compose -f docker-compose.yml -f docker-compose.build.yml build --no-cache
```

If you are a developer and want fast debugging with hot reload, use local development mode:
```bash
git clone https://github.com/wecode-ai/Wegent.git
cd Wegent
./start.sh
```

Local development mode features:
- Run services directly without Docker
- Automatic hot reload on code changes
- Ideal for daily development and debugging
Common commands (local dev mode):
```bash
# Start services
./start.sh

# Stop services
./start.sh --stop

# Restart services
./start.sh --restart

# Check status
./start.sh --status

# Initialize configuration
./start.sh --init

# Show help
./start.sh --help
```

| Team | Purpose |
|---|---|
| chat-team | General AI assistant + Mermaid diagrams |
| translator | Multi-language translation |
| dev-team | Git workflow: branch → code → commit → PR |
| wiki-team | Codebase Wiki documentation generation |
We welcome contributions! Please see our Contributing Guide for details.
- 🐛 Issues: GitHub Issues
- 💬 Discord: Join our community
Thanks to the following developers for their contributions and efforts to make this project better. 💪
Made with ❤️ by WeCode-AI Team