Wegent

🚀 An open-source AI-native operating system to define, organize, and run intelligent agent teams

English | 简体中文



🏗️ Architecture Overview

```mermaid
graph TB
    subgraph Access["Entry Layer"]
        direction TB
        Web["🌐 Web"]
        IM["💬 IM Tools"]
        API["🔌 API"]
    end

    subgraph Features["Feature Layer"]
        direction TB
        Chat["💬 Chat"]
        Code["💻 Coding"]
        Feed["📡 Feed"]
        Knowledge["📚 Knowledge"]
    end

    subgraph Agents["Agent Layer"]
        direction TB
        ChatShell["🗣️ Wegent Chat"]
        ClaudeCode["🧠 Claude Code"]
        Agno["🤝 Agno"]
        Dify["✨ Dify"]
    end

    subgraph Execution["Execution Environment"]
        direction TB
        Docker["🐳 Agent Sandbox"]
        Cloud["☁️ Cloud Device"]
        Local["💻 Local Device"]
    end

    Access --> Features
    Features --> Agents
    Agents --> Execution
```

✨ Core Features

💬 Chat Agent

Chat Mode Demo

A fully open-source chat agent with powerful capabilities:
  • Multi-Model Support: Compatible with Claude, OpenAI, Gemini, DeepSeek, GLM and other mainstream models
  • Conversation History: Create new conversations, multi-turn dialogues, save and share chat history
  • Group Chat: AI group chats where agents respond based on conversation context and @mentions
  • Attachment Parsing: Send txt, pdf, ppt, doc, images and other file formats in single/group chats
  • Follow-up Mode: AI asks clarifying questions to help refine your requirements
  • Error Correction Mode: Multiple AI models automatically detect and correct response errors
  • Long-term Memory: Supports mem0 integration for conversation memory persistence
  • Sandbox Execution: Execute commands or modify files in a sandbox (E2B-protocol compatible)
  • Extensions: Customize prompts, MCP tools and Skills (includes chart drawing skill)
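
The MCP tools mentioned above speak the Model Context Protocol, which is built on JSON-RPC 2.0. As a sketch of the wire format, this builds the `tools/call` request an agent sends to invoke a tool; the tool name and arguments here are hypothetical, not a real Wegent skill:

```python
import json

def build_tool_call(request_id: int, tool_name: str, arguments: dict) -> str:
    """Build an MCP tools/call request as a JSON-RPC 2.0 message."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# Hypothetical invocation of a chart-drawing skill
msg = build_tool_call(1, "draw_chart", {"type": "bar", "data": [1, 2, 3]})
```

The response comes back as a JSON-RPC result with the same `id`, which is how the agent matches tool outputs to pending calls.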

💻 Code Agent

Code Mode Demo

A cloud-based Claude Code execution engine:

  • Multi-Model Configuration: Configure various Claude-compatible models
  • Concurrent Execution: Run multiple coding tasks simultaneously in the cloud
  • Requirement Clarification: AI analyzes code and asks questions to generate specification documents
  • Git Integration: Integrate with GitHub/GitLab/Gitea/Gerrit to clone, modify and create PRs
  • MCP/Skill Support: Configure MCP tools and Skills for agents
  • Multi-turn Conversations: Continue conversations with follow-up questions

📡 AI Feed

Feed Demo

A cloud-based AI task trigger system:

  • Full Capability Access: Tasks can use all Chat and Code mode capabilities
  • Scheduled/Event Triggers: Set up cron schedules or event-based AI task execution
  • Information Feed: Display AI-generated content as an information stream
  • Event Filtering: Filter conditions like "only notify me if it will rain tomorrow"
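
A filter like "only notify me if it will rain tomorrow" amounts to a predicate applied to each event before it reaches the feed. A minimal sketch of that idea — the event shape and field names are illustrative, not Wegent's actual schema:

```python
def should_notify(event: dict) -> bool:
    """Illustrative feed filter: pass only rain forecasts for tomorrow."""
    return event.get("day") == "tomorrow" and event.get("forecast") == "rain"

events = [
    {"day": "tomorrow", "forecast": "rain"},
    {"day": "tomorrow", "forecast": "sunny"},
    {"day": "today", "forecast": "rain"},
]
# Only events passing the predicate reach the information stream
feed = [e for e in events if should_notify(e)]
```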

📚 AI Knowledge

Knowledge Demo

A cloud-based AI document repository:

  • Document Management: Upload and manage txt/doc/ppt/xls and other document formats
  • Web Import: Import web pages and DingTalk multi-dimensional tables
  • NotebookLM Mode: Select documents directly in notebooks for Q&A
  • Online Editing: Edit text files directly in notebook mode
  • Chat Integration: Reference knowledge bases in single/group chats for AI responses

🖥️ AI Device

AI Device Demo

Run AI tasks on your local machine with full control:

  • Local Executor: Install and run the Wegent executor on your own device
  • Multi-Device Management: Register and manage multiple local devices
  • Default Device: Set a preferred device for quick task execution
  • Secure Connection: Connect to Wegent backend via authenticated WebSocket

💬 IM Integration

Integrate AI agents into your favorite IM tools:

  • DingTalk Bot: Deploy agents as DingTalk bots for team collaboration
  • Telegram Bot: Connect agents to Telegram for personal or group chats

🔧 Customization

All features above are fully customizable:

  • Custom Agents: Create custom agents in the web UI, configure prompts, MCP, Skills and multi-agent collaboration
  • Agent Creation Wizard: 4-step creation: Describe requirements → AI asks questions → Real-time fine-tuning → One-click create
  • Organization Management: Create and join groups, share agents, models, Skills within groups

🔧 Extensibility

  • Collaboration Modes: 4 out-of-the-box multi-Agent collaboration modes (Sequential/Parallel/Router/Loop)
  • Skill Support: Dynamically load skill packages to improve Token efficiency
  • MCP Tools: Model Context Protocol for calling external tools and services
  • Execution Engines: ClaudeCode and Agno run in sandboxed isolation, Dify via API proxy, Chat in direct mode
  • YAML Config: Kubernetes-style CRD for defining Ghost / Bot / Team / Skill
  • API: OpenAI-compatible interface for easy integration with other systems
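
A Kubernetes-style CRD definition might look like the following sketch; the `apiVersion`, kinds, and field names are assumptions for illustration, not Wegent's actual schema:

```yaml
# Hypothetical sketch of a Kubernetes-style CRD for a Team;
# field names are illustrative, not Wegent's actual schema.
apiVersion: wegent/v1
kind: Team
metadata:
  name: dev-team
spec:
  mode: sequential        # one of the four collaboration modes
  bots:
    - name: coder
      ghost: claude-coder # prompt/persona definition
      skills: [git-workflow]
```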

🚀 Quick Start

Method 1: Quick Install (Recommended)

```bash
curl -fsSL https://raw.githubusercontent.com/wecode-ai/Wegent/main/install.sh | bash
```

Then open http://localhost:3000 in your browser.

Optional: Enable RAG features with `docker compose --profile rag up -d`
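
Since the API is OpenAI-compatible (see Extensibility), existing OpenAI client code should work against the local deployment. This sketch only builds the standard chat-completions request without sending it; the base URL, path, and model name are assumptions to check against your deployment:

```python
import json
import urllib.request

BASE_URL = "http://localhost:3000/v1"  # assumed endpoint; check your deployment

payload = {
    "model": "chat-team",  # a built-in team name, used here as the model id
    "messages": [
        {"role": "user", "content": "Draw a mermaid diagram of a login flow"}
    ],
}

req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer <your-api-key>",
    },
)
# urllib.request.urlopen(req) would send the request once the stack is running.
```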

Method 2: Source Installation

If you have cloned the source code, you can run the install script directly. It will automatically detect the source environment and build images from local source:

```bash
git clone https://github.com/wecode-ai/Wegent.git
cd Wegent
./install.sh
```

Or explicitly build from local source in the source directory:

```bash
docker compose -f docker-compose.yml -f docker-compose.build.yml up -d
```

Common commands (source mode):

```bash
# View logs
docker compose -f docker-compose.yml -f docker-compose.build.yml logs -f

# Stop services
docker compose -f docker-compose.yml -f docker-compose.build.yml down

# Start services
docker compose -f docker-compose.yml -f docker-compose.build.yml up -d

# Rebuild images
docker compose -f docker-compose.yml -f docker-compose.build.yml build --no-cache
```

Method 3: Local Development Mode

If you are a developer and want fast debugging with hot reload, use local development mode:

```bash
git clone https://github.com/wecode-ai/Wegent.git
cd Wegent
./start.sh
```

Local development mode features:

  • Run services directly without Docker
  • Automatic hot reload on code changes
  • Ideal for daily development and debugging

Common commands (local dev mode):

```bash
# Start services
./start.sh

# Stop services
./start.sh --stop

# Restart services
./start.sh --restart

# Check status
./start.sh --status

# Initialize configuration
./start.sh --init

# Show help
./start.sh --help
```

📦 Built-in Agents

| Team | Purpose |
| --- | --- |
| chat-team | General AI assistant + Mermaid diagrams |
| translator | Multi-language translation |
| dev-team | Git workflow: branch → code → commit → PR |
| wiki-team | Codebase Wiki documentation generation |

🤝 Contributing

We welcome contributions! Please see our Contributing Guide for details.

📞 Support

👥 Contributors

Thanks to the following developers for their contributions and efforts to make this project better. 💪

qdaxb
Axb
feifei325
Feifei
Micro66
MicroLee
cc-yafei
YaFei Liu
FicoHu
FicoHu
johnny0120
Johnny0120
yixiangxx
Yi Xiang
kissghosts
Yanhe
joyway1978
Joyway78
moqimoqidea
Moqimoqidea
2561056571
Xuemin
parabala
Parabala
icycrystal4
Icycrystal4
maquan0927
Just Quan
kerwin612
Kerwin Bryant
junbaor
Junbaor
fingki
Fingki
fengkuizhi
Fengkuizhi
jolestar
Jolestar
qwertyerge
Erdawang
sunnights
Jake Zhang
DeadLion
Jasper Zhong
LiDaiyan
Li Daiyan
RichardoMrMu
RichardoMu
andrewzq777
Andrewzq777
graindt
Graindt
qingchengliu
Qingcheng
salt-hai
Salt-hai

Made with ❤️ by WeCode-AI Team