Multi-Agent Support
Zenflow can orchestrate tasks using different AI coding agents. Each agent runs in an isolated Git worktree and has access to Zencoder’s context engine, multi-repo search, Zen Rules, and Skills — regardless of which underlying model or provider powers it.
Available Agents
Zencoder CLI
Native Zencoder agent included with all plans.
Claude Code
Anthropic’s Claude Code agent.
Codex
OpenAI’s Codex agent.
Gemini
Google’s Gemini agent.
Custom Models
Connect your own model endpoints — local, VPC, or third-party — to use them alongside or instead of built-in agents.
Custom Models
Bring your own model or private endpoint
Model Flexibility
Zencoder supports models from multiple providers:

| Provider | Models |
|---|---|
| Anthropic | Haiku 4.5, Sonnet 4, Sonnet 4.5, Opus 4.1, Opus 4.5, Opus 4.6 |
| OpenAI | GPT-5.1-Codex, GPT-5.1-Codex-mini |
| Google | Gemini Pro 3.0 |
| xAI | Grok Code Fast 1 |
| Zencoder | Auto (routed mix), Auto+ |
Custom Hosted Models
You can connect your own model endpoints — local, VPC, or third-party — using `settings.json`. This lets you:
- Run open-source models locally via Ollama or vLLM
- Use Azure OpenAI, Vertex AI, or other cloud-hosted endpoints
- Point to private inference servers inside your network
- Hide the default model catalog and only expose approved models
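As a rough illustration, a custom endpoint entry in `settings.json` might look like the sketch below. The key names (`customModels`, `baseUrl`, `apiKeyEnv`, `hideDefaultModels`) are illustrative assumptions, not the documented Zencoder schema — check the settings reference for the exact keys your version supports.

```json
{
  // Hypothetical keys, for illustration only (JSONC-style comments shown)
  "customModels": [
    {
      // Local open-source model served via Ollama's OpenAI-compatible endpoint
      "name": "local-llama",
      "baseUrl": "http://localhost:11434/v1",
      "apiKeyEnv": "OLLAMA_API_KEY"
    },
    {
      // Cloud-hosted endpoint, e.g. Azure OpenAI inside your VPC
      "name": "azure-gpt",
      "baseUrl": "https://my-instance.openai.azure.com",
      "apiKeyEnv": "AZURE_OPENAI_KEY"
    }
  ],
  // Hide the built-in model catalog so only approved models appear
  "hideDefaultModels": true
}
```

Referencing API keys through environment variable names (rather than inlining secrets) keeps the settings file safe to commit.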
Shared Capabilities
All agents running through Zenflow get access to:

| Feature | Description |
|---|---|
| Repo Info Agent | Codebase indexing and understanding |
| Multi-Repo Search | Cross-repository code search |
| Zen Rules | Project-level coding standards |
| Skills | Reusable prompt templates |
| MCP Integrations | External tool access (databases, APIs, Jira, etc.) |
| Analytics | Usage and productivity tracking |
Selecting an Agent in Zenflow
1. Open Settings → Default agents in Zenflow
2. Set the default agent for new tasks