Multi-Agent Support

Zenflow can orchestrate tasks using different AI coding agents. Each agent runs in an isolated Git worktree and has access to Zencoder’s context engine, multi-repo search, Zen Rules, and Skills — regardless of which underlying model or provider powers it.

Available Agents

Zencoder CLI

Native Zencoder agent included with all plans.

Claude Code

Anthropic’s Claude Code agent.

Codex

OpenAI’s Codex agent.

Gemini

Google’s Gemini agent.

Custom Models

Connect your own model endpoints — local, VPC, or third-party — to use them alongside or instead of built-in agents.
Model Flexibility

Zencoder supports models from multiple providers:
| Provider  | Models |
| --------- | ------ |
| Anthropic | Haiku 4.5, Sonnet 4, Sonnet 4.5, Opus 4.1, Opus 4.5, Opus 4.6 |
| OpenAI    | GPT-5.1-Codex, GPT-5.1-Codex-mini |
| Google    | Gemini Pro 3.0 |
| xAI       | Grok Code Fast 1 |
| Zencoder  | Auto (routed mix), Auto+ |
Select a model per chat session from the model dropdown. Different models carry different cost multipliers. See Models for details.

Custom Hosted Models

You can connect your own model endpoints — local, VPC, or third-party — using settings.json. This lets you:
  • Run open-source models locally via Ollama or vLLM
  • Use Azure OpenAI, Vertex AI, or other cloud-hosted endpoints
  • Point to private inference servers inside your network
  • Hide the default model catalog and only expose approved models
See Custom Models Configuration for the full setup guide.
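
As an illustrative sketch only: the key names below (`customModels`, `baseUrl`, `hideDefaultModels`, and the rest) are assumptions, not Zencoder's documented schema, so treat this as a shape of what a settings.json entry might look like and rely on the Custom Models Configuration guide for the real keys. The example registers a local Ollama server, which exposes an OpenAI-compatible API at `/v1`:

```json
{
  "customModels": [
    {
      "name": "llama-3-local",
      "provider": "openai-compatible",
      "baseUrl": "http://localhost:11434/v1",
      "model": "llama3"
    }
  ],
  "hideDefaultModels": true
}
```

Here `hideDefaultModels` stands in for the "hide the default model catalog" option mentioned above; if the actual configuration uses a different key or structure, the guide takes precedence.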

Shared Capabilities

All agents running through Zenflow get access to:
| Feature           | Description |
| ----------------- | ----------- |
| Repo Info Agent   | Codebase indexing and understanding |
| Multi-Repo Search | Cross-repository code search |
| Zen Rules         | Project-level coding standards |
| Skills            | Reusable prompt templates |
| MCP Integrations  | External tool access (databases, APIs, Jira, etc.) |
| Analytics         | Usage and productivity tracking |

Selecting an Agent in Zenflow

  1. Open Settings → Default agents in Zenflow
  2. Set the default agent for new tasks