The open-source coding agent you control
Every token is yours. No injected prompts, tools or telemetry. Batteries included.
$ cargo install aether

Get up and running in ~5 minutes
From install to running agent — everything is plain config files and markdown.
1. Install
Install the CLI from crates.io.
$ cargo install aether-agent-cli

2. Create your first agent
Scaffold config files, tools, and a default agent prompt.
$ aether agent new
✓ Created .aether/settings.json
✓ Created .aether/mcp.json
✓ Created .aether/SYSTEM.md
✓ Created AGENTS.md

3. Run it your way: TUI, IDE/editor, or headless
Launch the interactive TUI, run headless for scripts and CI, or expose over ACP for editor integration.
$ aether
$ aether acp
$ aether headless "refactor auth"

Use any LLM, cloud or local
Plus, you can bring your own.
Cloud
Anthropic, OpenAI, OpenRouter, DeepSeek, Moonshot, Zai, LlamaCpp and more are supported out of the box.
{
  "agents": [
    { "name": "Plan", "model": "codex:gpt-5.4", "reasoningEffort": "high" },
    { "name": "Build", "model": "zai:glm-5.1" },
    { "name": "Fast", "model": "openrouter:minimax/minimax-m2.7" }
  ]
}

Local
Run local models with llama.cpp or Ollama.
{
  "agents": [
    ...
    { "name": "Local", "model": "llamacpp:unsloth/Qwen3.5-9B-GGUF" }
  ]
}

Combine models by alloying them together
Combine the strengths of multiple models by round-robining between them each conversation turn. You can even mix cloud and local models together.
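As an illustration of the round-robin idea (a hypothetical sketch, not Aether's actual implementation), selecting which model in an alloy handles a given turn could look like:

```rust
// Hypothetical sketch: per-turn round-robin over an alloy's model list.
// The model names mirror the config examples on this page; nothing here
// is Aether's real code.
fn pick_model<'a>(alloy: &'a [&'a str], turn: usize) -> &'a str {
    alloy[turn % alloy.len()]
}

fn main() {
    let alloy = ["zai:glm-5.1", "llamacpp:unsloth/Qwen3.5-9B-GGUF"];
    for turn in 0..4 {
        // Alternates between the cloud and the local model each turn.
        println!("turn {turn}: {}", pick_model(&alloy, turn));
    }
}
```

Because selection depends only on the turn index, an alloy can freely mix cloud and local models without any shared state between providers.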
{
  "agents": [
    ...
    { "name": "Alloy", "model": "zai:glm-5.1,llamacpp:unsloth/Qwen3.5-9B-GGUF" }
  ]
}

Control every token in your agent’s prompt
Start with a blank slate and add only what you need.
Start with a blank slate
Agents begin with no system prompt or tools.
$ aether show-prompt -a Build
---
Prompt chars: 0
Tool schema chars: 0
Est. tokens: ~0
MCP tools: 0

Use any markdown files you want
Add a prompts array to pull in markdown files — system instructions, agent guides, whatever you want.
{
  "agents": [
    {
      "name": "Build",
      "model": "zai:glm-5.1",
      "reasoningEffort": "high",
      "prompts": ["SYSTEM.md", "AGENTS.md"]
    }
  ]
}

# System Prompt
You are an expert senior Rust engineer.
...

# Agents
Use the Explorer agent to search code.
...

View the fully rendered system prompt with a single command
So you don't waste tokens.
$ aether show-prompt -a Build

# System Prompt
You are an expert senior Rust engineer.
...

# Agents
Use the Explorer agent to search code.
...

---
Prompt chars: 120
Tool schema chars: 0
Est. tokens: ~30
MCP tools: 0

Connect any tools you want (via MCP)
Aether agents get tools exclusively via MCP. Add what you need, nothing more.
Batteries included MCPs
Aether includes several built-in MCP servers you can optionally enable. They run over a custom in-memory transport.
coding — file I/O, grep, bash, and web fetch
lsp — language-aware completions and diagnostics
skills — reusable domain knowledge as slash commands
subagents — spawn parallel child agents with their own models and tools
tasks — persistent task tracking across turns
survey — let the agent ask you questions with structured output
proxy — prevent tool bloat via progressive discovery
{
  "agents": [
    {
      "name": "Build",
      "model": "zai:glm-5.1",
      "prompts": ["SYSTEM.md", "AGENTS.md"],
      "mcpServers": "mcp.json"
    }
  ]
}

{
  "servers": {
    "coding": { "type": "in-memory" },
    "lsp": { "type": "in-memory" },
    "skills": { "type": "in-memory" },
    "subagents": { "type": "in-memory" },
    "tasks": { "type": "in-memory" },
    "survey": { "type": "in-memory" },
    "proxy": { "type": "in-memory", "input": { ... } }
  }
}

Any MCP server
Plug in any external MCP server via stdio, SSE, or streamable HTTP. Remote server OAuth credentials are stored securely in your keychain.
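As a sketch of what a stdio entry might look like (the `type`, `command`, and `args` field names are assumptions here, modeled on the in-memory and http entries shown on this page), a locally spawned server could be configured as:

```json
{
  "servers": {
    "filesystem": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "."]
    }
  }
}
```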
{
  "servers": {
    ...
    "linear": { "type": "http", "url": "https://mcp.linear.app/mcp" }
  }
}

Ready to build?
Use the full CLI or pick individual crates — aether-core, llm, mcp-servers, wisp, crucible — to compose your own agent.