
Configuration Overview

Ash uses TOML configuration files to define models, providers, and behavior.

Configuration File Location

Ash looks for a configuration file in the following locations, using the first one it finds:

  1. ./config.toml (current directory)
  2. ~/.ash/config.toml (user home)
  3. /etc/ash/config.toml (system-wide)

Run ash init to create your configuration file.

Full Example Configuration

# Workspace directory for SOUL.md and USER.md
workspace = "~/.ash/workspace"

# LLM Provider API Keys
[anthropic]
api_key = "sk-ant-..." # Or use ANTHROPIC_API_KEY env var

[openai]
api_key = "sk-..." # Or use OPENAI_API_KEY env var

# Named model configurations (Haiku for fast/cheap, Sonnet for complex tasks)
[models.default]
provider = "anthropic"
model = "claude-haiku-4-5"
temperature = 0.7
max_tokens = 4096

[models.sonnet]
provider = "anthropic"
model = "claude-sonnet-4-5"
max_tokens = 8192

# Telegram bot integration
[telegram]
bot_token = "123456:ABC..." # Or use TELEGRAM_BOT_TOKEN env var
allowed_users = ["@yourusername", "123456789"]
allowed_groups = []
group_mode = "mention"

# Docker sandbox settings
[sandbox]
image = "ash-sandbox:latest"
timeout = 60
memory_limit = "512m"
cpu_limit = 1.0
runtime = "runc"
network_mode = "bridge"
workspace_access = "rw"

# HTTP server settings
[server]
host = "127.0.0.1"
port = 8080
webhook_path = "/webhook"

# Memory and context settings
[memory]
database_path = "~/.ash/memory.db"
max_context_messages = 20
context_token_budget = 100000
recency_window = 10
system_prompt_buffer = 8000
compaction_enabled = true
extraction_enabled = true

# Session management
[sessions]
mode = "persistent" # "persistent" or "fresh"
max_concurrent = 2 # Parallel session processing limit

# Agent-specific overrides
[agents.research]
model = "sonnet"
max_iterations = 50

# Embeddings for semantic search
[embeddings]
provider = "openai"
model = "text-embedding-3-small"

# Web search integration
[brave_search]
api_key = "..." # Or use BRAVE_SEARCH_API_KEY env var

# Error tracking (optional)
[sentry]
dsn = "https://..." # Or use SENTRY_DSN env var
environment = "production"
traces_sample_rate = 0.1

Environment Variables

API keys can be set via environment variables instead of the config file:

Variable               Config Path
ANTHROPIC_API_KEY      [anthropic].api_key
OPENAI_API_KEY         [openai].api_key
TELEGRAM_BOT_TOKEN     [telegram].bot_token
BRAVE_SEARCH_API_KEY   [brave_search].api_key
SENTRY_DSN             [sentry].dsn

Environment variables take precedence over config file values.

Minimal Configuration

Ash requires both Anthropic (for the LLM) and OpenAI (for memory embeddings) API keys. The minimal config:

[anthropic]
api_key = "sk-ant-..."
[openai]
api_key = "sk-..."
[models.default]
provider = "anthropic"
model = "claude-haiku-4-5"
[embeddings]
provider = "openai"
model = "text-embedding-3-small"

For a better experience, add a “sonnet” alias for complex tasks:

[models.sonnet]
provider = "anthropic"
model = "claude-sonnet-4-5"

Configuration Commands

Create configuration file:

uv run ash init

View current configuration:

uv run ash config show

Validate configuration:

uv run ash config validate

Configuration Sections

Each section has full documentation in the Systems pages:

Section          Purpose                    Documentation
[models.*]       LLM model definitions      LLM Providers
[telegram]       Telegram bot settings      Providers
[sandbox]        Docker sandbox settings    Sandbox
[server]         HTTP server settings       Providers
[memory]         Memory and context         Memory
[sessions]       Session management         Providers
[embeddings]     Semantic search            Memory
[skills]         Skill sources and config   Skills
[agents.*]       Agent overrides            Agents
[brave_search]   Web search                 Configuration Reference
[sentry]         Error tracking             Configuration Reference