# tama.toml

`tama.toml` is the project configuration file, created by `tama init` in the project root.
## Full example

```toml
[project]
name = "my-project"
entrypoint = "researcher"

[models]
thinker = "anthropic:claude-opus-4-6"
writer = "anthropic:claude-sonnet-4-6"
fast = "anthropic:claude-haiku-4-5"
```

## [project]
### name (required)

```toml
[project]
name = "my-project"
```

Project name. Lowercase letters, digits, and hyphens.
### entrypoint (required)

```toml
[project]
entrypoint = "researcher"
```

The agent that `tama run` starts from when you run:
```bash
tama run "task input"
```

Override at runtime:

```bash
TAMA_ENTRYPOINT_AGENT=summarizer tama run "task input"
```

## [models]
Maps role names to concrete models. Roles are referenced in agents' `call.model.role` fields.
```toml
[models]
thinker = "anthropic:claude-opus-4-6"
writer = "anthropic:claude-sonnet-4-6"
fast = "anthropic:claude-haiku-4-5"
```

These values can be overridden by environment variables of the form `TAMA_MODEL_{ROLE}`:
```bash
# override the "thinker" role at runtime
export TAMA_MODEL_THINKER=anthropic:claude-sonnet-4-6
```

Hyphens in role names map to underscores in env vars:

```toml
my-fast = "anthropic:claude-haiku-4-5"
```

```bash
# env var
export TAMA_MODEL_MY_FAST=openai:gpt-4o-mini
```
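The mapping above (uppercase the role, replace hyphens with underscores, prefix `TAMA_MODEL_`) can be sketched in Python. The `resolve_model` function and the `models` dict are illustrative only, not tama's actual internals:

```python
import os

def resolve_model(role: str, models: dict[str, str]) -> str:
    """Resolve a model role, letting a TAMA_MODEL_{ROLE} env var win.

    Hyphens in the role name become underscores and the name is
    uppercased, per the mapping described above; otherwise the value
    from the [models] table is used.
    """
    env_var = "TAMA_MODEL_" + role.upper().replace("-", "_")
    return os.environ.get(env_var, models[role])

# values from the [models] table in tama.toml
models = {
    "thinker": "anthropic:claude-opus-4-6",
    "my-fast": "anthropic:claude-haiku-4-5",
}

os.environ["TAMA_MODEL_MY_FAST"] = "openai:gpt-4o-mini"
print(resolve_model("my-fast", models))   # env var wins
print(resolve_model("thinker", models))   # falls back to tama.toml
```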
## Model format

All model references use the `provider:model-id` format:
| Provider | Format | Example |
|---|---|---|
| Anthropic | anthropic:model-id | anthropic:claude-opus-4-6 |
| OpenAI | openai:model-id | openai:gpt-4o |
| Google | google:model-id | google:gemini-2.0-flash |
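Splitting such a reference on its first colon yields the two parts; `parse_model_ref` below is a hypothetical helper for illustration, not part of tama's API:

```python
def parse_model_ref(ref: str) -> tuple[str, str]:
    """Split a provider:model-id reference into (provider, model_id).

    Splits on the first colon only, so model ids containing further
    punctuation would still survive intact.
    """
    provider, sep, model_id = ref.partition(":")
    if not sep or not provider or not model_id:
        raise ValueError(f"expected provider:model-id, got {ref!r}")
    return provider, model_id

print(parse_model_ref("anthropic:claude-opus-4-6"))
# ('anthropic', 'claude-opus-4-6')
```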
## Environment variables

| Variable | Description |
|---|---|
| TAMA_ENTRYPOINT_AGENT | Override [project].entrypoint |
| TAMA_MODEL_{ROLE} | Override a model role (e.g. TAMA_MODEL_THINKER) |
| ANTHROPIC_API_KEY | Anthropic API key |
| OPENAI_API_KEY | OpenAI API key |
| GEMINI_API_KEY | Google Gemini API key |