# Environment Variables

All environment variables available to agent pods.


## Operator-injected variables

These are set by the operator based on the ArkAgent and ArkTeam spec. Do not set these manually — the operator manages them.

| Variable | Description |
|----------|-------------|
| `AGENT_MODEL` | Model ID from `spec.model` (e.g. `llama3.2`, `gpt-4o`) |
| `AGENT_SYSTEM_PROMPT` | Resolved system prompt (inline or from a ConfigMap, merged with ArkSettings fragments) |
| `AGENT_MCP_SERVERS` | JSON array of `{name, url, headers}` objects, one entry per MCP server in `spec.mcpServers` |
| `AGENT_WEBHOOK_TOOLS` | JSON array of inline webhook tool definitions from `spec.tools` |
| `AGENT_MAX_TOKENS` | From `spec.limits.maxTokensPerCall` (default: `8000`) |
| `AGENT_TIMEOUT_SECONDS` | From `spec.limits.timeoutSeconds` (default: `120`) |
| `AGENT_MAX_CONCURRENT_TASKS` | From `spec.limits.maxConcurrentTasks` (default: `5`) |
| `TASK_QUEUE_URL` | Redis connection string for the task queue (e.g. `redis://redis.ark-system.svc.cluster.local:6379`) |
| `STREAM_CHANNEL_URL` | Redis connection string for the streaming channel. Defaults to `TASK_QUEUE_URL`. |
| `AGENT_TEAM_ROUTES` | (ArkTeam context) JSON map of role → queue URL used for `delegate()` tool routing |
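As a concrete illustration of how a runtime might consume these variables, here is a minimal Python sketch that parses the JSON-valued and numeric variables with the defaults listed above. The `load_agent_config` helper is hypothetical — only the variable names and defaults come from this page.

```python
import json
import os

def load_agent_config(env=None):
    """Parse operator-injected variables with the documented defaults.
    Illustrative only; not part of the actual agent runtime."""
    env = os.environ if env is None else env
    return {
        "model": env.get("AGENT_MODEL", ""),
        "system_prompt": env.get("AGENT_SYSTEM_PROMPT", ""),
        "mcp_servers": json.loads(env.get("AGENT_MCP_SERVERS", "[]")),
        "webhook_tools": json.loads(env.get("AGENT_WEBHOOK_TOOLS", "[]")),
        "max_tokens": int(env.get("AGENT_MAX_TOKENS", "8000")),
        "timeout_seconds": int(env.get("AGENT_TIMEOUT_SECONDS", "120")),
        "max_concurrent_tasks": int(env.get("AGENT_MAX_CONCURRENT_TASKS", "5")),
        "task_queue_url": env.get("TASK_QUEUE_URL", ""),
        # STREAM_CHANNEL_URL falls back to TASK_QUEUE_URL, per the table.
        "stream_channel_url": env.get("STREAM_CHANNEL_URL")
                              or env.get("TASK_QUEUE_URL", ""),
    }

cfg = load_agent_config({
    "AGENT_MODEL": "llama3.2",
    "AGENT_MCP_SERVERS": '[{"name": "search", "url": "http://mcp.svc:8080", "headers": {}}]',
    "TASK_QUEUE_URL": "redis://redis.ark-system.svc.cluster.local:6379",
})
print(cfg["mcp_servers"][0]["name"])   # search
print(cfg["max_tokens"])               # 8000
print(cfg["stream_channel_url"])       # redis://redis.ark-system.svc.cluster.local:6379
```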

### Memory variables (operator-injected when `memoryRef` is set)

| Variable | Description |
|----------|-------------|
| `AGENT_MEMORY_BACKEND` | `in-context`, `redis`, or `vector-store` |
| `AGENT_MEMORY_REDIS_URL` | Redis connection URL (from the referenced Secret; `redis` backend) |
| `AGENT_MEMORY_REDIS_TTL` | TTL in seconds |
| `AGENT_MEMORY_REDIS_MAX_ENTRIES` | Maximum number of stored entries |
| `AGENT_MEMORY_VECTOR_STORE_PROVIDER` | `qdrant`, `pinecone`, or `weaviate` |
| `AGENT_MEMORY_VECTOR_STORE_ENDPOINT` | Base URL of the vector store |
| `AGENT_MEMORY_VECTOR_STORE_COLLECTION` | Collection name |
| `AGENT_MEMORY_VECTOR_STORE_API_KEY` | From the referenced Secret (if set) |
| `AGENT_MEMORY_VECTOR_STORE_TTL` | TTL in seconds |
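To make the per-backend split concrete, the following sketch groups these variables by backend. The grouping, validation, and `memory_settings` helper are assumptions for illustration; only the variable names and the three backend values come from the table.

```python
def memory_settings(env):
    """Group memory variables by backend. Hypothetical helper; only the
    variable names and backend values are from the docs."""
    backend = env.get("AGENT_MEMORY_BACKEND", "in-context")
    if backend == "in-context":
        return {"backend": backend}
    if backend == "redis":
        return {
            "backend": backend,
            "url": env["AGENT_MEMORY_REDIS_URL"],
            "ttl": int(env["AGENT_MEMORY_REDIS_TTL"])
                   if "AGENT_MEMORY_REDIS_TTL" in env else None,
            "max_entries": int(env["AGENT_MEMORY_REDIS_MAX_ENTRIES"])
                           if "AGENT_MEMORY_REDIS_MAX_ENTRIES" in env else None,
        }
    if backend == "vector-store":
        return {
            "backend": backend,
            "provider": env["AGENT_MEMORY_VECTOR_STORE_PROVIDER"],
            "endpoint": env["AGENT_MEMORY_VECTOR_STORE_ENDPOINT"],
            "collection": env["AGENT_MEMORY_VECTOR_STORE_COLLECTION"],
            "api_key": env.get("AGENT_MEMORY_VECTOR_STORE_API_KEY"),
        }
    raise ValueError(f"unknown AGENT_MEMORY_BACKEND: {backend}")

print(memory_settings({
    "AGENT_MEMORY_BACKEND": "redis",
    "AGENT_MEMORY_REDIS_URL": "redis://memory.svc:6379",
    "AGENT_MEMORY_REDIS_TTL": "3600",
})["ttl"])  # 3600
```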

## User-configurable variables

Set these via the Kubernetes Secret referenced by `apiKeys.existingSecret`, or via `agentExtraEnv` in Helm values. They are forwarded to every agent pod.

### LLM provider credentials

| Variable | Description |
|----------|-------------|
| `ANTHROPIC_API_KEY` | API key for the Anthropic provider. Required when any agent uses a `claude-*` model. |
| `OPENAI_API_KEY` | API key for the OpenAI provider. Required when any agent uses a `gpt-*`/`o*` model. When using Ollama, set it to any non-empty value (e.g. `ollama`). |
| `OPENAI_BASE_URL` | Override the OpenAI-compatible API endpoint. Set to `http://ollama.svc:11434/v1` for Ollama, or any other compatible endpoint. |

### Provider selection

| Variable | Description |
|----------|-------------|
| `AGENT_PROVIDER` | Override provider selection: `auto` (default), `anthropic`, `openai`, or `mock`. With `auto`, the agent runtime infers the provider from the model name. |
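The `auto` behaviour can be pictured with a small sketch. The prefix rules below are inferred from the credentials table (`claude-*` → Anthropic, `gpt-*`/`o*` → OpenAI) and are an assumption, not the runtime's exact logic:

```python
def select_provider(model: str, override: str = "auto") -> str:
    """Pick a provider from AGENT_PROVIDER and the model name.
    Prefix rules are assumptions inferred from the credentials table."""
    if override != "auto":
        return override  # explicit anthropic / openai / mock
    if model.startswith("claude-"):
        return "anthropic"
    if model.startswith("gpt-") or model.startswith("o"):
        return "openai"
    # Anything else (e.g. llama3.2) goes to the OpenAI-compatible endpoint.
    return "openai"

print(select_provider("claude-3-5-sonnet"))   # anthropic
print(select_provider("gpt-4o"))              # openai
print(select_provider("llama3.2", "mock"))    # mock
```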

### Observability

| Variable | Description |
|----------|-------------|
| `OTEL_EXPORTER_OTLP_ENDPOINT` | OTel collector endpoint (e.g. `http://jaeger.monitoring.svc:4318`). Setting it enables distributed tracing and metrics export. |
| `OTEL_SERVICE_NAME` | Service name reported in traces. Default: `ark-agent`. |
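The "enabled only when set" behaviour can be sketched as below; the `otel_config` helper is hypothetical, while the variable names and the `ark-agent` default are from the table:

```python
import os

def otel_config(env=None):
    """Return tracing settings, or None when no collector endpoint is set.
    Hypothetical helper illustrating the table above."""
    env = os.environ if env is None else env
    endpoint = env.get("OTEL_EXPORTER_OTLP_ENDPOINT")
    if not endpoint:
        return None  # tracing and metrics export stay disabled
    return {
        "endpoint": endpoint,
        "service_name": env.get("OTEL_SERVICE_NAME", "ark-agent"),
    }

print(otel_config({}))  # None
print(otel_config({"OTEL_EXPORTER_OTLP_ENDPOINT": "http://jaeger.monitoring.svc:4318"}))
```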

## Setting user-configurable variables

### Via Helm values

```shell
helm upgrade ark-operator arkonis/ark-operator \
  --set "agentExtraEnv[0].name=AGENT_PROVIDER" \
  --set "agentExtraEnv[0].value=openai" \
  --set "agentExtraEnv[1].name=OPENAI_BASE_URL" \
  --set "agentExtraEnv[1].value=http://ollama.svc:11434/v1" \
  --set "agentExtraEnv[2].name=OPENAI_API_KEY" \
  --set "agentExtraEnv[2].value=ollama"
```

Quoting each `--set` argument keeps the `[0]` index brackets from being interpreted by the shell.

### Via Kubernetes Secret

```shell
kubectl create secret generic ark-api-keys \
  --from-literal=OPENAI_API_KEY=sk-... \
  --from-literal=ANTHROPIC_API_KEY=sk-ant-... \
  --namespace my-org
```

Reference the secret in Helm:

```shell
helm upgrade ark-operator arkonis/ark-operator \
  --set apiKeys.existingSecret=ark-api-keys
```

Apache 2.0 · ARKONIS