Configuration
AgentiCode loads configuration from multiple sources. Higher-priority sources override lower-priority ones: CLI Options > Environment Variables > Config Files > API > Defaults
Config File Locations
Checked in order:
.awcode.json (working directory)
.awcode/config.json (working directory)
~/.awcode/config.json (home directory)
AgenticWork API /api/awcode/config (if connected)
Config File Format
```json
{
  "apiEndpoint": "https://api.agenticwork.io",
  "apiKey": "your-api-key",
  "model": "auto",
  "temperature": 0.7,
  "maxTokens": 4096,
  "maxHistoryLength": 100,
  "maxTurns": 20,
  "features": {
    "shellEnabled": true,
    "fileWriteEnabled": true,
    "webSearchEnabled": true,
    "mcpEnabled": true,
    "codeExecutionEnabled": false
  },
  "mcpServers": [
    {
      "name": "custom-server",
      "command": "node",
      "args": ["/path/to/server.js"],
      "env": {
        "API_KEY": "value"
      }
    }
  ],
  "telemetry": {
    "enabled": true,
    "endpoint": "http://localhost:4318/v1/traces"
  },
  "ui": {
    "theme": "auto",
    "showTokenUsage": true,
    "streamOutput": true
  }
}
```
Environment Variables
API Configuration
Model Configuration
Feature Flags
System Prompt
Session
Configuration Options
apiEndpoint
AgenticWork API URL for authentication and features.
apiKey
API key for authentication.
model
Default model to use. Can be a model name or preset.
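The sample config above uses the auto preset. To pin a specific model instead, set model to its name; the name below is purely illustrative and assumes your endpoint exposes an Ollama-style model identifier:

```json
{
  "model": "qwen2.5-coder:7b"
}
```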
temperature
LLM temperature (0-2). Lower = more deterministic.
maxTokens
Maximum tokens per response.
maxHistoryLength
Maximum messages to keep in history.
maxTurns
Maximum conversation turns before requiring reset.
features
Enable or disable specific features: shellEnabled, fileWriteEnabled, webSearchEnabled, mcpEnabled, and codeExecutionEnabled (see the features block in the example above).
mcpServers
Custom MCP server configurations. Each entry specifies a name, a command, its args, and an env map, as in the example above and the snippet below:
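As a sketch, here is a standalone entry following the same shape as the example above, launching the MCP reference filesystem server via npx (the package name and arguments are illustrative; adjust them to the server you actually use):

```json
{
  "mcpServers": [
    {
      "name": "filesystem",
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/workspace"],
      "env": {}
    }
  ]
}
```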
ui
UI preferences: theme, showTokenUsage, and streamOutput (see the ui block in the example above).
telemetry
Telemetry configuration: enabled and endpoint (see the telemetry block in the example above).
Provider-Specific Configuration
Ollama
AgenticWork API
Auto (Default)
Auto-detection priority:
AgenticWork API (if configured)
Ollama (if running locally)
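Under auto, configuring the AgenticWork API is enough to make it the selected provider, since it is checked first. A minimal sketch using only keys documented above:

```json
{
  "model": "auto",
  "apiEndpoint": "https://api.agenticwork.io",
  "apiKey": "your-api-key"
}
```

If no API is configured, AgentiCode falls back to a locally running Ollama instance.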
Project-Level Configuration
Create .awcode.json in your project root for project-specific settings:
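For example, a project-scoped .awcode.json that overrides only a few options (the keys are the ones documented above; the values are illustrative):

```json
{
  "model": "auto",
  "temperature": 0.2,
  "maxTurns": 30,
  "features": {
    "shellEnabled": true,
    "codeExecutionEnabled": false
  }
}
```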
AGENTICODE.md
Create AGENTICODE.md in your project root (or run /init) to provide project context:
This file is automatically loaded and provides context to the AI.
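An illustrative AGENTICODE.md follows; the exact sections AgentiCode expects are not specified here, so treat this layout as a suggestion rather than a required format:

```markdown
# Project: payments-service

## Overview
A Node.js service that handles payment intents and webhooks.

## Conventions
- TypeScript in strict mode; tests live next to sources as *.test.ts
- Run `npm test` before committing

## Things the agent should avoid
- Do not edit files under migrations/ without asking
```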