Configuration
To connect the AIGNE CLI with various AI model providers, you need to provide the correct credentials and settings. This guide walks you through the setup process using environment variables, which is the primary method of configuration for AIGNE.
All configuration is managed in a `.env.local` file at the root of your project. This file keeps your sensitive API keys separate from your codebase.
Setting Up Your Environment
First, you need to create the configuration file. The AIGNE CLI includes an example file to get you started. Simply copy it to a new file named `.env.local`:
cp .env.local.example .env.local
Now, you can edit the `.env.local` file to add your API keys and select a model.
Here is the content of the `.env.local.example` file for your reference:
# Change the name of this file to .env.local and fill in the following values
# Uncomment the lines below to enable debug logging
# DEBUG="aigne:*"
# Use different Models
# OpenAI
MODEL="openai:gpt-4.1"
OPENAI_API_KEY="YOUR_OPENAI_API_KEY"
# Anthropic claude
# MODEL="anthropic:claude-3-7-sonnet-latest"
# ANTHROPIC_API_KEY=""
# Gemini
# MODEL="gemini:gemini-2.0-flash"
# GEMINI_API_KEY=""
# Bedrock nova
# MODEL=bedrock:us.amazon.nova-premier-v1:0
# AWS_ACCESS_KEY_ID=""
# AWS_SECRET_ACCESS_KEY=""
# AWS_REGION=us-west-2
# DeepSeek
# MODEL="deepseek:deepseek-chat"
# DEEPSEEK_API_KEY=""
# OpenRouter
# MODEL="openrouter:openai/gpt-4o"
# OPEN_ROUTER_API_KEY=""
# xAI
# MODEL="xai:grok-2-latest"
# XAI_API_KEY=""
# Ollama
# MODEL="ollama:llama3.2"
# OLLAMA_DEFAULT_BASE_URL="http://localhost:11434/v1"
# Setup proxy if needed
# HTTPS_PROXY=http://localhost:7890
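If you want to confirm what values the file defines, you can source it manually in a shell. This is an illustrative sketch only (the AIGNE CLI reads `.env.local` on its own); it writes a minimal file and loads it into the current session:

```shell
# Illustrative only: create a minimal .env.local and source it to inspect
# the values it defines. Not required when running the AIGNE CLI itself.
printf 'MODEL="openai:gpt-4.1"\nOPENAI_API_KEY="YOUR_OPENAI_API_KEY"\n' > .env.local

set -a            # auto-export every variable assigned while sourcing
. ./.env.local
set +a

echo "$MODEL"     # openai:gpt-4.1
```

Sourcing with `set -a` exports the variables so child processes can see them, which mirrors how environment-based configuration is consumed.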
Configuring Model Providers
The `MODEL` environment variable determines which AI model provider and specific model will be used when you run an Agent. The format is typically `provider:model-name`.
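The split happens at the first colon, which matters for providers like Bedrock whose model names themselves contain colons. A quick sketch using plain shell parameter expansion (the CLI's own parsing may differ):

```shell
# Split MODEL on the first colon: provider before it, model name after.
MODEL="bedrock:us.amazon.nova-premier-v1:0"
provider="${MODEL%%:*}"   # strip everything from the first colon onward
model="${MODEL#*:}"       # strip up to and including the first colon

echo "$provider"   # bedrock
echo "$model"      # us.amazon.nova-premier-v1:0
```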
To use a specific provider, uncomment its `MODEL` and API key variables in your `.env.local` file and fill in your credentials. The table below details the supported providers and the environment variables required for each.
| Provider | Example `MODEL` Value | Required Environment Variables |
|---|---|---|
| OpenAI | `openai:gpt-4.1` | `OPENAI_API_KEY` |
| Anthropic | `anthropic:claude-3-7-sonnet-latest` | `ANTHROPIC_API_KEY` |
| Gemini | `gemini:gemini-2.0-flash` | `GEMINI_API_KEY` |
| AWS Bedrock | `bedrock:us.amazon.nova-premier-v1:0` | `AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`, `AWS_REGION` |
| DeepSeek | `deepseek:deepseek-chat` | `DEEPSEEK_API_KEY` |
| OpenRouter | `openrouter:openai/gpt-4o` | `OPEN_ROUTER_API_KEY` |
| xAI | `xai:grok-2-latest` | `XAI_API_KEY` |
| Ollama | `ollama:llama3.2` | `OLLAMA_DEFAULT_BASE_URL` |
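A common mistake is selecting a provider but leaving its key blank. As a quick pre-flight check, a hypothetical helper (not part of the CLI) can map the provider prefix of `MODEL` to the key variable it needs and report whether that variable is set:

```shell
# Hypothetical pre-flight check: map the provider prefix of MODEL to the
# API key variable it requires, then report whether that variable is set.
MODEL="deepseek:deepseek-chat"

case "${MODEL%%:*}" in
  openai)     key_name="OPENAI_API_KEY" ;;
  anthropic)  key_name="ANTHROPIC_API_KEY" ;;
  gemini)     key_name="GEMINI_API_KEY" ;;
  deepseek)   key_name="DEEPSEEK_API_KEY" ;;
  openrouter) key_name="OPEN_ROUTER_API_KEY" ;;
  xai)        key_name="XAI_API_KEY" ;;
  *)          key_name="" ;;
esac

if [ -n "$key_name" ]; then
  # Indirect lookup: read the value of the variable named in $key_name.
  eval "key_value=\${$key_name}"
  if [ -n "$key_value" ]; then
    echo "$key_name is set"
  else
    echo "$key_name is missing"
  fi
fi
```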
Proxy Configuration
If you are behind a corporate firewall or need to route network traffic through a proxy, you can set the `HTTPS_PROXY` environment variable in your `.env.local` file.
# Setup proxy if needed
HTTPS_PROXY=http://localhost:7890
AIGNE CLI also recognizes standard system-level proxy variables like `http_proxy`, `https_proxy`, and `all_proxy` if they are set in your shell environment.
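For a one-off session you can also export the proxy variables directly in your shell rather than in `.env.local`. The address below mirrors the example above; substitute your own proxy's host and port:

```shell
# Export proxy variables for the current shell session only.
export HTTPS_PROXY="http://localhost:7890"
export https_proxy="$HTTPS_PROXY"   # lowercase variant, also recognized
export all_proxy="$HTTPS_PROXY"     # catch-all for other protocols
```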
With your environment configured, you're ready to start building. The next step is to create your first project and run an Agent.
Continue to our Getting Started guide to learn how.