
Overview

The AIGNE Command Line Interface (CLI) is a comprehensive tool for managing your AIGNE Framework projects directly from your terminal. It provides commands to create, run, test, and deploy your AI Agent, helping to streamline the entire development lifecycle from initial setup to production integration.

How It Works

The AIGNE CLI acts as the bridge between your project's configuration and the AIGNE runtime engine. When you execute a command like aigne run, the CLI first parses your project's aigne.yaml file to understand its structure, including the defined Agent and tools. It then leverages the @aigne/core library to load these components, establish connections with the specified AI models, and manage the state of the conversation or task. For performance, the CLI uses a Bun wrapper to ensure a fast runtime, automatically falling back to npx if Bun is not installed on your system.
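To make the flow above concrete, here is a minimal sketch of what an `aigne.yaml` project file might contain. The field names (`chat_model`, `agents`) and values are illustrative assumptions, not the authoritative schema; consult a project generated by `aigne create` for the exact format.

```yaml
# Hypothetical aigne.yaml sketch — field names are illustrative assumptions.
chat_model:
  provider: openai   # which AI model provider the runtime connects to
  name: gpt-4o-mini  # model identifier passed to that provider
agents:
  - chat.yaml        # Agent definitions the CLI loads via @aigne/core
```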

The diagram below illustrates how the CLI interacts with other parts of the AIGNE ecosystem.

[Diagram] The Developer uses the AIGNE CLI (`aigne [command]`), which executes the commands:

- `aigne create` — creates new AIGNE project files
- `aigne run` — loads the AIGNE Runtime
- `aigne test` — utilizes the Test Runner (`node --test`)
- `aigne serve-mcp` — exposes the Agent via an MCP Server over the HTTP/MCP interface
- `aigne observe` — displays data from the AIGNE Observability UI

The Runtime loads the `@aigne/core` engine, which connects to AI model providers (OpenAI, Anthropic, etc.).


Key Features

The CLI is designed to cover all stages of Agent development. Here are its main capabilities:

| Feature | Command | Description |
| --- | --- | --- |
| Project Scaffolding | `aigne create` | Initializes a new AIGNE project from a pre-defined template, setting up a standard directory structure and configuration files. |
| Agent Execution | `aigne run` | Runs an Agent from local files or a remote URL, with an interactive chat mode for direct testing and conversation. |
| Integrated Testing | `aigne test` | Executes `node:test` test suites within your project to verify the functionality of your Agent and tools. |
| MCP Service | `aigne serve-mcp` | Exposes your Agent as a Model Context Protocol (MCP) server, allowing it to be integrated with other applications and services. |
| Performance Monitoring | `aigne observe` | Launches the AIGNE Observability web server to monitor and analyze Agent execution traces and data. |
| Multi-Model Support | (Configuration) | Supports a wide range of AI model providers, including OpenAI, Anthropic, Bedrock, Gemini, XAI, Ollama, and more. |
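Taken together, a typical development loop with these commands might look like the shell sketch below. Only the bare commands come from this page; the project name and the order of steps are illustrative assumptions.

```shell
# Scaffold a new project from a template ("my-agent" is an illustrative name)
aigne create my-agent
cd my-agent

# Run the Agent defined in aigne.yaml (loaded via @aigne/core)
aigne run

# Execute the project's node:test suites
aigne test

# Expose the Agent as a Model Context Protocol server
aigne serve-mcp

# Launch the Observability web UI to inspect execution traces
aigne observe
```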

This overview outlines what the AIGNE CLI can do. The next step is to install it and build your first Agent.

➡️ Next: Getting Started