# aigne serve-mcp

The `aigne serve-mcp` command transforms your AIGNE project into a server that speaks the Model Context Protocol (MCP). This makes your agents accessible over HTTP, allowing them to be integrated as tools into other applications, AI systems, or services that support MCP.
By exposing your agents via a standardized protocol, you can create modular, interoperable AI components that can be called just like any other API endpoint.
## How It Works
When you run `aigne serve-mcp`, the CLI starts a lightweight Express.js server that listens for incoming HTTP POST requests on the configured endpoint. Here's a simplified flow of what happens when a request comes in:

1. A client sends an MCP request (a JSON-RPC message) to the endpoint via HTTP POST.
2. The server matches the requested tool name to the agent registered under that name.
3. The agent runs with the arguments supplied in the request.
4. The agent's output is returned to the client as the tool result.
Essentially, each agent in your project is registered as a callable tool within the MCP server, using the agent's name as the tool name and its `inputSchema` to define the expected parameters.
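For example, an MCP client invokes one of your agents by POSTing a JSON-RPC `tools/call` message to the endpoint. The sketch below is illustrative only: the agent name `chat` and its `message` argument are placeholders for whatever your project defines, and in practice the HTTP transport expects an `initialize` handshake first (plus an `Mcp-Session-Id` header if the server assigns one), so most integrations use an MCP client library rather than raw `curl`.

```bash
# Illustrative only: "chat" and its "message" argument are placeholders
# for an agent defined in your own AIGNE project.
curl -X POST http://localhost:3000/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
      "name": "chat",
      "arguments": { "message": "Hello!" }
    }
  }'
```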
## Usage
To start the MCP server, navigate to your project directory and run the command:
```bash
# Start MCP server with default settings
aigne serve-mcp
```
## Options
The `serve-mcp` command offers several options to customize the server's behavior.
| Option | Description | Default |
|---|---|---|
| `--path` | The local path or a remote URL to your AIGNE project directory. | `.` (current directory) |
| `--host` | The network host to bind the server to. Use `0.0.0.0` to allow access from other machines on your network. | `localhost` |
| `--port` | The port number for the server to listen on. | `3000` |
| `--pathname` | The specific URL path for the MCP service endpoint. | `/mcp` |
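These options can be combined. As a sketch, the command below serves an example project directory on all network interfaces at a non-default port:

```bash
# Serve ./my-awesome-project on all interfaces, port 8080
aigne serve-mcp --path ./my-awesome-project --host 0.0.0.0 --port 8080
```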
## Examples
1. Running on a different port
If the default port `3000` is in use, you can specify another one.

```bash
aigne serve-mcp --port 3001
```

After running, you will see a confirmation message:

```
MCP server is running on http://localhost:3001/mcp
```
2. Serving a project from a specific directory
If you are not in the project directory, you can point the CLI to it using the `--path` option.

```bash
aigne serve-mcp --path ./my-awesome-project
```
3. Exposing the server publicly
To allow other machines on your network to access the server, use `0.0.0.0` as the host.

```bash
aigne serve-mcp --host 0.0.0.0
```

Your server will then be accessible at `http://<your-ip-address>:3000/mcp`.
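As a quick reachability check from another machine, you can send an MCP `initialize` request with `curl`. This is a sketch: the protocol version and client info are illustrative values, and the exact response shape depends on the server's MCP implementation, but any valid JSON-RPC reply confirms the endpoint is up.

```bash
# Reachability check: send an MCP initialize request to the exposed endpoint.
# Replace <your-ip-address> with the server machine's LAN address.
curl -X POST http://<your-ip-address>:3000/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
      "protocolVersion": "2024-11-05",
      "capabilities": {},
      "clientInfo": { "name": "curl-check", "version": "0.0.0" }
    }
  }'
```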
Now that you know how to serve your agents as an API, the next step is often to monitor their activity and performance. You can do this using the observability server. Learn more in the next section: aigne observe.