
serve-api

Run agents as a completions API server.

This creates an OpenAI-compatible API server that makes your agents available through a standard completions API interface.

llmling agent serve-api [OPTIONS] [CONFIG]
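
For example, to serve the agents defined in a configuration file on the default host and port (the file name agents.yml below is only a placeholder):

llmling agent serve-api agents.yml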

config

Agent configuration file to serve (optional).

--host

Host to bind the server to. Default: localhost.

--port

Port to listen on. Default: 8000.

--cors

Flag: enable CORS. Default: True.

--show-messages

Flag: show message activity.

--docs

Flag: enable API documentation. Default: True.

--help

Flag: show this message and exit.
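
Because the server speaks the OpenAI completions protocol, it can be queried with any OpenAI-compatible client. The sketch below is an assumption-laden example, not part of the CLI itself: it assumes the server is running on the default localhost:8000, that the API is mounted under the usual /v1 prefix, and that an agent named "my-agent" is defined in the loaded configuration and is addressed via the model field. Adjust these to match your setup.

```python
# Minimal sketch: querying the serve-api endpoint with the official OpenAI SDK.
# Assumes localhost:8000 with a /v1 prefix and an agent named "my-agent"
# in the loaded configuration (both are placeholders).
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # point the SDK at the local server
    api_key="not-needed",                 # placeholder; no real key required locally
)

response = client.chat.completions.create(
    model="my-agent",  # assumed: the agent name is used as the model identifier
    messages=[{"role": "user", "content": "Summarize today's open issues."}],
)
print(response.choices[0].message.content)
```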