
agui_agent

Class info

Classes

- AGUIAgent (llmling_agent.agent.agui_agent): MessageNode that wraps a remote AG-UI protocol server.
- AGUISessionState (llmling_agent.agent.agui_agent): Track state for an active AG-UI session.
- ChatMessage (llmling_agent.messaging.messages): Common message format for all UI types.
- MessageNode (llmling_agent.messaging.messagenode): Base class for all message processing nodes.
- MessageStats (llmling_agent.talk.stats): Statistics for a single connection.
- StreamCompleteEvent (llmling_agent.agent.events): Event indicating streaming is complete with final message.

          🛈 DocStrings

          AG-UI remote agent implementation.

          This module provides a MessageNode adapter that connects to remote AG-UI protocol servers, enabling remote agent execution with streaming support.

          AGUIAgent

          Bases: MessageNode[TDeps, str]

          MessageNode that wraps a remote AG-UI protocol server.

          Connects to AG-UI compatible endpoints via HTTP/SSE and provides the same interface as native agents, enabling composition with other nodes via connections, teams, etc.

          The agent manages:

          - HTTP client lifecycle (create on enter, close on exit)
          - AG-UI protocol communication via SSE streams
          - Event conversion to native llmling-agent events
          - Message accumulation and final response generation

          Supports both blocking run() and streaming run_stream() execution modes.

          Example
          async with AGUIAgent(
              endpoint="http://localhost:8000/agent/run",
              name="remote-agent"
          ) as agent:
              result = await agent.run("Hello, world!")
              async for event in agent.run_stream("Tell me a story"):
                  print(event)
          
          Source code in src/llmling_agent/agent/agui_agent.py
          class AGUIAgent[TDeps = None](MessageNode[TDeps, str]):
              """MessageNode that wraps a remote AG-UI protocol server.
          
              Connects to AG-UI compatible endpoints via HTTP/SSE and provides the same
              interface as native agents, enabling composition with other nodes via
              connections, teams, etc.
          
              The agent manages:
              - HTTP client lifecycle (create on enter, close on exit)
              - AG-UI protocol communication via SSE streams
              - Event conversion to native llmling-agent events
              - Message accumulation and final response generation
          
              Supports both blocking `run()` and streaming `run_stream()` execution modes.
          
              Example:
                  ```python
                  async with AGUIAgent(
                      endpoint="http://localhost:8000/agent/run",
                      name="remote-agent"
                  ) as agent:
                      result = await agent.run("Hello, world!")
                      async for event in agent.run_stream("Tell me a story"):
                          print(event)
                  ```
              """
          
              def __init__(
                  self,
                  endpoint: str,
                  *,
                  name: str = "agui-agent",
                  description: str | None = None,
                  display_name: str | None = None,
                  timeout: float = 60.0,
                  headers: dict[str, str] | None = None,
                  mcp_servers: Sequence[str | MCPServerConfig] | None = None,
                  agent_pool: AgentPool[Any] | None = None,
                  enable_logging: bool = True,
                  event_configs: Sequence[EventConfig] | None = None,
              ) -> None:
                  """Initialize AG-UI agent client.
          
                  Args:
                      endpoint: HTTP endpoint for the AG-UI agent
                      name: Agent name for identification
                      description: Agent description
                      display_name: Human-readable display name
                      timeout: Request timeout in seconds
                      headers: Additional HTTP headers
                      mcp_servers: MCP servers to connect
                      agent_pool: Agent pool for multi-agent coordination
                      enable_logging: Whether to enable database logging
                      event_configs: Event trigger configurations
                  """
                  super().__init__(
                      name=name,
                      description=description,
                      display_name=display_name,
                      mcp_servers=mcp_servers,
                      agent_pool=agent_pool,
                      enable_logging=enable_logging,
                      event_configs=event_configs,
                  )
                  self.endpoint = endpoint
                  self.timeout = timeout
                  self.headers = headers or {}
          
                  self._client: httpx.AsyncClient | None = None
                  self._state: AGUISessionState | None = None
                  self._message_count = 0
                  self._total_tokens = 0
          
              @property
              def context(self) -> NodeContext:
                  """Get node context."""
                  from llmling_agent.messaging.context import NodeContext
                  from llmling_agent.models.manifest import AgentsManifest
                  from llmling_agent_config.nodes import NodeConfig
          
                  return NodeContext(
                      node_name=self.name,
                      pool=self.agent_pool,
                      config=NodeConfig(name=self.name, description=self.description),
                      definition=AgentsManifest(),
                  )
          
              async def __aenter__(self) -> Self:
                  """Enter async context - initialize client and base resources."""
                  await super().__aenter__()
                  self._client = httpx.AsyncClient(
                      timeout=httpx.Timeout(self.timeout),
                      headers={
                          **self.headers,
                          "Accept": "text/event-stream",
                          "Content-Type": "application/json",
                      },
                  )
                  self._state = AGUISessionState(thread_id=self.conversation_id)
                  self.log.debug("AG-UI client initialized", endpoint=self.endpoint)
                  return self
          
              async def __aexit__(
                  self,
                  exc_type: type[BaseException] | None,
                  exc_val: BaseException | None,
                  exc_tb: TracebackType | None,
              ) -> None:
                  """Exit async context - cleanup client and base resources."""
                  if self._client:
                      await self._client.aclose()
                      self._client = None
                  self._state = None
                  self.log.debug("AG-UI client closed")
                  await super().__aexit__(exc_type, exc_val, exc_tb)
          
              async def run(
                  self,
                  *prompts: PromptCompatible,
                  message_id: str | None = None,
                  **kwargs: Any,
              ) -> ChatMessage[str]:
                  """Execute prompt against AG-UI agent.
          
                  Sends the prompt to the AG-UI server and waits for completion.
                  Events are collected and the final text content is returned as a ChatMessage.
          
                  Args:
                      prompts: Prompts to send (will be joined with spaces)
                      message_id: Optional message id for the returned message
                      **kwargs: Additional arguments (ignored for compatibility)
          
                  Returns:
                      ChatMessage containing the agent's aggregated text response
                  """
                  if not self._client or not self._state:
                      msg = "Agent not initialized - use async context manager"
                      raise RuntimeError(msg)
          
                  # Reset state for new run
                  self._state.text_chunks.clear()
                  self._state.thought_chunks.clear()
                  self._state.tool_calls.clear()
                  self._state.is_complete = False
                  self._state.error = None
                  self._state.run_id = str(uuid4())
          
                  # Collect all events
                  async for _ in self.run_stream(*prompts, message_id=message_id):
                      pass
          
                  # Create final message
                  self._message_count += 1
                  message = ChatMessage[str](
                      content="".join(self._state.text_chunks),
                      role="assistant",
                      name=self.name,
                      message_id=message_id or str(uuid4()),
                      conversation_id=self.conversation_id,
                      model_name=None,
                      cost_info=None,
                  )
                  self.message_sent.emit(message)
                  return message
          
              async def run_stream(
                  self,
                  *prompts: PromptCompatible,
                  message_id: str | None = None,
                  **kwargs: Any,
              ) -> AsyncIterator[RichAgentStreamEvent[str]]:
                  """Execute prompt with streaming events.
          
                  Args:
                      prompts: Prompts to send
                      message_id: Optional message ID
                      **kwargs: Additional arguments (ignored for compatibility)
          
                  Yields:
                      Native streaming events converted from AG-UI protocol
                  """
                  from ag_ui.core import RunAgentInput, UserMessage
          
                  if not self._client or not self._state:
                      msg = "Agent not initialized - use async context manager"
                      raise RuntimeError(msg)
          
                  # Reset state
                  self._state.text_chunks.clear()
                  self._state.thought_chunks.clear()
                  self._state.tool_calls.clear()
                  self._state.is_complete = False
                  self._state.error = None
                  self._state.run_id = str(uuid4())
          
                  # Build request
                  prompt_text = " ".join(str(p) for p in prompts)
                  user_message = UserMessage(id=str(uuid4()), content=prompt_text)
          
                  request_data = RunAgentInput(
                      thread_id=self._state.thread_id,
                      run_id=self._state.run_id,
                      state={},
                      messages=[user_message],
                      tools=[],
                      context=[],
                      forwarded_props={},
                  )
          
                  self.log.debug("Sending prompt to AG-UI agent", prompt=prompt_text[:100])
          
                  # Send request and stream events
                  try:
                      async with self._client.stream(
                          "POST",
                          self.endpoint,
                          json=request_data.model_dump(by_alias=True),
                      ) as response:
                          response.raise_for_status()
          
                          # Parse SSE stream
                          async for event in self._parse_sse_stream(response):
                              # Track text chunks
                              if text := extract_text_from_event(event):
                                  self._state.text_chunks.append(text)
          
                              # Convert to native event
                              if native_event := agui_to_native_event(event):
                                  yield native_event
          
                  except httpx.HTTPError as e:
                      self._state.error = str(e)
                      self.log.exception("HTTP error during AG-UI run")
                      raise
                  finally:
                      self._state.is_complete = True
          
                      # Emit final message
                      final_message = ChatMessage[str](
                          content="".join(self._state.text_chunks),
                          role="assistant",
                          name=self.name,
                          message_id=message_id or str(uuid4()),
                          conversation_id=self.conversation_id,
                          model_name=None,
                          cost_info=None,
                      )
                      yield StreamCompleteEvent(message=final_message)
          
              async def _parse_sse_stream(self, response: httpx.Response) -> AsyncIterator[Event]:
                  """Parse Server-Sent Events stream.
          
                  Args:
                      response: HTTP response with SSE stream
          
                  Yields:
                      Parsed AG-UI events
                  """
                  from ag_ui.core import Event
          
                  event_adapter: TypeAdapter[Event] = TypeAdapter(Event)
                  buffer = ""
          
                  async for chunk in response.aiter_text():
                      buffer += chunk
          
                      # Process complete SSE events
                      while "\n\n" in buffer:
                          event_text, buffer = buffer.split("\n\n", 1)
          
                          # Parse SSE format: "data: {json}\n"
                          for line in event_text.split("\n"):
                              if line.startswith("data: "):
                                  json_str = line[6:]  # Remove "data: " prefix
                                  try:
                                      event = event_adapter.validate_json(json_str)
                                      yield event
                                  except (ValueError, TypeError) as e:
                                      self.log.warning(
                                          "Failed to parse AG-UI event",
                                          json=json_str[:100],
                                          error=str(e),
                                      )
          
              async def run_iter(
                  self,
                  *prompt_groups: Sequence[PromptCompatible],
                  message_id: str | None = None,
                  **kwargs: Any,
              ) -> AsyncIterator[ChatMessage[str]]:
                  """Execute multiple prompt groups sequentially.
          
                  Args:
                      prompt_groups: Groups of prompts to execute
                      message_id: Optional message ID base
                      **kwargs: Additional arguments (ignored for compatibility)
          
                  Yields:
                      ChatMessage for each completed prompt group
                  """
                  for i, prompts in enumerate(prompt_groups):
                      mid = f"{message_id or 'msg'}_{i}" if message_id else None
                      yield await self.run(*prompts, message_id=mid)
          
              def to_tool(self, description: str | None = None) -> Callable[[str], Any]:
                  """Convert agent to a callable tool.
          
                  Args:
                      description: Tool description
          
                  Returns:
                      Async function that can be used as a tool
                  """
          
                  async def wrapped(prompt: str) -> str:
                      """Execute AG-UI agent with given prompt."""
                      result = await self.run(prompt)
                      return result.content
          
                  wrapped.__name__ = self.name
                  wrapped.__doc__ = description or f"Call {self.name} AG-UI agent"
                  return wrapped
          
              @property
              def model_name(self) -> str | None:
                  """Get model name (AG-UI doesn't expose this)."""
                  return None
          
              async def get_stats(self) -> MessageStats:
                  """Get message statistics for this node."""
                  return MessageStats()
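
          The SSE framing handled by `_parse_sse_stream` above (buffer incoming chunks, split complete events on blank lines, parse each `data: ` payload) can be sketched standalone. `parse_sse` below is an illustrative helper, not part of the library:

          ```python
          import json
          from collections.abc import Iterable, Iterator


          def parse_sse(chunks: Iterable[str]) -> Iterator[dict]:
              """Yield JSON payloads from a Server-Sent Events text stream.

              Events are separated by a blank line; payload lines carry a
              "data: " prefix. Partial events stay buffered until their
              terminating blank line arrives, mirroring the agent's parser.
              """
              buffer = ""
              for chunk in chunks:
                  buffer += chunk
                  while "\n\n" in buffer:
                      event_text, buffer = buffer.split("\n\n", 1)
                      for line in event_text.split("\n"):
                          if line.startswith("data: "):
                              try:
                                  yield json.loads(line[6:])  # strip "data: " prefix
                              except ValueError:
                                  pass  # skip malformed payloads; the agent logs and continues


          # A payload split mid-JSON across two chunks is reassembled correctly:
          events = list(parse_sse(['data: {"delta": "he', 'llo"}\n\ndata: {"done": true}\n\n']))
          # events == [{"delta": "hello"}, {"done": True}]
          ```

          The buffering step matters because `aiter_text()` makes no guarantee that chunk boundaries align with event boundaries.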
          

          context property

          context: NodeContext
          

          Get node context.

          model_name property

          model_name: str | None
          

          Get model name (AG-UI doesn't expose this).

          __aenter__ async

          __aenter__() -> Self
          

          Enter async context - initialize client and base resources.

          Source code in src/llmling_agent/agent/agui_agent.py
          async def __aenter__(self) -> Self:
              """Enter async context - initialize client and base resources."""
              await super().__aenter__()
              self._client = httpx.AsyncClient(
                  timeout=httpx.Timeout(self.timeout),
                  headers={
                      **self.headers,
                      "Accept": "text/event-stream",
                      "Content-Type": "application/json",
                  },
              )
              self._state = AGUISessionState(thread_id=self.conversation_id)
              self.log.debug("AG-UI client initialized", endpoint=self.endpoint)
              return self
          

          __aexit__ async

          __aexit__(
              exc_type: type[BaseException] | None,
              exc_val: BaseException | None,
              exc_tb: TracebackType | None,
          ) -> None
          

          Exit async context - cleanup client and base resources.

          Source code in src/llmling_agent/agent/agui_agent.py
          async def __aexit__(
              self,
              exc_type: type[BaseException] | None,
              exc_val: BaseException | None,
              exc_tb: TracebackType | None,
          ) -> None:
              """Exit async context - cleanup client and base resources."""
              if self._client:
                  await self._client.aclose()
                  self._client = None
              self._state = None
              self.log.debug("AG-UI client closed")
              await super().__aexit__(exc_type, exc_val, exc_tb)
          

          __init__

          __init__(
              endpoint: str,
              *,
              name: str = "agui-agent",
              description: str | None = None,
              display_name: str | None = None,
              timeout: float = 60.0,
              headers: dict[str, str] | None = None,
              mcp_servers: Sequence[str | MCPServerConfig] | None = None,
              agent_pool: AgentPool[Any] | None = None,
              enable_logging: bool = True,
              event_configs: Sequence[EventConfig] | None = None
          ) -> None
          

          Initialize AG-UI agent client.

          Parameters:

          - endpoint (str): HTTP endpoint for the AG-UI agent. Required.
          - name (str): Agent name for identification. Default: 'agui-agent'.
          - description (str | None): Agent description. Default: None.
          - display_name (str | None): Human-readable display name. Default: None.
          - timeout (float): Request timeout in seconds. Default: 60.0.
          - headers (dict[str, str] | None): Additional HTTP headers. Default: None.
          - mcp_servers (Sequence[str | MCPServerConfig] | None): MCP servers to connect. Default: None.
          - agent_pool (AgentPool[Any] | None): Agent pool for multi-agent coordination. Default: None.
          - enable_logging (bool): Whether to enable database logging. Default: True.
          - event_configs (Sequence[EventConfig] | None): Event trigger configurations. Default: None.
          Source code in src/llmling_agent/agent/agui_agent.py
          def __init__(
              self,
              endpoint: str,
              *,
              name: str = "agui-agent",
              description: str | None = None,
              display_name: str | None = None,
              timeout: float = 60.0,
              headers: dict[str, str] | None = None,
              mcp_servers: Sequence[str | MCPServerConfig] | None = None,
              agent_pool: AgentPool[Any] | None = None,
              enable_logging: bool = True,
              event_configs: Sequence[EventConfig] | None = None,
          ) -> None:
              """Initialize AG-UI agent client.
          
              Args:
                  endpoint: HTTP endpoint for the AG-UI agent
                  name: Agent name for identification
                  description: Agent description
                  display_name: Human-readable display name
                  timeout: Request timeout in seconds
                  headers: Additional HTTP headers
                  mcp_servers: MCP servers to connect
                  agent_pool: Agent pool for multi-agent coordination
                  enable_logging: Whether to enable database logging
                  event_configs: Event trigger configurations
              """
              super().__init__(
                  name=name,
                  description=description,
                  display_name=display_name,
                  mcp_servers=mcp_servers,
                  agent_pool=agent_pool,
                  enable_logging=enable_logging,
                  event_configs=event_configs,
              )
              self.endpoint = endpoint
              self.timeout = timeout
              self.headers = headers or {}
          
              self._client: httpx.AsyncClient | None = None
              self._state: AGUISessionState | None = None
              self._message_count = 0
              self._total_tokens = 0
          

          get_stats async

          get_stats() -> MessageStats
          

          Get message statistics for this node.

          Source code in src/llmling_agent/agent/agui_agent.py
          async def get_stats(self) -> MessageStats:
              """Get message statistics for this node."""
              return MessageStats()
          

          run async

          run(*prompts: PromptCompatible, message_id: str | None = None, **kwargs: Any) -> ChatMessage[str]
          

          Execute prompt against AG-UI agent.

          Sends the prompt to the AG-UI server and waits for completion. Events are collected and the final text content is returned as a ChatMessage.

          Parameters:

          - *prompts (PromptCompatible): Prompts to send (will be joined with spaces).
          - message_id (str | None): Optional message ID for the returned message. Default: None.
          - **kwargs (Any): Additional arguments (ignored for compatibility).

          Returns:

          - ChatMessage[str]: ChatMessage containing the agent's aggregated text response.

          Source code in src/llmling_agent/agent/agui_agent.py
          async def run(
              self,
              *prompts: PromptCompatible,
              message_id: str | None = None,
              **kwargs: Any,
          ) -> ChatMessage[str]:
              """Execute prompt against AG-UI agent.
          
              Sends the prompt to the AG-UI server and waits for completion.
              Events are collected and the final text content is returned as a ChatMessage.
          
              Args:
                  prompts: Prompts to send (will be joined with spaces)
                  message_id: Optional message id for the returned message
                  **kwargs: Additional arguments (ignored for compatibility)
          
              Returns:
                  ChatMessage containing the agent's aggregated text response
              """
              if not self._client or not self._state:
                  msg = "Agent not initialized - use async context manager"
                  raise RuntimeError(msg)
          
              # Reset state for new run
              self._state.text_chunks.clear()
              self._state.thought_chunks.clear()
              self._state.tool_calls.clear()
              self._state.is_complete = False
              self._state.error = None
              self._state.run_id = str(uuid4())
          
              # Collect all events
              async for _ in self.run_stream(*prompts, message_id=message_id):
                  pass
          
              # Create final message
              self._message_count += 1
              message = ChatMessage[str](
                  content="".join(self._state.text_chunks),
                  role="assistant",
                  name=self.name,
                  message_id=message_id or str(uuid4()),
                  conversation_id=self.conversation_id,
                  model_name=None,
                  cost_info=None,
              )
              self.message_sent.emit(message)
              return message
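As the code above shows, `run()` drains `run_stream()` for its side effects and then joins the accumulated `text_chunks` into the final message content. A minimal, self-contained sketch of that accumulate-then-join pattern (using a stand-in for `AGUISessionState`, not the real class):

```python
from dataclasses import dataclass, field


@dataclass
class SessionState:
    """Stand-in for AGUISessionState: only the text-accumulation field."""

    text_chunks: list[str] = field(default_factory=list)


state = SessionState()

# Chunks as they might arrive from the SSE stream (illustrative values)
for chunk in ("Hello", ", ", "world!"):
    state.text_chunks.append(chunk)

# run() builds the final ChatMessage content exactly this way
content = "".join(state.text_chunks)
print(content)  # Hello, world!
```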
          

          run_iter async

          run_iter(
              *prompt_groups: Sequence[PromptCompatible], message_id: str | None = None, **kwargs: Any
          ) -> AsyncIterator[ChatMessage[str]]
          

          Execute multiple prompt groups sequentially.

Parameters:

| Name | Type | Description | Default |
|------|------|-------------|---------|
| `prompt_groups` | `Sequence[PromptCompatible]` | Groups of prompts to execute | `()` |
| `message_id` | `str \| None` | Optional message ID base | `None` |
| `**kwargs` | `Any` | Additional arguments (ignored for compatibility) | `{}` |

Yields:

| Type | Description |
|------|-------------|
| `AsyncIterator[ChatMessage[str]]` | ChatMessage for each completed prompt group |

          Source code in src/llmling_agent/agent/agui_agent.py
          async def run_iter(
              self,
              *prompt_groups: Sequence[PromptCompatible],
              message_id: str | None = None,
              **kwargs: Any,
          ) -> AsyncIterator[ChatMessage[str]]:
              """Execute multiple prompt groups sequentially.
          
              Args:
                  prompt_groups: Groups of prompts to execute
                  message_id: Optional message ID base
                  **kwargs: Additional arguments (ignored for compatibility)
          
              Yields:
                  ChatMessage for each completed prompt group
              """
              for i, prompts in enumerate(prompt_groups):
                  mid = f"{message_id or 'msg'}_{i}" if message_id else None
                  yield await self.run(*prompts, message_id=mid)
          

          run_stream async

          run_stream(
              *prompts: PromptCompatible, message_id: str | None = None, **kwargs: Any
          ) -> AsyncIterator[RichAgentStreamEvent[str]]
          

          Execute prompt with streaming events.

Parameters:

| Name | Type | Description | Default |
|------|------|-------------|---------|
| `prompts` | `PromptCompatible` | Prompts to send | `()` |
| `message_id` | `str \| None` | Optional message ID | `None` |
| `**kwargs` | `Any` | Additional arguments (ignored for compatibility) | `{}` |

Yields:

| Type | Description |
|------|-------------|
| `AsyncIterator[RichAgentStreamEvent[str]]` | Native streaming events converted from AG-UI protocol |

          Source code in src/llmling_agent/agent/agui_agent.py
          async def run_stream(
              self,
              *prompts: PromptCompatible,
              message_id: str | None = None,
              **kwargs: Any,
          ) -> AsyncIterator[RichAgentStreamEvent[str]]:
              """Execute prompt with streaming events.
          
              Args:
                  prompts: Prompts to send
                  message_id: Optional message ID
                  **kwargs: Additional arguments (ignored for compatibility)
          
              Yields:
                  Native streaming events converted from AG-UI protocol
              """
              from ag_ui.core import RunAgentInput, UserMessage
          
              if not self._client or not self._state:
                  msg = "Agent not initialized - use async context manager"
                  raise RuntimeError(msg)
          
              # Reset state
              self._state.text_chunks.clear()
              self._state.thought_chunks.clear()
              self._state.tool_calls.clear()
              self._state.is_complete = False
              self._state.error = None
              self._state.run_id = str(uuid4())
          
              # Build request
              prompt_text = " ".join(str(p) for p in prompts)
              user_message = UserMessage(id=str(uuid4()), content=prompt_text)
          
              request_data = RunAgentInput(
                  thread_id=self._state.thread_id,
                  run_id=self._state.run_id,
                  state={},
                  messages=[user_message],
                  tools=[],
                  context=[],
                  forwarded_props={},
              )
          
              self.log.debug("Sending prompt to AG-UI agent", prompt=prompt_text[:100])
          
              # Send request and stream events
              try:
                  async with self._client.stream(
                      "POST",
                      self.endpoint,
                      json=request_data.model_dump(by_alias=True),
                  ) as response:
                      response.raise_for_status()
          
                      # Parse SSE stream
                      async for event in self._parse_sse_stream(response):
                          # Track text chunks
                          if text := extract_text_from_event(event):
                              self._state.text_chunks.append(text)
          
                          # Convert to native event
                          if native_event := agui_to_native_event(event):
                              yield native_event
          
              except httpx.HTTPError as e:
                  self._state.error = str(e)
                  self.log.exception("HTTP error during AG-UI run")
                  raise
              finally:
                  self._state.is_complete = True
          
                  # Emit final message
                  final_message = ChatMessage[str](
                      content="".join(self._state.text_chunks),
                      role="assistant",
                      name=self.name,
                      message_id=message_id or str(uuid4()),
                      conversation_id=self.conversation_id,
                      model_name=None,
                      cost_info=None,
                  )
                  yield StreamCompleteEvent(message=final_message)
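The internal `_parse_sse_stream` helper is not shown here, but a minimal sketch of what SSE parsing involves is straightforward: accumulate `data:` lines and emit a decoded event at each blank-line boundary. This is a simplified stand-in, and the event payload below is illustrative, not a verbatim AG-UI message:

```python
import json
from collections.abc import Iterable, Iterator
from typing import Any


def parse_sse_events(lines: Iterable[str]) -> Iterator[dict[str, Any]]:
    """Minimal SSE parser: collect `data:` lines, decode JSON on blank line."""
    data: list[str] = []
    for line in lines:
        if line.startswith("data:"):
            data.append(line[len("data:"):].strip())
        elif not line and data:
            # Blank line terminates an SSE event
            yield json.loads("\n".join(data))
            data = []


sample = [
    'data: {"type": "TEXT_MESSAGE_CONTENT", "delta": "Hel"}',
    "",
    'data: {"type": "TEXT_MESSAGE_CONTENT", "delta": "lo"}',
    "",
]
events = list(parse_sse_events(sample))
text = "".join(e["delta"] for e in events)
print(text)  # Hello
```

A production parser (like the one `run_stream` uses) would also handle `event:`/`id:` fields and multi-line data, but the chunk-accumulation shape is the same.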
          

          to_tool

          to_tool(description: str | None = None) -> Callable[[str], Any]
          

          Convert agent to a callable tool.

Parameters:

| Name | Type | Description | Default |
|------|------|-------------|---------|
| `description` | `str \| None` | Tool description | `None` |

Returns:

| Type | Description |
|------|-------------|
| `Callable[[str], Any]` | Async function that can be used as a tool |

          Source code in src/llmling_agent/agent/agui_agent.py
          def to_tool(self, description: str | None = None) -> Callable[[str], Any]:
              """Convert agent to a callable tool.
          
              Args:
                  description: Tool description
          
              Returns:
                  Async function that can be used as a tool
              """
          
              async def wrapped(prompt: str) -> str:
                  """Execute AG-UI agent with given prompt."""
                  result = await self.run(prompt)
                  return result.content
          
              wrapped.__name__ = self.name
              wrapped.__doc__ = description or f"Call {self.name} AG-UI agent"
              return wrapped
          

          AGUISessionState dataclass

          Track state for an active AG-UI session.

          Source code in src/llmling_agent/agent/agui_agent.py
          @dataclass
          class AGUISessionState:
              """Track state for an active AG-UI session."""
          
              thread_id: str
              """Thread ID for this session."""
              run_id: str | None = None
              """Current run ID."""
              text_chunks: list[str] = field(default_factory=list)
              """Accumulated text chunks."""
              thought_chunks: list[str] = field(default_factory=list)
              """Accumulated thought chunks."""
              tool_calls: dict[str, dict[str, Any]] = field(default_factory=dict)
              """Active tool calls by ID."""
              is_complete: bool = False
              """Whether the current run is complete."""
              error: str | None = None
              """Error message if run failed."""
          

          error class-attribute instance-attribute

          error: str | None = None
          

          Error message if run failed.

          is_complete class-attribute instance-attribute

          is_complete: bool = False
          

          Whether the current run is complete.

          run_id class-attribute instance-attribute

          run_id: str | None = None
          

          Current run ID.

          text_chunks class-attribute instance-attribute

          text_chunks: list[str] = field(default_factory=list)
          

          Accumulated text chunks.

          thought_chunks class-attribute instance-attribute

          thought_chunks: list[str] = field(default_factory=list)
          

          Accumulated thought chunks.

          thread_id instance-attribute

          thread_id: str
          

          Thread ID for this session.

          tool_calls class-attribute instance-attribute

          tool_calls: dict[str, dict[str, Any]] = field(default_factory=dict)
          

          Active tool calls by ID.

          main async

          main() -> None
          

          Example usage.

          Source code in src/llmling_agent/agent/agui_agent.py
          async def main() -> None:
              """Example usage."""
              async with AGUIAgent(
                  endpoint="http://localhost:8000/agent/run",
                  name="test-agent",
              ) as agent:
                  # Non-streaming
                  result = await agent.run("What is 2+2?")
                  print(f"Result: {result.content}")
          
                  # Streaming
                  print("\nStreaming:")
                  async for event in agent.run_stream("Tell me a short joke"):
                      print(f"Event: {event}")