
Prompt hubs are external platforms for managing, versioning, and sharing prompts. LLMling-Agent integrates with several of these platforms so you can draw on curated prompt libraries and collaborate on prompt development.

Overview

LLMling-Agent supports integration with leading prompt hub platforms:

  • PromptLayer: Comprehensive prompt management with versioning and analytics
  • Langfuse: Open-source LLM engineering platform with prompt management
  • Fabric: Community-driven prompt patterns and templates
  • Braintrust: Enterprise prompt management with evaluation and testing

These integrations let you fetch prompts from the configured services by identifier.
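
Each hub is enabled by adding a provider entry to the agent manifest. The sketch below assumes providers are declared under a prompts.providers section and uses placeholder environment variable names; see the configuration reference below for the per-provider fields.

Provider setup (YAML)
prompts:
  providers:
    - type: fabric  # community patterns, no credentials required
    - type: promptlayer
      api_key: '!ENV PROMPTLAYER_API_KEY'  # placeholder env var name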

Configuration Reference

PromptLayerConfig

Configuration for PromptLayer prompt provider.

PromptLayerConfig (YAML)
- type: promptlayer
  api_key: '!ENV ENV_VAR_NAME'  # API key for the PromptLayer API.

LangfuseConfig

Configuration for Langfuse prompt provider.

LangfuseConfig (YAML)
- type: langfuse
  secret_key: '!ENV ENV_VAR_NAME'  # Secret key for the Langfuse API.
  public_key: '!ENV ENV_VAR_NAME'  # Public key for the Langfuse API.
  host: https://cloud.langfuse.com/  # Langfuse host address.
  cache_ttl_seconds: 60  # Cache TTL for responses in seconds.
  max_retries: 2  # Maximum number of retries for failed requests.
  fetch_timeout_seconds: 20  # Timeout for fetching responses in seconds.

FabricConfig

Configuration for Fabric GitHub prompt provider.

FabricConfig (YAML)
- type: fabric

BraintrustConfig

Configuration for Braintrust prompt provider.

BraintrustConfig (YAML)
- type: braintrust
  api_key: null  # API key for the Braintrust API; defaults to the BRAINTRUST_API_KEY env var.
  project: null  # Braintrust Project name.
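
For example, a Braintrust entry that relies on the environment-variable default and only names the project might look like the following sketch (the project name is a placeholder):

BraintrustConfig usage (YAML)
- type: braintrust
  project: my-prompt-library  # placeholder name; api_key falls back to BRAINTRUST_API_KEY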

Configuration Notes

  • API keys should be stored in environment variables and referenced with the !ENV tag shown above
  • Prompts from hubs can be referenced in agent system_prompts (see the sketch after this list)
  • Hub integration provides automatic prompt versioning
  • Some platforms offer additional features like prompt analytics and testing
  • Prompts can be managed both locally and in external hubs
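
The note about system_prompts can be sketched as follows. This is illustrative only: the provider configuration mirrors the reference above, but the provider:name identifier format and the surrounding agent fields are assumptions; consult the manifest reference for the exact syntax your version supports.

Referencing a hub prompt (YAML)
prompts:
  providers:
    - type: langfuse
      secret_key: '!ENV LANGFUSE_SECRET_KEY'  # placeholder env var names
      public_key: '!ENV LANGFUSE_PUBLIC_KEY'

agents:
  summarizer:
    model: openai:gpt-4o-mini  # illustrative model name
    system_prompts:
      - langfuse:article-summary  # assumed provider:name identifier resolved from the hub at load time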