MkLlm

Node for LLM-based text generation.

Example: Regular

Jinja

{{ "Write a poem about MkDocs" | MkLlm(model="openai:gpt-4o-mini") }}

Python

MkLlm('Write a poem about MkDocs')

In the realm of code and text shall rise,
A tool for the crafted, under open skies,
MkDocs, the keeper of documentation bright,
Turns words into form, brings knowledge to light.

With a simple command, from the terminal's grip,
You weave markdown magic, let your thoughts slip,
Into pages that render with elegance clear,
User-friendly portals, where wisdom draws near.

Themes intertwined like a tapestry spun,
Choosing the palette, let your vision run,
From default to custom, your style is conveyed,
A world of reflection where ideas cascade.

Navigation flows like a river's soft bend,
Organized sections, your message to send,
Search bars that glisten with answers so swift,
Guiding the curious through your knowledge gift.

In the quiet of coding, where aspirations bloom,
Collaboration thrives, for there's always room,
For voices of many, in harmony speak,
Harnessing power, with each word unique.

So here’s to the builders, the dreamers who dare,
To document visions, to share and to care,
With MkDocs as the vessel, let knowledge embark,
Illuminating pathways, igniting the spark.

From tech to community, each topic embraced,
Preserving the wisdom that can't be replaced,
In the digital oceans where seekers set forth,
MkDocs is the lighthouse, a beacon of worth.


Bases: MkText

text property

text: str

__init__

__init__(
    user_prompt: str,
    system_prompt: str | None = None,
    model: str = "openai:gpt-4o-mini",
    context: str | None = None,
    extra_files: Sequence[str | PathLike[str]] | None = None,
    **kwargs: Any
)

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `user_prompt` | `str` | Main prompt for the LLM | *required* |
| `system_prompt` | `str \| None` | System prompt to set LLM behavior | `None` |
| `model` | `str` | LLM model identifier to use | `'openai:gpt-4o-mini'` |
| `context` | `str \| None` | Main context string | `None` |
| `extra_files` | `Sequence[str \| PathLike[str]] \| None` | Additional context files or strings | `None` |
| `kwargs` | `Any` | Keyword arguments passed to parent | `{}` |

_process_extra_files

_process_extra_files() -> list[str]
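
This method expands each entry of `extra_files` into context text: files are read, directories are walked recursively, and anything that is not an existing path is passed through as a literal context string. A minimal standalone sketch of that behavior using `pathlib` (the actual implementation, shown in the source listing below, uses `UPath` and raises `ValueError` when a file cannot be read):

```python
from pathlib import Path


def process_extra_files(extra_files: list[str]) -> list[str]:
    """Expand files and directories into context strings.

    Entries that are not existing paths are treated as literal context text.
    """
    items: list[str] = []
    for entry in extra_files:
        path = Path(entry)
        if path.is_file():
            items.append(path.read_text())
        elif path.is_dir():
            # Recurse into directories, reading every regular file.
            items.extend(
                f.read_text() for f in sorted(path.rglob("*")) if f.is_file()
            )
        else:
            items.append(str(entry))  # pass non-paths through verbatim
    return items
```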
Inherits: MkText (mknodes.basenodes.mktext), "Class for any Markup text."
graph TD
  94596170149008["mkllm.MkLlm"]
  94596170177680["mktext.MkText"]
  94596169136704["mknode.MkNode"]
  94596171773984["node.Node"]
  139930746687680["builtins.object"]
  94596170177680 --> 94596170149008
  94596169136704 --> 94596170177680
  94596171773984 --> 94596169136704
  139930746687680 --> 94596171773984
/home/runner/work/mknodes/mknodes/mknodes/templatenodes/mkllm/metadata.toml
[metadata]
icon = "mdi:view-grid"
status = "new"
name = "MkLlm"

[examples.regular]
title = "Regular"
jinja = """
{{ "Write a poem about MkDocs" | MkLlm(model="openai:gpt-4o-mini") }}
"""

# [output.markdown]
# template = """
# <div class="grid cards" markdown="1">

# {% for item in node.items %}
# -   {{ item | indent }}
# {% endfor %}
# </div>
# """
mknodes.templatenodes.mkllm.MkLlm
class MkLlm(mktext.MkText):
    """Node for LLM-based text generation."""

    ICON = "material/format-list-group"
    REQUIRED_PACKAGES = [resources.Package("litellm")]

    def __init__(
        self,
        user_prompt: str,
        system_prompt: str | None = None,
        model: str = "openai:gpt-4o-mini",
        context: str | None = None,
        extra_files: Sequence[str | os.PathLike[str]] | None = None,
        **kwargs: Any,
    ):
        """Constructor.

        Args:
            user_prompt: Main prompt for the LLM
            system_prompt: System prompt to set LLM behavior
            model: LLM model identifier to use
            context: Main context string
            extra_files: Additional context files or strings
            kwargs: Keyword arguments passed to parent
        """
        super().__init__(**kwargs)
        self.user_prompt = user_prompt
        self.system_prompt = system_prompt
        self._model = model
        self._context = context
        self._extra_files = extra_files or []

    def _process_extra_files(self) -> list[str]:
        """Process extra context items, reading files if necessary.

        Returns:
            List of context strings.
        """
        context_items: list[str] = []

        def process_dir(path: UPath) -> list[str]:
            return [f.read_text() for f in path.rglob("*") if f.is_file()]

        for item in self._extra_files:
            try:
                path = UPath(item)
                if path.is_file():
                    context_items.append(path.read_text())
                elif path.is_dir():
                    context_items.extend(process_dir(path))
                else:
                    context_items.append(str(item))
            except Exception as exc:
                err_msg = f"Failed to read context file: {item}"
                logger.warning(err_msg)
                raise ValueError(err_msg) from exc

        return context_items

    @property
    def text(self) -> str:
        """Generate text using the LLM.

        Returns:
            Generated text content.
        """
        context_items = self._process_extra_files()
        combined_context = (
            "\n".join(filter(None, [self._context, *context_items])) or None
        )

        return complete_llm(
            self.user_prompt,
            self.system_prompt or "",
            model=self._model,
            context=combined_context or "",
        )