MkLlm

Show source on GitHub

Node for LLM-based text generation.

Example: Regular

Jinja

{{ "Write a poem about MkDocs" | MkLlm(model="openai:gpt-4o-mini") }}

Python

MkLlm("Write a poem about MkDocs", model="openai:gpt-4o-mini")

In the realm of code, where knowledge flows,
Stands a tool named MkDocs, as the river goes.
With words like building blocks, both sturdy and bright,
It crafts handy docs, through day and through night.

From Markdown’s embrace, its structure unfolds,
A symphony of text, where clarity holds.
With each gentle tag, a header takes flight,
Creating outlines that guide with delight.

A theme, like a canvas, awaits your design,
Customization dances, your vision in line.
With a simple command, your site comes alive,
Documentation blooms, where ideas derive.

Responsive and clean, on screens big and small,
Navigable pathways invite one and all.
With search threads entwined, the knowledge refined,
A quest through the pages, where answers you’ll find.

From beginners to experts, each voice has a space,
In the heart of MkDocs, we all find our place.
So here’s to the journeys that words can ignite,
With MkDocs, our stories take wing and take flight.


Bases: MkText

text property

text: str

__init__

__init__(
    user_prompt: str,
    system_prompt: str | None = None,
    model: str = "openai:gpt-4o-mini",
    context: str | None = None,
    extra_files: Sequence[str | PathLike[str]] | None = None,
    **kwargs: Any
)

Parameters:

| Name | Type | Description | Default |
|------|------|-------------|---------|
| `user_prompt` | `str` | Main prompt for the LLM | *required* |
| `system_prompt` | `str \| None` | System prompt to set LLM behavior | `None` |
| `model` | `str` | LLM model identifier to use | `'openai:gpt-4o-mini'` |
| `context` | `str \| None` | Main context string | `None` |
| `extra_files` | `Sequence[str \| PathLike[str]] \| None` | Additional context files or strings | `None` |
| `kwargs` | `Any` | Keyword arguments passed to parent | `{}` |
Base class:

| Name | Module | Description |
|------|--------|-------------|
| `MkText` | `mknodes.basenodes.mktext` | Class for any Markup text. |
graph TD
  MkLlm["mkllm.MkLlm"]
  MkText["mktext.MkText"]
  MkNode["mknode.MkNode"]
  Node["node.Node"]
  object["builtins.object"]
  MkText --> MkLlm
  MkNode --> MkText
  Node --> MkNode
  object --> Node
/home/runner/work/mknodes/mknodes/mknodes/templatenodes/mkllm/metadata.toml
[metadata]
icon = "mdi:view-grid"
status = "new"
name = "MkLlm"

[examples.regular]
title = "Regular"
jinja = """
{{ "Write a poem about MkDocs" | MkLlm(model="openai:gpt-4o-mini") }}
"""

# [output.markdown]
# template = """
# <div class="grid cards" markdown="1">

# {% for item in node.items %}
# -   {{ item | indent }}
# {% endfor %}
# </div>
# """
mknodes.templatenodes.mkllm.MkLlm
class MkLlm(mktext.MkText):
    """Node for LLM-based text generation."""

    ICON = "material/format-list-group"
    REQUIRED_PACKAGES = [resources.Package("litellm")]

    def __init__(
        self,
        user_prompt: str,
        system_prompt: str | None = None,
        model: str = "openai:gpt-4o-mini",
        context: str | None = None,
        extra_files: Sequence[str | os.PathLike[str]] | None = None,
        **kwargs: Any,
    ):
        """Constructor.

        Args:
            user_prompt: Main prompt for the LLM
            system_prompt: System prompt to set LLM behavior
            model: LLM model identifier to use
            context: Main context string
            extra_files: Additional context files or strings
            kwargs: Keyword arguments passed to parent
        """
        super().__init__(**kwargs)
        self.user_prompt = user_prompt
        self.system_prompt = system_prompt
        self._model = model
        self._context = context
        self._extra_files = extra_files or []

    def _process_extra_files(self) -> list[str]:
        """Process extra context items, reading files if necessary.

        Returns:
            List of context strings.
        """
        context_items: list[str] = []

        def process_dir(path: UPath) -> list[str]:
            return [f.read_text() for f in path.rglob("*") if f.is_file()]

        for item in self._extra_files:
            try:
                path = UPath(item)
                if path.is_file():
                    context_items.append(path.read_text())
                elif path.is_dir():
                    context_items.extend(process_dir(path))
                else:
                    context_items.append(str(item))
            except Exception as exc:
                err_msg = f"Failed to read context file: {item}"
                logger.warning(err_msg)
                raise ValueError(err_msg) from exc

        return context_items

    @property
    def text(self) -> str:
        """Generate text using the LLM.

        Returns:
            Generated text content.
        """
        context_items = self._process_extra_files()
        combined_context = (
            "\n".join(filter(None, [self._context, *context_items])) or None
        )

        return complete_llm(
            self.user_prompt,
            self.system_prompt or "",
            model=self._model,
            context=combined_context or "",
        )