# llm
The llm module provides a function to send text to a large language model (LLM) and return the completion directly from Cypher. It uses LiteLLM for model-agnostic completion, so you can use OpenAI, Anthropic, Ollama, and other providers with the same interface.
| Trait | Value |
|---|---|
| Module type | util |
| Implementation | Python |
| Parallelism | sequential |
Requirement: install LiteLLM in the MAGE Python environment with `pip install litellm`.
Set the API key for your provider (e.g. `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`). For local providers like Ollama, you can set `api_base` in the config map or `LITELLM_API_BASE` in the environment.
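Before wiring this into a query, you can verify the provider setup by reproducing the underlying call in plain Python. This is a minimal sketch of the LiteLLM call the module delegates to, assuming LiteLLM's standard `completion` API; the model name and `api_base` below are placeholders for your own provider:

```python
import litellm

# The same provider-agnostic interface the module builds on: swapping the
# model string (e.g. "openai/gpt-4o-mini", "anthropic/claude-3-haiku-20240307")
# switches providers without changing the rest of the call.
response = litellm.completion(
    model="ollama/llama2",              # placeholder: any LiteLLM model string
    api_base="http://localhost:11434",  # only needed for local providers
    messages=[
        {"role": "system", "content": "Complete the following text."},
        {"role": "user", "content": "Hello"},
    ],
)

# OpenAI-style response shape; content may be None if the model returns nothing.
print(response.choices[0].message.content or "")
```

If this script succeeds, the same model and environment settings should work from `llm.complete()`.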
## Functions

### complete()
Sends text to an LLM and returns the completion as a string (e.g. for summarization or text generation).
Input:
- `text: str` ➡ Input text to send to the LLM (e.g. concatenated node text to summarize or complete).
- `config: Map (OPTIONAL)` ➡ Configuration map. Defaults to `{}`.
Config options:
| Name | Type | Description |
|---|---|---|
| `model` | string | Model name (e.g. `ollama/llama2`, `openai/gpt-4o-mini`). Falls back to the `LITELLM_MODEL` environment variable or a default. |
| `api_base` | string | Base URL for the API (e.g. `http://localhost:11434` for Ollama). Can also be set via `LITELLM_API_BASE`. |
| `system_prompt` | string | System prompt for the completion. Default: `"Complete the following text."` |
Output:
- `str` ➡ The model’s completion (trimmed string). Returns an empty string if the model returns no content.
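To make the option fallbacks and the empty-string behavior concrete, here is a hedged sketch of how a MAGE function with this signature could be implemented. It assumes Memgraph's `mgp` Python API and LiteLLM; the names and the final default model are illustrative, not the module's actual source:

```python
import os

import litellm
import mgp

DEFAULT_SYSTEM_PROMPT = "Complete the following text."

@mgp.function
def complete(ctx: mgp.FuncCtx, text: str,
             config: mgp.Nullable[mgp.Map] = None) -> str:
    cfg = dict(config) if config else {}

    # model: explicit config -> LITELLM_MODEL env var -> a default (assumed here).
    model = cfg.get("model") or os.getenv("LITELLM_MODEL") or "openai/gpt-4o-mini"
    # api_base: explicit config -> LITELLM_API_BASE env var -> let LiteLLM decide.
    api_base = cfg.get("api_base") or os.getenv("LITELLM_API_BASE")
    system_prompt = cfg.get("system_prompt") or DEFAULT_SYSTEM_PROMPT

    response = litellm.completion(
        model=model,
        api_base=api_base,
        messages=[
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": text},
        ],
    )
    content = response.choices[0].message.content
    # Trim the completion; fall back to "" when the model returns no content.
    return content.strip() if content else ""
```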
Usage:
Using default or environment configuration:

```cypher
RETURN llm.complete("Summarize in one sentence: Memgraph is a graph database.");
```

With explicit config (e.g. Ollama):

```cypher
RETURN llm.complete("Hello", {model: "ollama/llama2", api_base: "http://localhost:11434"});
```

With a custom system prompt:

```cypher
RETURN llm.complete(
"List the main topics.",
{system_prompt: "You are a concise assistant. Reply with a short bullet list."}
);
```

Using graph data (e.g. summarize node text):

```cypher
MATCH (n:Article)
WITH collect(n.title + ": " + n.abstract) AS texts
WITH reduce(s = "", t IN texts | s + t + "\n") AS combined
RETURN llm.complete(combined, {system_prompt: "Summarize the following articles in 2 sentences."}) AS summary;
```