GraphChat
GraphChat is a natural language querying tool integrated into Memgraph Lab, designed to transform how users interact with graph databases. Using Large Language Models (LLMs) such as OpenAI’s GPT-4, it translates plain-English questions into Cypher queries and delivers precise, actionable results, making it accessible to non-technical users while still catering to advanced developers.
GraphChat is particularly beneficial for teams seeking faster insights, easier data exploration, and a more collaborative approach to graph database management. Whether you’re a data scientist exploring complex relationships or a business analyst seeking quick answers, GraphChat simplifies the querying process while maintaining its technical depth.
Features
- Natural language to Cypher translation: Query the database without needing to learn Cypher.
- Error handling and auto-retry: Invalid Cypher queries are automatically retried, up to three times by default (user-configurable).
- Integration with multiple LLM providers:
  - OpenAI: Use OpenAI’s GPT models with API key authentication.
  - Azure OpenAI: Supports enterprise-grade LLM usage with Azure.
  - Ollama: Enables local LLM integration for enhanced privacy.
- Threaded conversations: Create multiple threads for distinct topics or comparisons (e.g., comparing outputs from different models). This helps keep discussions organized.
- Conversation history context:
  - Default: The last five messages provide context for each query.
  - Customizable: Users can change the history length or exclude specific conversations.
- Detailed customization:
  - Supported models: Works with various LLMs (e.g., GPT-4, GPT-4o).
  - Adjustable parameters: Temperature, retry count, conversation history depth, etc.
  - Flexible configuration: Set up different models for different tasks.
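For example, a question such as “Which five people have the most connections?” might be translated into a Cypher query along these lines (the Person label, name property, and FRIENDS_WITH relationship here are hypothetical; the actual query GraphChat generates depends on your graph schema):

```cypher
MATCH (p:Person)-[:FRIENDS_WITH]-(other:Person)
RETURN p.name AS person, count(other) AS connections
ORDER BY connections DESC
LIMIT 5;
```

GraphChat runs the generated query against your database and returns the results in the conversation.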
Setup Guide
Detailed instructions to get users started with GraphChat.
Prerequisites
- Memgraph and MAGE are installed.
- Your data is prepared and imported into Memgraph.
- You have credentials for an LLM provider.
OpenAI
Use OpenAI’s models for processing natural language queries. Set up a connection to OpenAI by providing a valid OpenAI API key.
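Before configuring the connection in Lab, you can sanity-check that your API key is set and that OpenAI’s API is reachable. This is a minimal sketch using only the Python standard library and OpenAI’s documented GET /v1/models endpoint; the OPENAI_API_KEY environment variable name is a common convention, not something Lab requires:

```python
import os
import urllib.request
import urllib.error

# Read the key from the environment (conventional variable name).
key = os.environ.get("OPENAI_API_KEY", "")

# OpenAI's model-listing endpoint, authenticated with a Bearer token.
req = urllib.request.Request(
    "https://api.openai.com/v1/models",
    headers={"Authorization": f"Bearer {key}"},
)

try:
    with urllib.request.urlopen(req, timeout=5) as resp:
        print("OpenAI API reachable, HTTP", resp.status)
except urllib.error.HTTPError as exc:
    # HTTP 401 means the endpoint was reached but the key was rejected.
    print("OpenAI API responded with HTTP", exc.code)
except (urllib.error.URLError, OSError):
    print("Could not reach the OpenAI API")
```

A 200 response confirms the key works; a 401 means the key is missing or invalid.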
Azure OpenAI
Set up a connection to Azure OpenAI by providing:
- azureOpenApiVersion: Your Azure OpenAI service API version. The list of supported versions is available in the Azure OpenAI documentation.
- azureOpenApiApiKey: Your Azure OpenAI API key.
- azureOpenApiInstanceName: Your Azure OpenAI instance name.
- azureOpenApiDeploymentName: Your Azure OpenAI deployment name.
Additional Azure OpenAI integration details can be found in the Azure OpenAI documentation.
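Filled in, the connection settings might look like this (all values are illustrative placeholders; 2023-05-15 is one example of a generally available API version, so check the Azure OpenAI documentation for the current list):

```
azureOpenApiVersion: 2023-05-15
azureOpenApiApiKey: <your-azure-openai-api-key>
azureOpenApiInstanceName: my-openai-instance
azureOpenApiDeploymentName: my-gpt4-deployment
```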
Ollama
For local LLM model setup, you can use Ollama:
- Provide the local endpoint URL, such as http://localhost:11434.

If you are having issues connecting to Ollama, try using host.docker.internal instead of localhost or 127.0.0.1. Additional settings may be required if you are using Docker or Docker Compose to run Memgraph and Memgraph Lab.
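A quick way to confirm the endpoint before wiring it into Lab is to call Ollama’s model-listing endpoint (/api/tags). This is a minimal standard-library sketch; swap in http://host.docker.internal:11434 if Lab runs inside Docker:

```python
import urllib.request
import urllib.error

# Local Ollama endpoint; use host.docker.internal from inside Docker.
OLLAMA_URL = "http://localhost:11434"

try:
    # /api/tags lists the models pulled into the local Ollama server.
    with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags", timeout=3) as resp:
        status = f"Ollama reachable (HTTP {resp.status})"
except (urllib.error.URLError, OSError):
    status = "Ollama not reachable - is the server running?"

print(status)
```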
Learn more about Ollama and how to set it up for local LLM use in the Ollama documentation.
Ensure you follow the appropriate guidelines and documentation when setting up these connections to take full advantage of the GraphChat capabilities within Memgraph Lab.