GraphChat allows you to query the database in plain English rather than with Cypher queries.

GraphChat supports various LLM connection options, and all LLM settings can be adjusted in the Settings section.

Before using GraphChat, ensure the following:

  • The MAGE graph algorithm library is installed on your Memgraph instance.
  • The database is populated with your data.
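Both prerequisites can be verified with two Cypher queries: `CALL mg.procedures() YIELD name` lists the available query-module procedures, and `MATCH (n) RETURN count(n)` counts stored nodes. A minimal sketch of the interpretation logic, assuming results from those queries are already fetched (e.g. over Bolt with a Python driver); the MAGE detection heuristic below is an assumption, not an official check:

```python
# Cypher queries to run against the Memgraph instance.
PROCEDURES_QUERY = "CALL mg.procedures() YIELD name RETURN name"
NODE_COUNT_QUERY = "MATCH (n) RETURN count(n) AS node_count"


def mage_installed(procedure_names):
    """Heuristic (assumption): MAGE procedures are namespaced by their
    module (e.g. 'pagerank.get'), while built-ins live under 'mg.'.
    Any non-'mg.' procedure suggests an extra query module is loaded."""
    return any(not name.startswith("mg.") for name in procedure_names)


def database_populated(node_count):
    """The database counts as populated if it holds at least one node."""
    return node_count > 0
```

Feed the helpers the values returned by the two queries before enabling GraphChat.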


OpenAI

Use OpenAI's models to process natural-language queries. Set up a connection to OpenAI by providing:

  • A valid OpenAI API key
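OpenAI authenticates requests with the API key sent as a bearer token. A small sketch of how such a key is used, assuming the usual `sk-` key prefix (the prefix check is a convention, not a guarantee):

```python
def openai_headers(api_key: str) -> dict:
    """Build the HTTP headers for an OpenAI API request.
    OpenAI expects the key as a bearer token in the Authorization header."""
    if not api_key or not api_key.startswith("sk-"):
        # OpenAI keys conventionally start with 'sk-'; reject obvious typos.
        raise ValueError("expected an OpenAI API key (usually 'sk-...')")
    return {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
```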

Azure OpenAI

Set up a connection to Azure OpenAI by providing:

  • azureOpenApiVersion: Your Azure OpenAI API version. A list of supported versions is available in the Azure OpenAI documentation.
  • azureOpenApiApiKey: Your Azure OpenAI API key.
  • azureOpenApiInstanceName: Your Azure OpenAI instance name.
  • azureOpenApiDeploymentName: Your Azure OpenAI deployment name.

Additional Azure OpenAI integration details can be found in the Azure OpenAI documentation.
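To see how these four settings fit together, here is a sketch of how the instance name, deployment name, and API version combine into the request URL the Azure OpenAI REST API exposes (the key itself travels separately, in an `api-key` header). The values `my-instance`, `gpt-4o`, and `2024-02-01` are hypothetical placeholders:

```python
def azure_openai_endpoint(instance_name: str,
                          deployment_name: str,
                          api_version: str) -> str:
    """Assemble the chat-completions URL for an Azure OpenAI deployment.
    The instance name becomes the subdomain, the deployment name selects
    the model, and the API version is passed as a query parameter."""
    return (
        f"https://{instance_name}.openai.azure.com"
        f"/openai/deployments/{deployment_name}"
        f"/chat/completions?api-version={api_version}"
    )
```

For example, `azure_openai_endpoint("my-instance", "gpt-4o", "2024-02-01")` shows where each setting ends up in the final URL.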


Ollama

For a local LLM setup, Ollama can be used:

  • Provide the local endpoint URL, such as http://localhost:11434.
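A common mistake is entering the endpoint without its scheme. A minimal sketch of a sanity check for the URL before saving it, assuming Ollama's default HTTP endpoint on port 11434:

```python
from urllib.parse import urlparse


def valid_ollama_endpoint(url: str) -> bool:
    """Sanity-check a local endpoint URL such as http://localhost:11434.
    Requires an http(s) scheme and a hostname; Ollama listens on
    port 11434 by default."""
    parsed = urlparse(url)
    return parsed.scheme in ("http", "https") and bool(parsed.hostname)
```

For instance, `http://localhost:11434` passes the check, while a bare `localhost:11434` does not.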

Learn more about Ollama and how to set it up for local LLM use in the Ollama documentation.

Ensure you follow the relevant guidelines and documentation when setting up these connections to take full advantage of GraphChat's capabilities within Memgraph Lab.