GraphChat

GraphChat lets you query the database in plain English instead of writing Cypher queries.

GraphChat supports several LLM connection options, and all LLM settings can be adjusted in the Settings section.

Before using GraphChat, ensure the following:

  • The MAGE graph algorithm library is installed on your Memgraph instance.
  • The database is populated with your data.
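As a quick prerequisite check, you can list the procedures available on your instance; if MAGE is installed, its query modules show up in the output. A sketch using Memgraph's built-in `mg.procedures()` (the `pagerank` filter is only an example, any MAGE module name works):

```cypher
CALL mg.procedures() YIELD name
WHERE name STARTS WITH "pagerank"
RETURN name;
```

If this returns no rows, MAGE query modules are not loaded and GraphChat's algorithm-backed answers will not work.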

OpenAI

Use OpenAI's models for processing natural language queries. Set up a connection to OpenAI by providing:

  • A valid OpenAI API key

Azure OpenAI

Set up a connection to Azure OpenAI by providing:

  • azureOpenApiVersion: Your Azure OpenAI service version. Find the list of versions here (opens in a new tab).
  • azureOpenApiApiKey: Your Azure OpenAI API key.
  • azureOpenApiInstanceName: Your Azure OpenAI instance name.
  • azureOpenApiDeploymentName: Your Azure OpenAI deployment name.
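For illustration, the four settings might look like this (all values are hypothetical placeholders; the version must be one your Azure resource actually supports):

```yaml
azureOpenApiVersion: 2024-02-01
azureOpenApiApiKey: <your-azure-openai-api-key>
azureOpenApiInstanceName: my-openai-instance
azureOpenApiDeploymentName: my-gpt4-deployment
```

The instance name is the `{instance}` part of your resource endpoint, `https://{instance}.openai.azure.com`, and the deployment name is the name you chose when deploying a model in Azure, not the model name itself.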

Additional Azure OpenAI integration details can be found in the Azure OpenAI documentation (opens in a new tab).

Ollama

For local LLM model setup, you can use Ollama:

  • Provide the local endpoint URL, such as http://localhost:11434.
💡 If you are having issues connecting to Ollama, try using host.docker.internal instead of localhost or 127.0.0.1. Additional settings may be required if you are running Memgraph and Memgraph Lab with Docker or Docker Compose.
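On Docker Desktop, host.docker.internal resolves to the host machine out of the box; on Linux you typically need to map it yourself. A minimal Docker Compose sketch (service name and image tag are illustrative):

```yaml
services:
  lab:
    image: memgraph/lab:latest
    ports:
      - "3000:3000"
    extra_hosts:
      # Make host.docker.internal resolve to the Docker host on Linux
      - "host.docker.internal:host-gateway"
```

With this in place, point GraphChat's Ollama endpoint at http://host.docker.internal:11434 so the Lab container can reach Ollama running on the host.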

Learn more about Ollama and how to set it up for local LLM model use in the Ollama documentation.

Follow the relevant guidelines and documentation when setting up these connections to take full advantage of GraphChat within Memgraph Lab.