
Memgraph Lab 3.0: What’s Changed and What’s Next
If you’ve used Memgraph, chances are you’ve also used Memgraph Lab. It’s more than just a database client—it’s the go-to tool for exploring, querying, and visualizing your graph data. And now, with the release of Memgraph Lab 3.0, things are getting even better.
In our latest webinar, Toni Lastre, Head of Platform Engineering at Memgraph, joined Katarina Supe, Head of Developer Experience, to break down what’s new in Memgraph Lab 3.0 and where we’re headed next.
Spoiler: It’s not just a facelift, but yes, we finally gave you the dark mode you asked for. ;)
Memgraph Lab 3.0 brings powerful new features that make working with Memgraph faster, smoother, and more intuitive. From improved query handling to a revamped UI, this update is all about making graph development easier for engineers like you.
Read on for a recap of the key highlights, or watch the full webinar to see Lab 3.0 in action:
Improved GraphChat: More Control, Better Debugging
One of the biggest updates in Lab 3.0 revolves around GraphChat, the AI-powered assistant inside Memgraph Lab. If you missed the initial release, it’s basically your graph database copilot—helping you craft Cypher queries, debug issues, and interact with your data in a more conversational way.
Here’s what’s new:
- Parallel chat. Before Lab 3.0, GraphChat was limited to a single, linear chat history. If you wanted to test how different models handled queries, you had to swap them out manually and keep track of responses yourself. Not ideal. Lab 3.0 introduces multiple chat threads and parallel layouts, so you can test two different models side by side and compare their responses in real time. This is especially useful for debugging, optimizing prompts, and fine-tuning LLM configurations to get the most accurate Cypher queries.
- Message threads. Group messages into threads for better organization and context retention. No more scrolling through long, disjointed conversations.
- Conversation context management. GraphChat now remembers the conversation history, so you can ask follow-ups like “Can you tell me more about that node?”—and it knows what you mean. But if something feels off, you can thumbs down a message to remove it from the history and prevent it from influencing future responses.
- Full debugging transparency. Ever wondered how GraphChat translates your prompts into Cypher? Now you can see every step in the query generation process—including how it recovers from errors when Cypher queries fail.
- Schema awareness. Schema plays a huge role in generating accurate queries, and now you can check exactly which schema GraphChat is using at any time.
- Retry logic & configuration. By default, GraphChat retries up to three times if a Cypher query is invalid (though it won’t retry if the query is correct but returns no results). Within this setting, you can configure the maximum number of retries and the size of the conversation history, i.e., how many past messages GraphChat considers when forming a response.
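The retry behavior above can be pictured as a simple loop. This is a hypothetical sketch, not Lab’s actual implementation; `generate` and `validate` stand in for the LLM call and Cypher syntax validation:

```python
# Sketch of GraphChat-style retry logic: regenerate a Cypher query up to
# max_retries times when validation fails. A query that is valid but
# returns no rows is accepted as-is (no retry), matching the behavior
# described above.
def generate_with_retries(generate, validate, max_retries=3):
    """generate(attempt) -> candidate query; validate(q) -> True if valid Cypher."""
    for attempt in range(1 + max_retries):
        query = generate(attempt)
        if validate(query):
            return query, attempt
    raise RuntimeError(f"no valid query after {max_retries} retries")
```

In practice the `generate` step would also feed the previous validation error back into the prompt, which is what the debugging view lets you inspect.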
Remote Storage: Team Collaboration for Queries and Graph Styles
Memgraph Lab has always been a single-user experience, where everything you do—queries, graph styles, configurations—lives in your browser. That’s fine if you’re working solo, but as soon as teams get involved, things get messy.
Say two developers, Alice and John, are working on the same database instance. Alice updates a Graph Style Script (GSS) to improve fraud detection visuals; John makes a tweak to optimize query execution. But they’re using separate browsers, so neither of them sees the other’s work unless they manually copy and paste their changes. That’s where Remote Storage comes in.
Teams can now save and share their work centrally inside Memgraph, making real-time collaboration easy. You can save GSS centrally, so Alice’s updates are instantly visible to John and vice versa, and store and share queries to collaborate on complex Cypher without dealing with version conflicts.
You can also choose the storage scope: keep your work private to yourself if you like, and unlike before, it stays available even if you switch browsers.
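For context, a shared Graph Style Script is just a short declarative file. Here is a minimal sketch of what Alice might push to Remote Storage—the directive and function names follow the GSS docs, but treat the specific labels and values as illustrative:

```
// Highlight transaction nodes for the fraud-detection view
@NodeStyle HasLabel(node, "Transaction") {
  size: 6
  color: #DC2626
  label: Property(node, "amount")
}

@EdgeStyle {
  width: 2
}
```

Because the script lives in Memgraph rather than in a browser’s local storage, John sees this styling the moment he connects.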
Custom AI Model Configuration & Multi-Provider Support
Memgraph Lab 3.0 also gives you complete control over how AI models are configured and used. This means you can do the following:
- Define AI providers & models—Connect to multiple LLMs at once, including OpenAI, Ollama, and Azure OpenAI.
- Customize LLM parameters—Adjust temperature, token limits, retry thresholds, and more.
- Set AI workflows beyond GraphChat—This is just the beginning. Future Lab versions will integrate AI into query module generation, schema modeling, and data imports.
Our goal is for AI to assist in every step of your graph workflow, from writing queries to visualizing results, without locking you into a single provider or model.
Real-Time Schema Updates
Memgraph Lab 3.0 isn’t just about usability improvements—it also brings deeper insights into your data, performance, and cluster management.
If you’ve worked with dynamic datasets before, you know schema exploration can be a pain. Traditionally, querying schema details required running expensive scans across large datasets—far from ideal in a real-time environment.
Memgraph now automatically updates schema details as you add new nodes and relationships, so you can:
- Instantly see counts of nodes and relationships as they’re created.
- View detailed property distributions—how many nodes have a certain property, how often it’s populated, and what types are present.
- Avoid unnecessary schema recalculations, making querying faster and more efficient.
This is especially useful if you’re building knowledge graphs from multiple sources, where property types may vary. Memgraph Lab 3.0 integrates directly with this feature, displaying schema insights in the sidebar as soon as you connect.
If real-time schema updates are disabled (for performance reasons), Lab will still fetch whatever schema details it can—without putting too much strain on your database.
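If you want the same statistics outside Lab, recent Memgraph versions expose them through a Cypher command—assuming the schema info feature is enabled on your instance (e.g., via the `--schema-info-enabled` flag; check the docs for your version):

```cypher
// Returns node labels, relationship types, and property statistics
// maintained incrementally by real-time schema tracking.
SHOW SCHEMA INFO;
```

Lab’s sidebar is essentially a live rendering of this information.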
Performance Warnings and Optimization Alerts
A graph database is only as good as its performance. And yet, some users unknowingly run Memgraph with suboptimal settings, causing slow queries and stability issues.
One common issue? Max Map Count.
Memgraph relies on memory-mapped files, and if this Linux kernel setting (vm.max_map_count) is too low, Memgraph can hit memory-mapping limits and crash under heavy workloads.
To prevent this, Lab 3.0 now automatically checks if Max Map Count is set correctly. If not, you’ll see a warning banner explaining the issue. Lab will provide a guide to fixing it, ensuring stability and peak performance.
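On Linux you can inspect and raise the limit yourself. The commands below are a standard sysctl recipe—262144 is a commonly recommended baseline, but defer to the value Lab’s warning suggests for your workload:

```shell
# Inspect the current memory-map limit
cat /proc/sys/vm/max_map_count

# Raise it for the running kernel (requires root):
#   sysctl -w vm.max_map_count=262144
# Persist the change across reboots:
#   echo "vm.max_map_count=262144" > /etc/sysctl.d/90-memgraph.conf
```

If you run Memgraph in Docker, the setting is inherited from the host kernel, so apply it on the host.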
We expect that future versions of Lab will introduce more intelligent performance checks to proactively prevent bottlenecks before they happen.
Cluster Management: Leader and Follower Visibility
If you’re running a highly available Memgraph cluster, Lab 3.0 makes it easier to manage.
Previously, when connecting Lab to a cluster coordinator, users couldn’t immediately tell whether they were interacting with a leader or a follower.
Now, Lab 3.0 clearly displays the role of the node. You instantly know where to run instance management queries, which node is responsible for writes, and how to register new instances. This means no more running write queries on a follower node by mistake—saving time and reducing errors in a distributed setup.
For teams managing distributed environments, this small change eliminates a lot of trial and error, making cluster administration far more intuitive.
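Concretely, when connected to a coordinator you can list each instance and its role with a query like the following (syntax per Memgraph’s high-availability docs, and `instance_1` is just an example name—verify against your version):

```cypher
// On the coordinator: list registered instances and their current roles
SHOW INSTANCES;

// Promote a replica if you need to change which node accepts writes:
// SET INSTANCE instance_1 TO MAIN;
```

Lab 3.0 surfaces the same role information in the UI, so you rarely need to run these by hand just to orient yourself.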
Watch the entire webinar recording—Intro to Memgraph Lab 3.0 and Demo
What’s Next in Memgraph Lab?
Looking ahead, Lab is becoming a fully AI-assisted graph platform, and we’ll be focusing on the following:
- More AI integrations for query building, schema modeling and imports
- Expanded performance insights to catch potential issues early
- More collaboration tools for teams working on shared graph projects
Further Reading
- Memgraph Lab 101: Simplify Graph Data Exploration with Visualization and Querying
- Talking to Your Graph Database with LLMs Using GraphChat
- From Questions to Queries: How to Talk to Your Graph Database With LLMs?
- LLM Throws a Syntax Error Tantrum: Teaching AI to Craft Graph Style Scripts
- Natural Language Querying with Memgraph Lab
- GraphChat docs
- Memgraph’s GraphRAG: Your Shortcut to Personalized GenAI Apps
- How Would Microsoft GraphRAG Work Alongside a Graph Database?