Integrating MS’s GraphRAG with Neo4j takes this to the next level, well done Mervin
🎯 Key points for quick navigation:
00:00 *📊 Introducing GraphRAG visualization in Neo4j with Groq integration, promising improved AI response quality.*
00:15 *📚 The video will cover GraphRAG prompts, Groq integration, and local Neo4j setup.*
00:54 *🔍 Entity extraction in GraphRAG enhances response quality over basic RAG through detailed data relationships.*
01:22 *📝 Entity extraction prompt identifies entities, types, descriptions, relationships, and their strength.*
01:48 *🌐 Community report prompt aggregates data into a comprehensive report, detailing impacts and insights.*
02:44 *🚀 Tutorial includes Groq integration and Neo4j setup for visualization.*
02:59 *📈 First step: install GraphRAG and set up the Groq API key; use the Mixtral model for better token handling.*
04:17 *🛠️ Adjust settings, especially for embeddings using the OpenAI API, and point the LLM configuration at Groq.*
04:59 *📂 Create an input folder and prepare content (e.g., "A Christmas Carol") for GraphRAG processing.*
05:28 *🔍 Use the local search query method to explore extracted entities and relationships in the text.*
05:42 *💻 Set up Neo4j locally based on OS-specific instructions, beginning with the macOS example.*
06:11 *🌐 Start Neo4j, navigate to the interface, and connect with default credentials; set up personalized password.*
06:37 *📑 Convert GraphRAG output to CSV using the provided script and prepare files for Neo4j import (see the sketch after this summary).*
07:18 *📤 Import data into Neo4j and address existing index issues if necessary.*
08:00 *🔍 Use Neo4j commands to query and visualize data relationships; utilize resources like ChatGPT for command generation if needed.*
Made with HARPA AI
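For the 06:37 step above (converting the GraphRAG output for Neo4j), here is a minimal sketch of what such a conversion script might look like. The parquet artifact names and the output folder are assumptions based on the GraphRAG defaults from around the time of the video and may differ in your version; it needs pandas with pyarrow installed.

```python
# Hypothetical sketch (not the exact script from the video): convert GraphRAG's
# parquet artifacts to CSV files that Neo4j can import. Artifact file names and
# folder layout are assumptions based on the GraphRAG defaults at the time.
from pathlib import Path

import numpy as np
import pandas as pd

ARTIFACTS = Path("output/20240707-120000/artifacts")  # adjust to your indexing run's folder
EXPORT_DIR = Path("neo4j_import")                     # folder Neo4j's LOAD CSV will read from
EXPORT_DIR.mkdir(exist_ok=True)

# Parquet artifact -> CSV file to feed to Neo4j.
FILES = {
    "create_final_entities.parquet": "entities.csv",
    "create_final_relationships.parquet": "relationships.csv",
}

def flatten(value):
    # Neo4j's LOAD CSV expects plain strings, so join list-like values with a pipe.
    if isinstance(value, (list, tuple, np.ndarray)):
        return "|".join(map(str, value))
    return value

for parquet_name, csv_name in FILES.items():
    df = pd.read_parquet(ARTIFACTS / parquet_name)
    for column in df.columns:
        df[column] = df[column].apply(flatten)
    df.to_csv(EXPORT_DIR / csv_name, index=False)
    print(f"wrote {EXPORT_DIR / csv_name}: {len(df)} rows")
```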
Great! Very helpful, looking forward to more such videos 😇 thanks
Thank you for answering my comment from before! Amazing.
Hi Mervin, your videos are inspirational and informative as always. I have a few questions.
Can you please shed some light on how we can insert more documents into an existing graph after creating it, while preventing duplicate entries?
Is GraphRAG appropriate for conversational chat history, as long-term memory for an AI companion?
And if not, what's the best route for storing messages for long-term memory?
It's a college project that is going to use all open-source tech that can run locally on our single 4090.
Really nice and fast video. Love the combination of GraphRAG + Groq + Neo4j. I think it would be good to use MERGE instead of CREATE in your load statements so they are idempotent and don't create new nodes or new relationships on each run.
For the indexes and constraints you can add "IF NOT EXISTS" so it doesn't fail.
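For anyone who wants to try that suggestion, here is a rough sketch of what the idempotent version could look like through the official neo4j Python driver. The label and property names are illustrative placeholders, not necessarily the ones from the video's load statements, and the constraint uses Neo4j 5 syntax.

```python
# Sketch of an idempotent load: MERGE instead of CREATE, and IF NOT EXISTS on the
# constraint, so re-running the import neither duplicates nodes/relationships nor
# fails on existing indexes. Label/property names are illustrative placeholders.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "your-password"))

with driver.session() as session:
    # Safe to re-run: the constraint is only created if it is missing.
    session.run(
        "CREATE CONSTRAINT entity_name IF NOT EXISTS "
        "FOR (e:Entity) REQUIRE e.name IS UNIQUE"
    )

    entity = {"name": "Scrooge", "type": "PERSON", "description": "Miserly protagonist"}
    # MERGE matches an existing node on the key property and creates it otherwise;
    # the SET clauses keep the remaining properties up to date in both cases.
    session.run(
        """
        MERGE (e:Entity {name: $name})
        ON CREATE SET e.type = $type, e.description = $description
        ON MATCH  SET e.type = $type, e.description = $description
        """,
        **entity,
    )

driver.close()
```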
OpenAI is super expensive for this kind of knowledge-graph indexing. It would almost certainly need to be done locally.
Very cool! Where can I find the prompts you showed in the video?
Great content, thanks. Once you build the graph, why do you use the LLM for community detection instead of a built-in algorithm in Neo4j?
Hi Mervin, can you implement this inside of Obsidian notes? How would we go about using it inside Obsidian while still maintaining the ability of the LLM? I'm mainly using local LLM models.
Isn't it easier to just generate GraphML files during indexing and import those? I haven't used Neo4j's GraphML import, but it works fine in Gephi.
Can you please demo GraphRAG with the Gemini API?
Can it be done 100% locally using Ollama? Can you show the complete setup in a single video? Thanks.
If I want the "source data" to be in Portuguese, would I have to change all the prompts to Portuguese as well?
Again an informative video👍
Are there steps that use the CPU by default when they could use the GPU?
Hi, I have a question about the "local" and "global search" functions in GraphRAG. What are the differences between them, considering that the data source is the books.txt file?
Global Search uses whole dataset reasoning with LLM-generated community reports from the graph hierarchy. It employs a map-reduce approach and is suitable for holistic questions about the entire dataset, like identifying main themes. This method requires extensive configuration parameters, including map-reduce prompts and general knowledge inclusion.
Local Search focuses on entity-based reasoning using structured data from the knowledge graph and unstructured data from input documents. It is ideal for questions about specific entities and their properties. This approach uses entities and their information and has a simpler set of configuration parameters.
mer.vin/2024/07/graphrag-neo4j/
@MervinPraison So could I think of it like this: global search: here are twenty documents, what are the overarching themes? Local search: what did John say to Sam in document 2 about topic X? (i.e. much more specific)
@nexuslux Global search takes all community summaries, sends them packed together to the LLM (multiple calls), generates prototype answers, and then combines them into a final answer. Quite expensive in terms of tokens. It would be better to rank and select only the top-relevant communities (either by embeddings or by the entities they contain).
@nexuslux Local search starts with the entity embeddings, finds the top-k (10), then includes relationships between them and outward from that set (limited), and then adds text units, community reports, claims, and findings for these entities until the token space is filled.
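To make the difference discussed in this thread concrete, here is a minimal sketch of running the same indexed project through both query modes. The `python -m graphrag.query --root ... --method ...` invocation reflects the GraphRAG release from around the time of the video; newer versions may expose a different CLI entry point, so treat the exact flags and the `./ragtest` project folder as assumptions.

```python
# Rough sketch: invoke GraphRAG's global and local search from Python via its CLI
# module. Flags and project folder are assumptions based on the video-era release.
import subprocess

def graphrag_query(method: str, question: str, root: str = "./ragtest") -> str:
    result = subprocess.run(
        ["python", "-m", "graphrag.query", "--root", root, "--method", method, question],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

# Holistic, whole-dataset question -> global search (map-reduce over community reports).
print(graphrag_query("global", "What are the top themes in this story?"))

# Entity-specific question -> local search (entity embeddings plus their neighbourhood).
print(graphrag_query("local", "Who is Scrooge, and what are his main relationships?"))
```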
Can we also transform structured data using GraphRAG?
You can directly generate the files in CSV format while generating the indexes.
This mouse is neat; what is your recording software?
I have Neo4j running, but the Python import of neo4j fails. :(
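If it's the `import neo4j` statement itself that fails, it is usually the package rather than the database: the official driver is published on PyPI as `neo4j` (the older `neo4j-driver` package is deprecated). A minimal connectivity check, assuming a default local install and your own password:

```python
# Minimal connectivity check for the official Neo4j Python driver.
# Install it with:  pip install neo4j   (not the deprecated neo4j-driver package)
# The URI and credentials below are placeholders for a default local install.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "your-password"))
driver.verify_connectivity()  # raises an exception if the server or credentials are wrong
print("Connected to Neo4j")
driver.close()
```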
Can you tell us how to show the source documents?
What if I want to use my own UI for Neo4j visualisations? Is there a way to render that in a custom UI?