Autogen_GraphRAG_Ollama
Basic Information
This repository provides an integrated reference implementation that combines GraphRAG, AutoGen agents, local Ollama LLMs, and a Chainlit UI into a fully local, multi-agent Retrieval-Augmented Generation (RAG) superbot. It demonstrates how to wire GraphRAG's knowledge search into AutoGen agent workflows through function calling, how to configure GraphRAG for both local and global search, and how to run inference and embeddings on offline Ollama models.

The project includes practical setup steps for Linux and Windows: installing Ollama models, creating a Python environment, initializing a GraphRAG root folder, replacing GraphRAG's embedding modules with the supplied utility files, generating embeddings and the knowledge graph, starting a Lite-LLM proxy server, and launching the interactive Chainlit UI. The repo is aimed at developers who want a reproducible, local RAG agent stack with no external OpenAI dependencies.
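Pointing GraphRAG at local models generally comes down to editing the `settings.yaml` that `graphrag init` generates in the root folder. The fragment below is a minimal sketch of that idea; the field layout follows GraphRAG's generated settings file, but the model names, ports, and the Lite-LLM proxy address are illustrative assumptions, not this repo's exact values:

```yaml
# Sketch of a GraphRAG settings.yaml pointed at local endpoints (assumed values).
llm:
  api_key: ${GRAPHRAG_API_KEY}        # unused locally, but the field is required
  type: openai_chat
  model: mistral                      # assumed Ollama chat model
  api_base: http://localhost:11434/v1 # Ollama's OpenAI-compatible endpoint

embeddings:
  llm:
    api_key: ${GRAPHRAG_API_KEY}
    type: openai_embedding
    model: nomic-embed-text           # assumed Ollama embedding model
    api_base: http://localhost:8000   # assumed Lite-LLM proxy in front of Ollama
```

Routing embeddings through the Lite-LLM proxy is what lets GraphRAG's OpenAI-style embedding client talk to Ollama without code changes beyond the supplied utility files.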
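The function-calling wiring mentioned above can be sketched as a plain Python tool: a wrapper that runs a GraphRAG query, plus an OpenAI-style tool schema that an AutoGen agent could register. The wrapper's name, the CLI invocation, and the schema are illustrative assumptions, not this repo's actual code:

```python
# Hedged sketch: exposing GraphRAG local/global search to an AutoGen agent
# as a callable tool. The CLI invocation and names are assumptions.
import json
import subprocess


def graphrag_search(query: str, method: str = "local") -> str:
    """Run a GraphRAG query against the indexed root folder (assumed CLI)."""
    if method not in ("local", "global"):
        raise ValueError("method must be 'local' or 'global'")
    result = subprocess.run(
        ["python", "-m", "graphrag.query", "--root", ".", "--method", method, query],
        capture_output=True,
        text=True,
    )
    return result.stdout


# OpenAI-style tool schema of the kind AutoGen's function calling consumes;
# an agent would pair this schema with the graphrag_search callable above.
GRAPHRAG_TOOL = {
    "type": "function",
    "function": {
        "name": "graphrag_search",
        "description": "Search the GraphRAG knowledge graph (local or global).",
        "parameters": {
            "type": "object",
            "properties": {
                "query": {"type": "string"},
                "method": {"type": "string", "enum": ["local", "global"]},
            },
            "required": ["query"],
        },
    },
}

print(json.dumps(GRAPHRAG_TOOL, indent=2))
```

With a schema like this registered on the assistant agent and the callable registered on the executor agent, the model can decide at runtime whether a question needs local (entity-centric) or global (corpus-wide) search.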