LlamaIndex Agent
Basic Information
This repository implements a LlamaIndex-based Agentic-RAG system for PDF question answering. An intelligent agent routes each user query to one of two retrieval pipelines: a summarization query engine for high-level questions and a vector query engine for specific fact-lookup questions.

The project includes instructional notebooks that introduce LlamaIndex concepts, walk through building the Agentic-RAG system step by step, and show how to customize the pipeline for PDF Q/A with the Phi-3 3.8B model. Supporting code is organized in utils.py, and app.py provides a runnable Gradio demo.

The README lists the technologies used: Gradio for the app, nomic-embed-text for embeddings, and Ollama as the local LLM backend. A separate repository is noted for a Dockerized deployment. The materials and scripts target a Linux environment, with guidance for installing dependencies and running the demo.
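The agent's core decision, choosing between the summarization and the vector pipeline per query, can be sketched as below. In LlamaIndex this choice is normally made by an LLM-driven selector (e.g. a router query engine); here a simple keyword heuristic and stub engines stand in so the sketch runs offline, and all names (`route_query`, `SUMMARY_HINTS`, the two engine functions) are illustrative assumptions, not taken from the repository.

```python
# Minimal sketch of the routing pattern. A real agent would ask an LLM
# which pipeline fits the query; a keyword heuristic stands in here.
SUMMARY_HINTS = ("summarize", "summary", "overview", "main points")

def summary_engine(query: str) -> str:
    # Stand-in for a summarization query engine, which synthesizes
    # an answer over all document chunks.
    return f"[summary pipeline] {query}"

def vector_engine(query: str) -> str:
    # Stand-in for a vector query engine, which retrieves the top-k
    # chunks by embedding similarity before answering.
    return f"[vector pipeline] {query}"

def route_query(query: str) -> str:
    """Send summary-style questions to the summary engine and
    everything else to the vector engine -- the same decision
    the agent makes per query."""
    if any(hint in query.lower() for hint in SUMMARY_HINTS):
        return summary_engine(query)
    return vector_engine(query)

print(route_query("Give me an overview of the paper"))
print(route_query("What dataset was used in section 3?"))
```

The same shape extends naturally: each pipeline becomes a tool with a description, and the selector picks a tool from those descriptions instead of from keywords.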