Gemini Fullstack LangGraph Quickstart
Basic Information
This repository is a fullstack quickstart that demonstrates how to build a research-augmented conversational agent with a React frontend and a LangGraph-powered backend using Google Gemini models. It covers a complete developer workflow: a frontend built with Vite and Tailwind, a LangGraph/FastAPI backend implementing an automated research agent, and a command-line example for running the agent directly.

The agent workflow, implemented in backend/src/agent/graph.py, generates search queries, performs web research via the Google Search API, reflects on the results to identify knowledge gaps, iteratively refines its searches, and synthesizes a final answer with citations. The project includes development conveniences such as hot-reloading, and it documents production deployment requirements, including Redis and Postgres for LangGraph and environment variables such as GEMINI_API_KEY.
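The generate → search → reflect → refine → synthesize loop described above can be sketched in plain Python. This is a stdlib-only illustration of the control flow, not the repository's implementation: the function names (generate_queries, web_search, reflect, research) and the canned return values are hypothetical stand-ins for the LangGraph nodes in backend/src/agent/graph.py, which call Gemini and the Google Search API at each step.

```python
def generate_queries(question, gaps):
    # Real agent: asks Gemini to propose search queries,
    # informed by any open knowledge gaps. Here: a fake expansion.
    return [f"{question} {g}" for g in gaps] or [question]

def web_search(query):
    # Real agent: issues the query via the Google Search API.
    # Here: a canned result with a placeholder source URL.
    return {"query": query, "source": "https://example.com"}

def reflect(results):
    # Real agent: Gemini reviews the results and names remaining gaps.
    # Here: declare the research done once three results are gathered.
    return [] if len(results) >= 3 else ["more detail"]

def research(question, max_loops=3):
    """Iterate query generation, search, and reflection, then synthesize."""
    results, gaps = [], []
    for _ in range(max_loops):
        for q in generate_queries(question, gaps):
            results.append(web_search(q))
        gaps = reflect(results)   # identify knowledge gaps
        if not gaps:              # no gaps left: stop refining
            break
    citations = sorted({r["source"] for r in results})
    return {"answer": f"Synthesized from {len(results)} results.",
            "citations": citations}
```

The `max_loops` cap mirrors the bounded iterative refinement the agent performs: it keeps searching only while reflection reports open gaps, then emits an answer with deduplicated citations.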
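For local development, the backend needs the Gemini key available in its environment. The variable name GEMINI_API_KEY comes from this README; the exact file location the project reads it from may differ, so treat this as a generic sketch:

```shell
# Export the key for the current shell session (value is a placeholder).
export GEMINI_API_KEY="your-api-key-here"
```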