Basic Information

AI Documentation Generator is a developer tool that automatically analyzes source code repositories and produces structured, human-readable documentation using large language models. It is designed for engineering teams and maintainers who need up-to-date README files and architectural summaries without manual authoring. The system runs as a CLI application and inspects code structure, data flow, dependencies, request flow, and APIs using a multi-agent architecture where specialized agents perform different analyses. Outputs are generated documentation files saved into a .ai/docs directory and can be produced on demand or via scheduled cronjob workflows. The project supports integration with GitLab for automated merge requests and is configurable through YAML plus environment variables so teams can tailor which analyses and sections are included.
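The scheduled cronjob workflow mentioned above could, for example, be driven by an ordinary cron entry. The command name below is an assumption for illustration; the listing does not document the installed executable's name or flags.

```cron
# Hypothetical example: regenerate docs nightly at 02:00.
# "ai-docs-gen" and the "document" subcommand are assumed names.
0 2 * * * cd /srv/my-repo && ai-docs-gen document >> /var/log/ai-docs-gen.log 2>&1
```

In a CI environment the same command would typically run as a scheduled pipeline job instead of a host crontab.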

App Details

Features
The repository provides:
- Multi-agent analysis, with specialized agents for code structure, data flow, dependency, and API analysis.
- Automated README and documentation generation with configurable sections, including the ability to use an existing README as context.
- GitLab integration for automated merge request creation, plus cronjob support.
- Concurrent processing for faster analysis.
- YAML-based configuration with environment variable overrides.
- Support for multiple OpenAI-compatible LLMs, including local models and OpenRouter.
- Observability via OpenTelemetry tracing and Langfuse integration.
- A CLI exposing analyze, document, and cronjob commands; the stack uses Python 3.13, pydantic-ai, GitPython, and python-gitlab.
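A YAML configuration along these lines might look like the sketch below. Every key name here is an assumption for illustration only, since the listing does not show the project's actual schema.

```yaml
# Hypothetical config sketch -- key names are assumed, not taken from the project.
llm:
  provider: openrouter        # any OpenAI-compatible endpoint, including local models
  model: gpt-4o-mini
analysis:
  code_structure: true
  data_flow: true
  dependencies: true
  request_flow: false         # individual analyses can be toggled on or off
  api: true
readme:
  use_existing_as_context: true
gitlab:
  create_merge_request: true
```

Per the description, values like these could also be supplied or overridden through environment variables, which is convenient for CI pipelines.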
Use Cases
This project reduces the manual effort of creating and maintaining repository documentation by automatically producing comprehensive READMEs and architecture summaries from code analysis. It helps teams keep docs current through scheduled cronjobs and GitLab workflow integration that can open merge requests with generated content. Configurable analysis options let teams include or exclude code structure, data flow, dependency, request flow, or API analysis, and CLI flags provide quick overrides. Support for multiple LLM providers and local models gives flexibility in the choice of language model, while observability integrations help debug and monitor LLM calls and agent behavior. Generated docs are stored in .ai/docs, and the tool can run locally or in automated CI environments.
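The YAML-plus-environment-variable configuration pattern described above can be sketched in a few lines of Python. This is a generic illustration of the override technique, not the project's actual loader; the `AIDOCS_` prefix and flat key layout are assumptions.

```python
import os

def apply_env_overrides(config: dict, prefix: str = "AIDOCS_") -> dict:
    """Override flat config keys with environment variables.

    An env var such as AIDOCS_MODEL overrides config["model"].
    Generic sketch of the pattern; the real tool's variable names
    and nesting rules may differ.
    """
    result = dict(config)
    for key in config:
        env_value = os.environ.get(prefix + key.upper())
        if env_value is not None:
            result[key] = env_value
    return result

# Example: a dict as if loaded from YAML, overridden by the environment.
os.environ["AIDOCS_MODEL"] = "local-llama"
settings = apply_env_overrides({"model": "gpt-4o-mini", "output_dir": ".ai/docs"})
print(settings["model"])  # the environment value wins over the file value
```

Keeping file values as defaults and letting the environment win makes the same config file usable both locally and in CI without edits.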
