II-Researcher

Basic Information

II-Researcher is an open-source deep search agent designed to perform intelligent web searches, scrape and extract content, and generate comprehensive, referenced answers to user questions. The project combines structured output via BAML with multiple search providers and scrapers to run multi-step retrieval and reflection workflows. It is distributed as a Python package with a CLI, a FastAPI backend, a Next.js frontend, Docker Compose orchestration, and optional MCP server integration for desktop agent workflows. The repository documents the required environment variables, model configuration and compression options, and instructions for running a local LiteLLM proxy or routing models via OpenRouter. The README includes benchmark results on the Frames dataset and practical deployment pathways for local development and containerized production.
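
To make that configuration surface concrete, here is a minimal Python sketch of loading such settings from the environment. The variable names (TAVILY_API_KEY, OPENAI_API_KEY, OPENAI_BASE_URL) are assumptions modeled on the providers named above, not confirmed keys from the repository; the README documents the actual set.

    import os

    # Hypothetical variable names modeled on the providers the listing mentions.
    REQUIRED = ["TAVILY_API_KEY", "OPENAI_API_KEY"]
    OPTIONAL = {"OPENAI_BASE_URL": "http://localhost:4000"}  # e.g. a local LiteLLM proxy

    # Fail fast if a required key is missing, then assemble the config dict.
    missing = [k for k in REQUIRED if not os.environ.get(k)]
    if missing:
        raise SystemExit(f"Missing required environment variables: {', '.join(missing)}")

    config = {k: os.environ[k] for k in REQUIRED}
    config.update({k: os.environ.get(k, default) for k, default in OPTIONAL.items()})
    print(config)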

App Details

Features
- Intelligent web search using configurable providers such as Tavily and SerpAPI.
- Multiple scraping and extraction backends, including Firecrawl, browser-based scraping, and BeautifulSoup.
- Multi-step reasoning and reflection pipelines with configurable reasoning models (see the sketch after this list).
- Configurable LLMs and embedding models, with optional LLM-based context compression.
- Asynchronous operation and performance timeouts for scalable search tasks.
- Comprehensive answer generation that includes references and reports.
- CLI usage, a web interface with a FastAPI backend and Next.js frontend, and Docker Compose orchestration for the frontend, API, and LiteLLM services.
- MCP integration for connecting to desktop agents, plus examples for serving large models via SGLang.
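
The multi-step reasoning and reflection pipeline named above can be pictured with a short asynchronous sketch. The function names (search, reflect, research) are illustrative stand-ins, not the project's actual API; a real run would call a search provider and an LLM where the placeholders sit.

    import asyncio

    async def search(query: str) -> list[str]:
        # Stand-in for a provider call (Tavily, SerpAPI, ...) plus scraping.
        return [f"evidence for {query!r}"]

    async def reflect(question: str, evidence: list[str]) -> str | None:
        # Stand-in for an LLM reflection step: decide whether more searching
        # is needed and, if so, what to ask next. Here: stop once evidence exists.
        return None if evidence else question

    async def research(question: str, max_steps: int = 3) -> list[str]:
        # Alternate search and reflection until the reflector is satisfied
        # or the step budget runs out.
        evidence: list[str] = []
        query: str | None = question
        for _ in range(max_steps):
            if query is None:
                break
            evidence += await search(query)
            query = await reflect(question, evidence)
        return evidence

    print(asyncio.run(research("What is a deep search agent?")))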
Use Cases
II-Researcher helps researchers, analysts, and developers automate deep web searches and synthesize the findings into complete, referenced answers. It centralizes search provider selection, scraping, embedding-based compression, and LLM reasoning so users can run repeatable, configurable research pipelines locally or in containers. The project supports a CLI and a web UI for ad-hoc queries, Docker deployment for reproducible environments, and LiteLLM/OpenRouter integration for swapping model backends (sketched below). MCP and Claude integration options allow embedding the agent in desktop workflows. The benchmark results cited in the README provide an empirical signal of effectiveness, and the configuration options allow tuning for latency, model choice, and scraping strategy.
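
Because a LiteLLM proxy exposes an OpenAI-compatible endpoint, routing requests through one can be sketched with the standard OpenAI Python client. The URL, API key, and model name below are illustrative assumptions rather than values from the project's documentation.

    from openai import OpenAI

    # A LiteLLM proxy serves an OpenAI-compatible API, so the standard client
    # can point at it. Port, key, and model are assumed for illustration.
    client = OpenAI(base_url="http://localhost:4000", api_key="sk-local")

    response = client.chat.completions.create(
        model="gpt-4o",  # whichever model the proxy is configured to route
        messages=[{"role": "user", "content": "Summarize the II-Researcher project."}],
    )
    print(response.choices[0].message.content)

The same client works unchanged against OpenRouter by swapping the base URL and key, which is what makes the proxy-based routing described above convenient for trying different model backends.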
