
Basic Information

Agent Spotlight is a desktop AI agent that brings large language model capabilities directly to your workstation, letting you interact with files, run commands, and integrate external tools using natural language. The repository provides a cross-platform application built with Tauri and Rust for native performance and a Next.js/React frontend styled with Tailwind. It connects to Google's Gemini model for intelligent query handling and uses the Model Context Protocol (MCP) to discover and call external tool servers. The project focuses on a minimal, fast UI accessible via a global hotkey and on local privacy, keeping API keys and data on the machine. The README includes quick start instructions, prerequisites such as Node.js, Rust, and a Google AI API key, plus usage examples and a configuration approach for adding MCP tool servers.
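
To illustrate how such a query loop might be wired up, here is a minimal TypeScript sketch. It is not code from the repository: it assumes the @google/generative-ai SDK, an illustrative list_directory tool, and a hypothetical callMcpTool helper standing in for the app's MCP dispatch.

    import { GoogleGenerativeAI, SchemaType } from "@google/generative-ai";

    // Hypothetical helper (assumption): forwards a tool call to whichever MCP server
    // registered it. Stubbed here so the sketch stays self-contained.
    async function callMcpTool(name: string, args: unknown): Promise<object> {
      return { name, args, note: "stubbed result" };
    }

    async function ask(question: string): Promise<string> {
      // Env var name is an assumption; the app stores the key locally.
      const genAI = new GoogleGenerativeAI(process.env.GOOGLE_AI_API_KEY!);
      const model = genAI.getGenerativeModel({
        model: "gemini-2.5-flash",
        tools: [{
          functionDeclarations: [{
            name: "list_directory",
            description: "List the files in a directory",
            parameters: {
              type: SchemaType.OBJECT,
              properties: { path: { type: SchemaType.STRING } },
              required: ["path"],
            },
          }],
        }],
      });

      const chat = model.startChat();
      let result = await chat.sendMessage(question);

      // If the model decided to call a tool, execute it and feed the result back
      // so the model can synthesize a final answer.
      const call = result.response.functionCalls()?.[0];
      if (call) {
        const toolOutput = await callMcpTool(call.name, call.args);
        result = await chat.sendMessage([
          { functionResponse: { name: call.name, response: toolOutput } },
        ]);
      }
      return result.response.text();
    }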


App Details

Features
Agent Spotlight exposes a Spotlight-style interface with a global hotkey for instant access and uses Gemini 2.5 Flash for AI reasoning and function calling, so the model can decide when to invoke tools. Core features include MCP integration for adding arbitrary tool servers, a filesystem tool for browsing and searching files, cross-platform support for Windows, macOS, and Linux, and a Rust backend for low-latency execution. The frontend stack is Next.js, React, and Tailwind, and the app persists history and settings locally for privacy. Additional features include real-time visual status updates while tools execute, hot reload during development, and extensibility via an mcp_servers.json config that registers filesystem, git, database, or custom servers.
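
The exact schema of mcp_servers.json is defined by the project, but a registration entry might follow the common MCP client layout shown below. The keys, paths, and server choices here are assumptions for illustration, not documented values; the two servers referenced are the reference filesystem and git MCP servers.

    {
      "mcpServers": {
        "filesystem": {
          "command": "npx",
          "args": ["-y", "@modelcontextprotocol/server-filesystem", "/home/user/projects"]
        },
        "git": {
          "command": "uvx",
          "args": ["mcp-server-git", "--repository", "/home/user/projects/my-repo"]
        }
      }
    }

Presumably, each entry names a command the app can spawn over stdio, and the tools those servers advertise become callable by the model.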
Use Cases
The project turns a desktop into an AI-powered command center that reduces context switching and speeds up common tasks such as searching code, listing files, finding TODOs, analyzing file sizes, or inspecting system storage. By exposing system tools, APIs, and data sources to the model via MCP, Agent Spotlight can automate developer workflows, run shell commands, query databases, and, where configured, integrate services such as Git, Docker, or cloud tools. Local storage of API keys and conversation history preserves privacy while the model synthesizes tool results into actionable answers. The extensibility model lets power users and teams add custom tool servers, enabling automation, richer integrations, and future capabilities such as chained workflows and the plugin ecosystem described on the roadmap.
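
To give a flavor of what a custom tool server involves, here is a minimal sketch of a stdio MCP server built with the official @modelcontextprotocol/sdk. The "todo-counter" server and its count_todos tool are hypothetical examples, not part of this repository; once listed in mcp_servers.json, a server like this would be discoverable by the agent.

    import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
    import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
    import { z } from "zod";
    import { readdirSync, readFileSync } from "node:fs";
    import { join } from "node:path";

    // Count "TODO" markers in every file directly under a directory (non-recursive for brevity).
    function countTodos(dir: string): number {
      let count = 0;
      for (const entry of readdirSync(dir, { withFileTypes: true })) {
        if (entry.isFile()) {
          const text = readFileSync(join(dir, entry.name), "utf8");
          count += (text.match(/TODO/g) ?? []).length;
        }
      }
      return count;
    }

    const server = new McpServer({ name: "todo-counter", version: "0.1.0" });

    // Register a single tool; the model can choose to call it by name.
    server.tool(
      "count_todos",
      { path: z.string().describe("Directory to scan") },
      async ({ path }) => ({
        content: [{ type: "text", text: `Found ${countTodos(path)} TODO markers under ${path}` }],
      })
    );

    await server.connect(new StdioServerTransport());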
