Basic Information

MCP-Use is an open-source Python SDK and infrastructure layer for connecting language models to Model Context Protocol (MCP) servers and for building custom agents that can call external tools. The project provides client classes, an agent implementation, and adapters that integrate LangChain-compatible LLMs supporting tool or function calling. It supports multiple transport types, including stdio, SSE, and streamable HTTP, and can connect to local or remote MCP servers through configuration dictionaries or files. The README covers quick-start instructions and example use cases such as web browsing with Playwright, Airbnb searches, and Blender 3D tasks, and shows how to install providers and run the examples. The library also documents sandboxed execution through a cloud sandbox provider and options for creating sessions and calling tools programmatically.
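
A minimal quick-start sketch of the flow described above: define an MCP server in a configuration dictionary, wrap it in a client, and hand it to an agent driven by a LangChain chat model. The class and method names (MCPClient.from_dict, MCPAgent, agent.run), the Playwright server entry, and the model name follow the project's documented quick start but may differ between versions.

```python
import asyncio

from langchain_openai import ChatOpenAI
from mcp_use import MCPAgent, MCPClient

# Server configuration equivalent to a JSON config file entry; this launches
# the Playwright MCP server locally over stdio.
config = {
    "mcpServers": {
        "playwright": {
            "command": "npx",
            "args": ["@playwright/mcp@latest"],
        }
    }
}

async def main() -> None:
    # Build the client from the dict (a config file path works as well).
    client = MCPClient.from_dict(config)
    # Any LangChain-compatible chat model with tool calling can drive the agent.
    llm = ChatOpenAI(model="gpt-4o")
    agent = MCPAgent(llm=llm, client=client, max_steps=30)
    # The agent plans, calls MCP tools as needed, and returns the final answer.
    result = await agent.run("Open example.com and summarize the page.")
    print(result)

if __name__ == "__main__":
    asyncio.run(main())
```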

App Details

Features
The repository emphasizes ease of use and rapid setup, claiming that an agent can be created in a few lines of code. It supports any LangChain-compatible LLM with tool calling and includes a LangChain adapter that generates tools automatically. Other features include multi-server support, dynamic server selection via a Server Manager, HTTP connections, asynchronous streaming of agent output through an astream API, direct programmatic tool calls without an LLM, configurable tool access control for restricting dangerous tools, and sandboxed execution via an E2B integration. The project also provides debugging modes, example configuration files, a code builder, and multiple examples demonstrating integrations with Playwright, Airbnb, and Blender.
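
A hedged sketch of how several of the listed features might be combined: the Server Manager for dynamic server selection, tool access control, and asynchronous streaming of agent output. The parameter names (use_server_manager, disallowed_tools), the astream call, and the config file path are assumptions drawn from the feature descriptions above, not verified against a specific release.

```python
import asyncio

from langchain_openai import ChatOpenAI
from mcp_use import MCPAgent, MCPClient

async def main() -> None:
    # Hypothetical config file listing several MCP servers.
    client = MCPClient.from_config_file("multi_server_config.json")
    agent = MCPAgent(
        llm=ChatOpenAI(model="gpt-4o"),
        client=client,
        use_server_manager=True,                    # assumed flag: dynamic server selection
        disallowed_tools=["file_system", "shell"],  # assumed parameter: block risky tools
    )
    # Stream intermediate steps and the final answer as they are produced;
    # the shape of each chunk (string vs. step object) depends on the version.
    async for chunk in agent.astream("Search Airbnb for a two-bedroom stay in Lisbon"):
        print(chunk)

if __name__ == "__main__":
    asyncio.run(main())
```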
Use Cases
MCP-Use helps developers and teams orchestrate complex multi-tool workflows by providing a unified client and agent layer that exposes MCP server tools to LLMs. It makes it straightforward to combine capabilities from different servers in a single agent, to select or restrict tools for safety, and to stream intermediate results for responsive UIs or logging. The sandboxed execution option reduces local dependencies and environment setup by running servers in isolated cloud sandboxes. Direct tool-call APIs enable programmatic access when an LLM is unnecessary. The documentation, examples, and LangChain integration lower the barrier to prototyping agents that perform web browsing, search, 3D modeling, or other automated tasks across MCP servers.
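
For the direct, LLM-free tool-call path mentioned above, a rough sketch of creating a session and invoking a tool programmatically. The session and connector method names (create_session, connector.list_tools, connector.call_tool, close_all_sessions), the server name, and the tool arguments are assumptions for illustration and may not match the installed version's API.

```python
import asyncio

from mcp_use import MCPClient

async def main() -> None:
    client = MCPClient.from_config_file("mcp_config.json")  # hypothetical config path
    # Open a session to one configured server, referenced by its config name.
    session = await client.create_session("playwright")
    try:
        # Inspect the tools the server exposes, then call one directly.
        tools = await session.connector.list_tools()
        print([tool.name for tool in tools])
        result = await session.connector.call_tool(
            "browser_navigate",              # assumed tool name for illustration
            {"url": "https://example.com"},
        )
        print(result)
    finally:
        await client.close_all_sessions()

if __name__ == "__main__":
    asyncio.run(main())
```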
