Basic Information

wcgw is an MCP server designed to let chat models (most notably Claude, via MCP) operate on a local machine: executing shell commands, reading and editing files, running builds and tests, and interacting with a live terminal session. The README provides platform-specific installation and configuration steps for macOS, Linux, and Windows (WSL) using the uv tool or Docker, and documents optional local operation with OpenAI or Anthropic API keys. The server exposes a set of MCP tools (Initialize, BashCommand, ReadFiles, WriteIfEmpty, FileEdit, ReadImage, ContextSave) covering workspace setup, file operations, command execution, and context checkpointing. The project includes operational modes (architect, code-writer, wcgw) that restrict or grant capabilities, a mechanism to attach to the AI's terminal via screen, and explicit warnings and protections that reduce the risk of accidental destructive commands. The server targets developer workflows where an LLM needs safe, programmatic access to a repository and shell.
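For orientation, the sketch below (using the official MCP Python SDK) connects to the server over stdio and lists the tools it exposes. The launch command and arguments ("uvx", "wcgw@latest") are assumptions; the exact invocation should be taken from the README's platform-specific instructions.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# The launch command/args below are illustrative assumptions; copy the exact
# invocation from the wcgw README for your platform (uv/uvx or Docker).
server = StdioServerParameters(command="uvx", args=["wcgw@latest"])

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()  # MCP handshake (not the server's Initialize tool)
            tools = await session.list_tools()
            # Expected tool names: Initialize, BashCommand, ReadFiles,
            # WriteIfEmpty, FileEdit, ReadImage, ContextSave
            print([tool.name for tool in tools.tools])

asyncio.run(main())
```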

App Details

Features
The repository bundles tightly integrated shell and code-editing capabilities behind a comprehensive set of MCP tools. Key features include interactive command handling that supports arrow keys, interrupts, and ANSI escape sequences; enforcement of a single shell instance to prevent concurrent rogue processes; and command polling with adaptive timeouts for prompt feedback. File-handling safeguards require the AI to read a file before editing it, chunk very large files to avoid overloading the context, and apply spacing-tolerant, Aider-style search-and-replace edits with a closest-match fallback for correctness. The server also supports incremental edits to large files, syntax checking on edits with feedback to the model, automatic loading of a CLAUDE.md instruction file, and task checkpointing via ContextSave to save project state for resumption or transfer. Additional features include a terminal attach workflow using the screen multiplexer, Docker deployment instructions, an optional VS Code extension for pasting context, and configuration guidance for uv and uvx.
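To make the editing behaviour concrete, here is a minimal, illustrative Python sketch of spacing-tolerant search-and-replace with a closest-match fallback. It is not the project's implementation, only an approximation of the idea described above; the 0.8 threshold is an arbitrary choice for the sketch.

```python
import difflib
import re

def _normalize(line: str) -> str:
    # Collapse whitespace so indentation/spacing drift does not block a match.
    return re.sub(r"\s+", " ", line).strip()

def apply_search_replace(source: str, search: str, replace: str) -> str:
    """Replace a `search` block with `replace` in `source`, tolerating whitespace
    differences and falling back to the closest-matching block of lines."""
    src_lines = source.splitlines()
    search_lines = search.splitlines()
    n = len(search_lines)
    norm_search = "\n".join(_normalize(l) for l in search_lines)

    best_idx, best_score = None, 0.0
    for i in range(len(src_lines) - n + 1):
        window = "\n".join(_normalize(l) for l in src_lines[i : i + n])
        score = difflib.SequenceMatcher(None, window, norm_search).ratio()
        if score > best_score:
            best_idx, best_score = i, score

    if best_idx is None or best_score < 0.8:
        # In the behaviour described above, a failed match is reported back to the model.
        raise ValueError("no sufficiently close match for the search block")

    return "\n".join(src_lines[:best_idx] + replace.splitlines() + src_lines[best_idx + n:])
```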
Use Cases
wcgw helps developers and teams iterate on coding and debugging tasks by letting an LLM run create, execute, and iterate cycles directly on a local workspace. It streamlines common workflows such as running compilers and tests repeatedly until they pass, reproducing and fixing failing commands, creating feature branches and PRs, generating and running unit tests, and building and running applications, including background servers and emulators. The ContextSave tool enables task checkpointing and knowledge transfer, so work can be resumed later or handed off to another agent. Safety features and modes limit accidental writes and destructive commands, while terminal attachment allows human inspection and intervention. Docker, uvx, and API-key modes offer flexible deployment options, making it practical to augment developer workflows with an LLM that has controlled, observable access to the local shell and repository.
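As a hedged sketch of the run-until-green and checkpointing loop, assuming a ClientSession opened as in the connection example under Basic Information: the tool names (BashCommand, FileEdit, ContextSave) come from the documented tool list, but the argument dictionaries are hypothetical placeholders for the server's actual schemas.

```python
from mcp import ClientSession

async def run_until_green(session: ClientSession, test_cmd: str, max_attempts: int = 5) -> bool:
    """Re-run a test command until it passes; assumes the server's Initialize tool
    has already been called for the target workspace."""
    for _ in range(max_attempts):
        # Tool name is real; the argument shape is a hypothetical placeholder.
        result = await session.call_tool("BashCommand", {"command": test_cmd})
        output = "".join(getattr(block, "text", "") for block in result.content)
        if "failed" not in output.lower() and "error" not in output.lower():
            return True  # naive success check, sufficient for the sketch
        # ...here the model would read the failure, apply FileEdit changes, and retry...
    return False

# Once the loop goes green, the task state could be checkpointed for hand-off:
#     await session.call_tool("ContextSave", {...})  # arguments per the server's schema
```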
