Basic Information

gibber-mcp is a lightweight Model Context Protocol (MCP) server implemented with Express.js that provides cryptographic primitives for secure communication between AI models and tools. It is intended as an infrastructure component for developers building MCP-compatible agents or integrating tool use into LLM-based workflows. The server exposes endpoints for server-sent events and message posting, and it bundles SJCL-based cryptography utilities so models can generate P-256 key pairs, derive shared secrets, and perform AES-CCM encryption and decryption at runtime. The README includes a concrete Sonnet 3.7 LLM thread demonstrating the key-exchange, encryption, and decryption flow. The project can be run locally or in production with standard npm scripts and a configurable PORT environment variable.

App Details

Features
The repository implements a focused set of cryptographic tools and MCP server features. It can generate SJCL P-256 key pairs, derive shared secrets from private and public keys, encrypt plaintext with SJCL AES-CCM using a derived shared secret, and decrypt AES-CCM ciphertext. The server supports real-time streaming via a GET /sse endpoint and accepts messages over POST /messages/:id to address specific connections. It is powered by the Stanford Javascript Crypto Library and includes npm scripts for development, build, and production start. Environment configuration is minimal, primarily a PORT variable, making the server straightforward to integrate into existing MCP deployments or test environments.
Use Cases
This project helps developers add end-to-end cryptographic tooling to MCP-based agent systems so that language models can negotiate keys and exchange encrypted payloads as part of tool use. By exposing generateKeyPair, deriveSharedSecret, encrypt, and decrypt operations over an Express MCP server, it enables agents to perform key exchange and secure messaging during conversational flows, reducing the risk of interception or plaintext leakage. Server-sent events allow streaming updates to clients or models, and the POST /messages/:id endpoint supports directed communication. The provided Sonnet LLM example shows how an LLM thread can call the server's tools to exchange a secret securely and verify encryption, offering a reproducible reference for integration and testing.
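The two HTTP endpoints can be exercised from the command line. A sketch of a session against a hypothetical local instance (the port, the connection id, and the JSON-RPC payload shape are all illustrative assumptions; only the /sse and /messages/:id paths come from the project description):

```shell
# Subscribe to the event stream. The server address assumes PORT=3000,
# which is an assumption of this sketch, not a documented default.
curl -N http://localhost:3000/sse

# In another terminal, post a message addressed to a specific connection.
# <connection-id> is a placeholder; the JSON-RPC body shown is a generic
# MCP-style request, not a payload documented by this project.
curl -X POST http://localhost:3000/messages/<connection-id> \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/list"}'
```

The `-N` flag disables curl's output buffering so server-sent events appear as they arrive.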