cognee
Basic Information
Cognee implements scalable, modular ECL (Extract, Cognify, Load) pipelines that allow you to interconnect and retrieve past conversations, documents, and audio transcriptions while reducing hallucinations, developer effort, and cost.
Links
Access
Open Source
Stars
286
Twitter
App URL
GitHub Repository
App Details
Features
- Modular: Cognee is modular by nature, using tasks grouped into pipelines.
- Local setup: By default, LanceDB runs locally with NetworkX and OpenAI.
- Vector stores: Cognee supports LanceDB, Qdrant, PGVector, and Weaviate for vector storage.
- LLM providers: You can use either Anyscale or Ollama as your LLM provider.
- Graph stores: In addition to NetworkX, Neo4j is also supported for graph storage.
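The "tasks grouped into pipelines" design can be sketched in a few lines of plain Python. This is an illustrative pattern only; the `Task` and `Pipeline` classes below are hypothetical stand-ins, not cognee's actual API:

```python
# Minimal sketch of the task-pipeline pattern described above.
# Task and Pipeline here are illustrative, not cognee's real classes.
from typing import Any, Callable, List


class Task:
    """Wraps one processing step (e.g. extract, cognify, load)."""

    def __init__(self, fn: Callable[[Any], Any]):
        self.fn = fn

    def run(self, data: Any) -> Any:
        return self.fn(data)


class Pipeline:
    """Runs tasks in order, feeding each task's output to the next."""

    def __init__(self, tasks: List[Task]):
        self.tasks = tasks

    def run(self, data: Any) -> Any:
        for task in self.tasks:
            data = task.run(data)
        return data


# Toy "extract -> cognify -> load" chain over a document string:
extract = Task(lambda text: text.split())                 # split into tokens
cognify = Task(lambda words: {w: len(w) for w in words})  # derive structure
load = Task(lambda graph: sorted(graph.items()))          # persist/emit

pipeline = Pipeline([extract, cognify, load])
print(pipeline.run("cognee builds memory"))
# → [('builds', 6), ('cognee', 6), ('memory', 6)]
```

Because each step is a self-contained task, individual stages (e.g. swapping one vector store for another) can be replaced without rewriting the rest of the pipeline.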
Use Cases
- Memory for AI agents
- Ontology definition
- Entity resolution
- Chatbot memory
Pricing
Pricing Model
Free
Developer Information
Developer Email