Search results for "llm-tools"
The best way to create, deploy, and share MCP Servers
Dragon Brain — persistent long-term memory for AI agents via MCP (Model Context Protocol). Knowledge graph (FalkorDB) + vector search (Qdrant) + CUDA GPU embeddings. Works with Claude, Gemini CLI, Cur…
MCP server for token-efficient analysis of large documents using REPL state
Framework for AI agents to build and maintain an Obsidian wiki using Karpathy's LLM Wiki pattern
The Mind Palace for AI Agents — Autonomous Cognitive OS with affect-tagged memory (valence engine), token-economic RL (surprisal gate + UBI), Hebbian learning, ACT-R spreading activation, Synapse Engi…
A unified web extraction and stateful automation engine for AI. Replaces heavy testing frameworks with token-optimized browser control, deep research, and HITL (human-in-the-loop) workflows.
Notion CLI with AI agent support. Smart queries, Obsidian sync, batch ops, backups, validation and more.
The LLM Anti-Framework
🔍 Enable real-time exploration of GitHub repositories with this high-performance Model Context Protocol (MCP) server built in Rust.
🛡 Enforce security policies, redact data, sandbox processes, and verify integrity for Model Context Protocol (MCP) server communication.
A command-line interface tool for serving LLMs with vLLM.
SearXNG tool plugin for https://llm.datasette.io/
