Tag: #llm-tools
10 packages • ⭐ 3,510 total stars
The LLM Anti-Framework
The best way to create, deploy, and share MCP Servers
A command-line interface tool for serving LLMs using vLLM.
Framework for AI agents to build and maintain an Obsidian wiki using Karpathy's LLM Wiki pattern
MCP server for token-efficient large document analysis via REPL state.
The Mind Palace for AI Agents — Autonomous Cognitive OS with affect-tagged memory (valence engine), token-economic RL (surprisal gate + UBI), Hebbian learning, ACT-R spreading activation, Synapse Engi…
Notion CLI with AI agent support. Smart queries, Obsidian sync, batch ops, backups, validation and more.
A unified web extraction and stateful automation engine for AI. Replaces heavy testing frameworks with token-optimized browser control, deep research, and HITL.
Dragon Brain — persistent long-term memory for AI agents via MCP (Model Context Protocol). Knowledge graph (FalkorDB) + vector search (Qdrant) + CUDA GPU embeddings. Works with Claude, Gemini CLI, Cur…
🔍 Enable real-time exploration of GitHub repositories with this high-performance Model Context Protocol (MCP) server built in Rust.
