Search results for "local-llm"
A text-based user interface (TUI) client for interacting with MCP servers using Ollama. Features include agent mode, multi-server support, model switching, streaming responses, tool management, and human-in-the-loop.
ToolAgents is a lightweight and flexible framework for creating function-calling agents with various language models and APIs.
Local-first Agentic Memory Layer for MCP Agents • 25 tools • Hybrid search (FTS5 + vector + MMR) • GDPR-aware • 100% local
Syllabus-aware RAG study assistant for university students. Answers strictly from your own notes & PDFs, with unit-scoped retrieval, cross-encoder reranking, and a hallucination gate — built to help students.
🤖 Build intelligent, offline LLM agents with LangGraph and llama-cpp-python using this starter template for local, private tool-calling applications.
