Search results for "onnx"
⚡ Lightweight offline AI agent for local models. No cloud, no API keys — just your GPU.
Your AI assistant that never forgets and runs 100% privately on your computer. Leave it on 24/7 — it learns your preferences, helps with code, manages your health goals, searches the web, and connects
Open-source persistent memory for AI agent pipelines (LangGraph, CrewAI, AutoGen) and Claude. REST API + knowledge graph + autonomous consolidation.
The leading, most token-efficient MCP server for GitHub source code exploration via tree-sitter AST parsing
Agentic memory for CTI in Python — STIX knowledge graphs, threat-actor alias resolution, offline-first RAG, MCP server for Claude Code and LangChain agents
Open-source multi-agent AI assistant powered by LangGraph, FastAPI & Next.js — 16+ agents, Human-in-the-Loop, MCP integration, voice TTS, RAG, 500+ metrics, 6 languages.
Unified framework for building enterprise RAG pipelines with small, specialized models
A Multi-Agentic AI Assistant/Builder
🚀 The most systematic AI Agent crash course for 2026 | Hands-on agent tutorial · Complete learning path + practical projects + interview question bank · Targets LLM application engineer roles · Covers LangChain / LangGraph / Coze / Dify / MCP / skills / LLM / RAG / prompt engineering · Enterprise-grade deployment and fine-tuning · From zero to enterprise rollout + from learning to shipped projects + interview prep, all in one
Lightweight semantic code search engine — 2-stage vector + FTS + RRF fusion + MCP server for Claude Code
Search your files by talking to them - 100% offline
Local AI server with persistent memory, RAG, and multi-backend inference (MLX / llama.cpp / Ollama). Runs entirely on your machine — zero data sent to external services.
The highest-scoring AI memory system ever benchmarked that doesn't rely on LLM reranking. It's free and burns fewer tokens.
Client library for the Qdrant vector search engine
