Search results for "cuda"
Local knowledge graph for AI agents. Hybrid search + MCP server for Obsidian vaults.
Every meeting, every idea, every voice note — searchable by your AI. Open-source, privacy-first conversation memory layer.
A portable accelerated SQL query, search, and LLM-inference engine, written in Rust, for data-grounded AI apps and agents.
Distributed AI/LLM for the people. Share compute privately or publicly to power your agents and chat.
NextPlaid, ColGREP: Multi-vector search, from database to coding agents.
OramaCore is the complete runtime you need for your projects, answer engines, copilots, and search. It includes a fully-fledged full-text search engine, a vector database, an LLM interface, and more.
Self-hosted AI coding assistant
A high-performance, in-memory vector database written in Rust, designed for semantic search and top-k nearest neighbor queries in AI-driven applications, with binary file persistence for durability.
One CLI. Every debugger. Give your AI agent eyes into runtime state instead of guessing from source code.
