freshcrate

Search results for "llm-inference"

3 results found (Rust)
spiceai · v2.0.0-rc.3 · 🌳 Mature · 2,880

A portable accelerated SQL query, search, and LLM-inference engine, written in Rust, for data-grounded AI apps and agents.

llmtrace · v0.2.0 · 🌱 Seedling · 46

Zero-code LLM security & observability proxy. Real-time prompt injection detection, PII scanning, and cost control for OpenAI-compatible APIs. Built in Rust.

plano · v0.4.20 · 🌿 Growing · 6,241

Plano is an AI-native proxy and data plane for agentic apps, with built-in orchestration, safety, observability, and smart LLM routing, so you stay focused on your agent's core logic.