freshcrate

Search results for "llm-api"

6 results found (Python)
litellm · v1.83.7-stable · 🌳 Mature · ⭐ 42,951

Python SDK and proxy server (AI gateway) to call 100+ LLM APIs in OpenAI (or native) format, with cost tracking, guardrails, load balancing, and logging. [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, …]

SmarterRouter · 2.2.5 · 🌿 Growing · ⭐ 105

An intelligent LLM gateway and VRAM-aware router for Ollama, llama.cpp, and OpenAI. Features semantic caching, model profiling, and automatic failover for local AI labs.

lm-proxy · v3.2.2 · 🌱 Seedling · ⭐ 111

OpenAI-compatible HTTP LLM proxy / gateway for multi-provider inference (Google, Anthropic, OpenAI, PyTorch). Lightweight, extensible Python/FastAPI; use as a library or as a standalone service.

open-responses-server · v0.4.3 · 🌱 Seedling · ⭐ 161

Wraps any OpenAI-compatible API as a Responses API with MCP support, so it works with Codex; adds any missing stateful features. Ollama- and vLLM-compliant.

invariant-gateway · 0.0.0 · 🌱 Seedling · ⭐ 69

LLM proxy to observe and debug what your AI agents are doing.

LLM-API-Key-Proxy · main/build-20260123-1-bf7ab7e · 🌱 Seedling · ⭐ 448

Universal LLM Gateway: One API, every LLM. OpenAI/Anthropic-compatible endpoints with multi-provider translation and intelligent load-balancing.
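A common thread in the results above is that each gateway exposes the OpenAI chat-completions wire format, so the same request body works against any of them. A minimal sketch of that request shape follows; the base URL and model name are placeholder assumptions for illustration, not values taken from any listing:

```python
import json

# Hypothetical local gateway endpoint; any of the proxies above would
# expose an OpenAI-compatible path like this one.
BASE_URL = "http://localhost:4000/v1/chat/completions"

# Standard OpenAI-style chat-completions request body.
payload = {
    "model": "gpt-4o-mini",  # gateways often also accept provider-prefixed names
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Ping?"},
    ],
    "temperature": 0.2,
}

# Serialize exactly as it would be sent in the POST body.
body = json.dumps(payload)
print(body)
```

Because every listed gateway accepts this shape, swapping one for another usually only means changing `BASE_URL` (and possibly the model name), not the request structure.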