Search results for "anthropic"
⚡️ Open-source AI Gateway – Use any SDK to call 100+ LLMs. Built-in failover, load balancing, cost control & end-to-end tracing.
A framework for AI backends. Build and run AI agents like microservices - scalable, observable, and identity-aware from day one.
⚡️ AI Cloud OS: Open-source enterprise-level AI knowledge base and MCP (Model Context Protocol)/A2A (agent-to-agent) management platform with admin UI, user management, and Single Sign-On ⚡️, supports Ch
Run a fleet of AI agents on Kubernetes. Administer your cluster agentically
#1 Terminal Benchmark 2.0 – AI that ships your tickets.
♾️ Private Agent Fleet with Spec Coding. Each agent gets its own GPU-accelerated desktop. Run Claude, Codex, Gemini, and open models on a full private AI stack ♾️
The Maestro App Factory: a highly-opinionated multi-agent orchestration tool for app development that emulates the workflow of high-functioning human development teams using AI agents
Kelos - The Kubernetes-native framework for orchestrating autonomous AI coding agents.
One API for 25+ LLMs (OpenAI, Anthropic, Bedrock, Azure). Caching, guardrails & cost controls. A Go-native LiteLLM and Kong AI Gateway alternative.
Container-free, deny-by-default sandbox for AI coding agents. Kernel-enforced filesystem, network, and syscall isolation for Linux and macOS
mkdir beats vector DB. B-tree NeuronFS: 0-byte folders govern AI with ~0 infrastructure and ~200x token efficiency. OS-native constraint engine for LLM agents.
A community driven registry service for Model Context Protocol (MCP) servers.
An open-source, cloud-native, high-performance gateway unifying multiple LLM providers, from local solutions like Ollama to major cloud providers such as OpenAI, Groq, Cohere, Anthropic, Cloudflare an
Open-source Agentic AI framework in Go for building, orchestrating, and deploying intelligent agents. LLM-agnostic, event-driven, with multi-agent workflows, MCP tool discovery, and production-grade o
Open-source, self-improving autonomous agent swarm
⚡️ Blazing-fast LLM API gateway written in Go
Zero-dependency Web Application Firewall in Go. Single binary. Three deployment modes. Tokenizer-based detection.
LocalAI is the open-source AI engine. Run any model - LLMs, vision, voice, image, video - on any hardware. No GPU required.
The cognitive database. A new class of data storage. Not a vector store, not a graph DB, not a RAG wrapper. Ebbinghaus decay, Hebbian learning, and Bayesian confidence are engine-native primitives.
Open-source AI coding agent. Desktop app, bring your own model. Writes code, browses the web, verifies its work. Apache 2.0.
An open-source AI coding agent that lives in your terminal. Multi-provider, multi-channel, persistent sessions with git-like branching.
Run AI coding agents in hardened container sandboxes.
Mattermost Agents plugin supporting multiple LLMs
Autonomous local AI assistant in Go – 40+ tools, 20+ LLM providers, multi-agent orchestration, self-improving
A Slack bot and MCP client that acts as a bridge between Slack and Model Context Protocol (MCP) servers. Using Slack as the interface, it enables large language models (LLMs) to connect and interact with v
