Search results for "stream"
Framework for AI backends. Build and run AI agents like microservices - scalable, observable, and identity-aware from day one.
Mattermost Agents plugin supporting multiple LLMs
eBPF-based GPU causal observability agent
Agentic framework | Self-improving memory | Pluggable tool extensions | Sandbox execution
The ultimate LLM/AI application development framework in Go.
#1 Terminal Benchmark 2.0 — AI that ships your tickets.
A unified AI model hub for aggregation & distribution. It converts various LLM APIs into OpenAI-compatible, Claude-compatible, or Gemini-compatible formats. A centralized gateway for pers
Get up and running with Kimi-K2.5, GLM-5, MiniMax, DeepSeek, gpt-oss, Qwen, Gemma and other models.
Kelos - The Kubernetes-native framework for orchestrating autonomous AI coding agents.
LLM-powered framework for deep document understanding, semantic retrieval, and context-aware answers using the RAG paradigm.
Generate OpenAPI 3.1 specs from Go source code via static analysis — zero annotations, automatic framework detection
One API for 25+ LLMs, OpenAI, Anthropic, Bedrock, Azure. Caching, guardrails & cost controls. Go-native LiteLLM & Kong AI Gateway alternative.
An open-source, cloud-native, high-performance gateway unifying multiple LLM providers, from local solutions like Ollama to major cloud providers such as OpenAI, Groq, Cohere, Anthropic, Cloudflare an
trpc-agent-go is a powerful Go framework for building intelligent agent systems using large language models (LLMs) and tools.
Model Context Protocol (MCP) server for Kubernetes and OpenShift
Zero trust LLM gateway. OpenAI-compatible proxy with semantic routing and load balancing across OpenAI, Anthropic, Ollama, vLLM, and any compatible backend. Identity-based access, virtual A
The Maestro App Factory: a highly opinionated multi-agent orchestration tool for app development that emulates the workflow of high-functioning human development teams using AI agents.
Zero-dependency Web Application Firewall in Go. Single binary. Three deployment modes. Tokenizer-based detection.
A minimal, lightweight structured data store designed for small applications, scripts and automation workflows. Built for simplicity, portability and low overhead.
Multi-LLM agent orchestration TUI — parallel Claude/Gemini/Codex sessions, 126 MCP tools
Self-hosted AI workflow orchestration server. Runs multi-phase LLM pipelines (Director → Architect → Implementer → QA) and delivers structured artifacts via PR, webhook, or bundle.
