

ORBIT — Open Retrieval-Based Inference Toolkit

One API for 20+ LLM providers, your databases, and your files.


Live Sandbox  |  API Reference  |  Docker Guide  |  Cookbook


Get running in 60 seconds

git clone https://github.com/schmitech/orbit.git && cd orbit/docker
docker compose up -d

Then test it:

curl -X POST http://localhost:3000/v1/chat \
  -H 'Content-Type: application/json' \
  -H 'X-API-Key: default-key' \
  -H 'X-Session-ID: local-test' \
  -d '{
    "messages": [{"role": "user", "content": "Summarize ORBIT in one sentence."}],
    "stream": false
  }'

That's it. ORBIT is listening on port 3000 with an admin panel at localhost:3000/admin (default login: admin / admin123).

For GPU acceleration: docker compose -f docker-compose.yml -f docker-compose.gpu.yml up -d
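The same request can be issued from Python with only the standard library. This is a sketch reusing the quickstart defaults above (localhost:3000, the default-key demo key, and an arbitrary session ID); swap in your own values for a real deployment:

```python
import json
import urllib.request

# Endpoint and demo credentials from the quickstart above.
URL = "http://localhost:3000/v1/chat"
HEADERS = {
    "Content-Type": "application/json",
    "X-API-Key": "default-key",
    "X-Session-ID": "local-test",
}

payload = {
    "messages": [{"role": "user", "content": "Summarize ORBIT in one sentence."}],
    "stream": False,
}

req = urllib.request.Request(
    URL,
    data=json.dumps(payload).encode("utf-8"),
    headers=HEADERS,
    method="POST",
)

# Uncomment once the Docker stack is up:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read()))
print(req.full_url, req.get_method())
```

Any HTTP client works the same way; only the URL, key, and session ID change per deployment.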


Try it live

The public sandbox hosts one chat workspace per adapter. Pick a demo to see ORBIT in action.

Demo | Data Source | Try it
Simple Chat | LLM | simple-chat
Multimodal Chat | LLM + Files | chat-with-files
Multi-Source Chat | SQLite + PostgreSQL + DuckDB | composite-multi-source-explorer
Customer Orders | PostgreSQL | intent-sql-postgres
HR Database | SQLite | intent-sql-sqlite-hr
DuckDB Analytics | DuckDB | intent-duckdb-analytics
EV Population Stats | DuckDB | intent-duckdb-ev-population
JSONPlaceholder REST API | HTTP (JSON) | intent-http-jsonplaceholder
Paris Open Data API | HTTP (JSON) | intent-http-paris-opendata
MFlix Sample Collection | MongoDB | intent-mongodb-mflix
SpaceX GraphQL | GraphQL | intent-graphql-spacex

Adapter wiring and sample domains live in config/adapters/ and examples/intent-templates/.


What can you build with ORBIT?

  • Ask your database questions in any language — Connect Postgres, MySQL, MongoDB, DuckDB, or Elasticsearch and query them with natural language. Built-in language detection responds in the user's language automatically.
  • Switch LLM providers without changing code — Swap between OpenAI, Anthropic, Gemini, Groq, Ollama, vLLM, and more with a config change.
  • Build voice agents — Full-duplex speech-to-speech with interruption handling via PersonaPlex.
  • Power agentic workflows — MCP-compatible, so AI agents can use ORBIT as a tool.
  • Upload files and get answers — RAG over PDFs, images, and documents out of the box.
  • Add guardrails and content moderation — Built-in safety layer with OpenAI, Anthropic, or local (Llama Guard) moderators to filter harmful content before it reaches users.
  • Go from text to speech and back — Plug in STT (Whisper, Google, Gemini) and TTS (OpenAI, ElevenLabs, Coqui) providers for voice-enabled applications.
  • Keep everything private — Self-host on your own infrastructure with RBAC, rate limiting, and audit logging.
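The "switch providers without changing code" bullet boils down to dispatch-by-config: the application calls one interface, and a configuration value picks the backend. A generic, self-contained illustration of that pattern (stand-in functions, not ORBIT's internals):

```python
# Generic illustration of the provider-swap idea — not ORBIT's actual code.
# The app calls one interface; a config string selects the backend.

def call_openai(prompt: str) -> str:      # stand-in for a hosted-API call
    return f"[openai] {prompt}"

def call_ollama(prompt: str) -> str:      # stand-in for a local-model call
    return f"[ollama] {prompt}"

PROVIDERS = {"openai": call_openai, "ollama": call_ollama}

config = {"provider": "ollama"}           # edit this value to switch backends
answer = PROVIDERS[config["provider"]]("Summarize ORBIT in one sentence.")
print(answer)
```

In ORBIT the equivalent switch lives in configuration files (see config/adapters/ in the repo), so client code never changes when the backend does.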

Supported integrations

LLM Providers: OpenAI, Anthropic, Google Gemini, Cohere, Groq, DeepSeek, Mistral, AWS Bedrock, Azure, Together, Ollama, vLLM, llama.cpp

Data Sources: PostgreSQL, MySQL, MongoDB, Elasticsearch, DuckDB, SQLite, HTTP/REST APIs, GraphQL

Vector Stores: Chroma, Qdrant, Pinecone, Milvus, Weaviate


Why ORBIT?

Without ORBIT | With ORBIT
One SDK per provider, rewrites when you switch | One OpenAI-compatible API across all providers
Separate pipelines for retrieval and inference | Unified model + retrieval + tooling gateway
Fragile glue scripts between data sources and LLMs | Production-ready connectors with policy controls
No visibility into what models are doing | Built-in RBAC, rate limiting, and audit logging
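To make the rate-limiting row concrete: a gateway typically enforces a per-key budget such as a token bucket. This is a minimal generic sketch of the idea, not ORBIT's actual implementation:

```python
import time

class TokenBucket:
    """Allow up to `rate` requests/second with bursts up to `capacity`."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate
        self.capacity = capacity
        self.tokens = float(capacity)
        self.updated = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.updated) * self.rate)
        self.updated = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

bucket = TokenBucket(rate=5.0, capacity=2)
print([bucket.allow() for _ in range(4)])  # burst of 2 allowed, then throttled
```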

Built with ORBIT

  • PoliceStats.ca — Explore Canadian Crime Statistics with AI. Ask about auto theft, break-ins, neighbourhood crime patterns, and cross-city comparisons, with answers grounded in source data citations.

Using ORBIT in production? Let us know and we'll add your project here.


Clients

Client | Description
Web Chat | React UI
CLI | pip install schmitech-orbit-client
Mobile | iOS & Android (Expo)
Node SDK | Or use any OpenAI-compatible SDK
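Because the gateway is OpenAI-compatible, the official openai Python SDK can be pointed at it by overriding the base URL. A hedged sketch assuming the quickstart defaults and `pip install openai` — the exact base path is an assumption here, so check the API Reference for the compatibility endpoint:

```python
# Sketch: point an OpenAI-compatible SDK at a local ORBIT gateway.
# The base URL and demo key below are the quickstart defaults, not guarantees.
try:
    from openai import OpenAI
except ImportError:  # SDK not installed; the pattern is what matters
    OpenAI = None

def make_client(base_url: str = "http://localhost:3000/v1",
                api_key: str = "default-key"):
    """Build an OpenAI-style client that talks to ORBIT instead of api.openai.com."""
    if OpenAI is None:
        raise RuntimeError("pip install openai first")
    return OpenAI(base_url=base_url, api_key=api_key)
```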

Deployment options

Docker Compose (fastest path)
git clone https://github.com/schmitech/orbit.git && cd orbit/docker
docker compose up -d

Starts ORBIT + Ollama with SmolLM2, auto-pulls models, and exposes the API on port 3000. The web admin UI is at /admin on the same host. Connect orbitchat from your host:

ORBIT_ADAPTER_KEYS='{"simple-chat":"default-key"}' npx orbitchat

See the full Docker Guide for GPU mode, volumes, and configuration.
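The ORBIT_ADAPTER_KEYS value in the command above is a JSON object mapping adapter names to API keys; any client can parse it the same way. A stdlib sketch using the demo mapping:

```python
import json
import os

# Same mapping as in the orbitchat command above.
os.environ["ORBIT_ADAPTER_KEYS"] = '{"simple-chat":"default-key"}'

adapter_keys = json.loads(os.environ["ORBIT_ADAPTER_KEYS"])
print(adapter_keys["simple-chat"])
```

Add one entry per adapter you want the client to reach.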

Pre-built image (server only)
docker pull schmitech/orbit:basic
docker run -d --name orbit-basic -p 3000:3000 schmitech/orbit:basic

If Ollama runs on your host, add -e OLLAMA_HOST=host.docker.internal:11434 so the container can reach it. This image ships with the simple-chat adapter only.
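Before pointing a client at the container, a quick TCP probe confirms that port 3000 (from the docker run command above) is actually accepting connections. A stdlib sketch:

```python
import socket

def is_listening(host: str = "localhost", port: int = 3000,
                 timeout: float = 1.0) -> bool:
    """Return True if something accepts TCP connections at host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print("ORBIT reachable" if is_listening() else "nothing on port 3000 yet")
```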

From release tarball (production)
curl -L https://github.com/schmitech/orbit/releases/download/v2.6.5/orbit-2.6.5.tar.gz -o orbit-2.6.5.tar.gz
tar -xzf orbit-2.6.5.tar.gz && cd orbit-2.6.5

cp env.example .env && ./install/setup.sh
source venv/bin/activate
./bin/orbit.sh start && cat ./logs/orbit.log



Contributing

Contributions are welcome! Check the issues for good first tasks, or open a new one to discuss your idea.

If you find ORBIT useful, a star helps others discover the project.


License

Apache 2.0 — see LICENSE.

Release History

v2.6.6 (2026-04-19, urgency: High)
  • Composite adapter: enabled the composite adapter end-to-end, reorganized prompt examples, and added composite cross-adapter template hot reload so the admin reload flow rebuilds cross-adapter template embeddings and vector collections (with tests for the reload and disabled/no-op paths)
  • Cross-adapter templates: added cross-domain intent templates layered on top of individual adapters
  • OpenAI Realtime voice: added `openai_realtime` adapter

v2.6.5 (2026-04-06, urgency: High)
  • PostgreSQL: migrated from `psycopg2-binary` to `psycopg[binary,pool]` 3.3.3 across datasources, retrievers, vector stores, and examples (`ConnectionPool`, `dict_row`, `row_factory`); isolated the postgres sample adapter into its own module; fixed customer-order SQL templates and sql-intent parsing; added customer-orders sample data utilities
  • MCP: replaced unmaintained `fastapi-mcp` with `fastmcp` (`FastMCP.from_fastapi()` + mount); `/mcp`

