Hawk is a CLI open to any LLM: OpenAI, Gemini, Grok, OpenRouter, Ollama, and 200+ other models.
- 🤖 Multi-Provider — Works with OpenAI, Hawk, Gemini, DeepSeek, Ollama, and any OpenAI-compatible API
- 🛠️ Full Tool Suite — Bash, File editing, Grep, Glob, WebFetch, Agents, MCP
- 🔄 Streaming — Real-time token streaming
- 📡 OpenAI Shim — Translation layer between Hawk and any LLM API
- 💾 Local Models — Run offline with Ollama or LM Studio
Install via npm:

```bash
npm install -g hawk
```

Or via Homebrew:

```bash
brew install hawk
```

Or build from source:

```bash
git clone https://github.com/GrayCodeAI/hawk.git
cd hawk
bun install
bun run build
npm link
```

Quick start:

```bash
bun run profile:init -- --provider openai --api-key sk-your-key --model gpt-4o
hawk
```

Hawk stores provider configuration in ~/.hawk/provider.json and loads it on startup, similar to Herm. Environment variables are still supported as explicit overrides.
Provider resolution is provider-scoped (Herm/Langdag style): OpenRouter, Grok/xAI, and Gemini keys are preferred over OPENAI_API_KEY when those providers are configured.
| Provider | Base URL | Notes |
|---|---|---|
| OpenAI | https://api.openai.com/v1 | Default |
| OpenRouter | https://openrouter.ai/api/v1 | Stores openrouter_api_key |
| Anthropic (OpenAI-compatible) | https://api.anthropic.com/v1 | Stores anthropic_api_key |
| Grok / xAI | https://api.x.ai/v1 | Stores grok_api_key or xai_api_key |
| DeepSeek | https://api.deepseek.com/v1 | |
| Together AI | https://api.together.xyz/v1 | |
| Groq | https://api.groq.com/openai/v1 | Free tier |
| Mistral | https://api.mistral.ai/v1 | |
| Azure OpenAI | https://*.openai.azure.com/openai/deployments/*/v1 | |
| Ollama | http://localhost:11434/v1 | Local, no API key |
| LM Studio | http://localhost:1234/v1 | Local |
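Any OpenAI-compatible endpoint in the table can also be targeted through the HAWK_CODE_USE_OPENAI environment-variable override shown in the usage examples. As an illustration, here is Groq; the base URL comes from the table, while the key placeholder and model name are illustrative assumptions:

```shell
# Point Hawk's OpenAI-compatible path at Groq.
# Base URL is from the provider table above; the model name is illustrative.
export HAWK_CODE_USE_OPENAI=1
export OPENAI_API_KEY=gsk_your_key   # Groq API key (placeholder)
export OPENAI_BASE_URL=https://api.groq.com/openai/v1
export OPENAI_MODEL=llama-3.1-8b-instant
```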
Provider config is stored at ~/.hawk/provider.json.
| Field | Description |
|---|---|
| anthropic_api_key | Anthropic key |
| openai_api_key | OpenAI or OpenAI-compatible key |
| openrouter_api_key | OpenRouter key |
| grok_api_key / xai_api_key | Grok / xAI key |
| gemini_api_key | Gemini key |
| ollama_base_url | Ollama host, for example http://localhost:11434 |
| active_model | Default model |
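A minimal provider.json using the fields above might look like the following sketch; the exact file layout is an assumption, and `bun run profile:init` is the supported way to generate it:

```shell
# Write a minimal ~/.hawk/provider.json by hand (sketch only;
# field names come from the table above, exact schema is assumed).
mkdir -p ~/.hawk
cat > ~/.hawk/provider.json <<'EOF'
{
  "openai_api_key": "sk-your-key",
  "active_model": "gpt-4o"
}
EOF
```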
If multiple providers are configured, Hawk uses this priority: Anthropic, OpenAI, OpenRouter, Grok, Gemini, Ollama.
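That priority order can be illustrated with a small shell sketch. This is not Hawk's actual resolution code, and the environment-variable names (which mirror the config fields) are illustrative:

```shell
# Return the first provider whose key is set, in the documented
# priority order; fall back to ollama, which needs no API key.
pick_provider() {
  [ -n "$ANTHROPIC_API_KEY" ]  && { echo anthropic;  return; }
  [ -n "$OPENAI_API_KEY" ]     && { echo openai;     return; }
  [ -n "$OPENROUTER_API_KEY" ] && { echo openrouter; return; }
  { [ -n "$GROK_API_KEY" ] || [ -n "$XAI_API_KEY" ]; } && { echo grok; return; }
  [ -n "$GEMINI_API_KEY" ]     && { echo gemini;     return; }
  echo ollama
}
```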
Hawk model lists are dynamic and provider-scoped:
- `/refresh-model-catalog` refreshes the local cache at ~/.hawk/model_catalog.json.
- `/debug-model-catalog` shows source, timestamp, and per-provider counts.
- OpenRouter model catalog entries are fetched live when an OpenRouter key is configured.
OpenAI:

```bash
bun run profile:init -- --provider openai --api-key sk-... --model gpt-4o
hawk
```

Anthropic:

```bash
bun run profile:anthropic -- --api-key sk-ant-... --model claude-3-5-sonnet-latest
hawk
```

Grok / xAI:

```bash
bun run profile:grok -- --api-key xai-... --model grok-2
hawk
```

Environment-variable override (DeepSeek shown as an example):

```bash
export HAWK_CODE_USE_OPENAI=1
export OPENAI_API_KEY=sk-...
export OPENAI_BASE_URL=https://api.deepseek.com/v1
export OPENAI_MODEL=deepseek-chat
```

Ollama (local):

```bash
ollama pull llama3.2
bun run profile:init -- --provider ollama --model llama3.2 --base-url http://localhost:11434
hawk
```

| Command | Description |
|---|---|
| hawk | Start the CLI |
| hawk --version | Show version |
| hawk --help | Show help |
```bash
# Install dependencies
bun install

# Build
bun run build

# Run in development
bun run dev

# Validate environment
bun run doctor:runtime

# Quick sanity check
bun run smoke
```

Contributions are welcome! Please read our contributing guide first.
- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
See SECURITY.md for our security policy.
MIT License - see LICENSE for details.

