
zenii

Your machine's AI brain. One 20MB binary gives every tool, script, and cron job shared AI memory + 114 API routes. Desktop app, CLI, Telegram — all connected. Rust-powered.

README

Zenii (zen-ee-eye)

Zenii demo

One local AI backend for your scripts, tools, and agents.

Zenii runs a daemon on http://localhost:18981 so your desktop app, CLI, TUI, scripts, and MCP clients all use the same memory, tools, model providers, and permissions.

Zenii is for developers who want AI to behave like infrastructure instead of a browser tab. Run one local service. Call it from curl, scripts, cron jobs, or an MCP client. Use the same backend from the native desktop app, CLI, or TUI.

curl -fsSL https://raw.githubusercontent.com/sprklai/zenii/main/install.sh | bash
zenii-daemon &

# Store something once
curl -s -X POST http://localhost:18981/memory \
  -H "Content-Type: application/json" \
  -d '{"key":"deploy","content":"Production database is on port 5434"}' >/dev/null

# Ask through chat later
curl -s -X POST http://localhost:18981/chat \
  -H "Content-Type: application/json" \
  -d '{"session_id":"ops","prompt":"What port is the production database on?"}' | jq -r '.response'

That is the core value: write state once, use it from anywhere that talks to Zenii.
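The two curl calls above can be wrapped into small shell helpers so cron jobs and scripts can share state without repeating boilerplate. This is a sketch: the /memory and /chat routes come from the example above, but the function names, the ZENII_URL override, and the naive JSON quoting are our own assumptions (keys or content containing quotes would need proper escaping, e.g. via jq).

```shell
#!/usr/bin/env sh
# Hypothetical helpers around the /memory and /chat routes shown above.

ZENII_URL="${ZENII_URL:-http://localhost:18981}"

# remember KEY CONTENT -- store a fact once in shared memory
remember() {
  curl -s -X POST "$ZENII_URL/memory" \
    -H "Content-Type: application/json" \
    -d "{\"key\":\"$1\",\"content\":\"$2\"}"
}

# ask SESSION PROMPT -- query through chat, printing the raw JSON response
ask() {
  curl -s -X POST "$ZENII_URL/chat" \
    -H "Content-Type: application/json" \
    -d "{\"session_id\":\"$1\",\"prompt\":\"$2\"}"
}
```

With these sourced, a nightly cron job can `remember backup "last run OK"` and any later script can `ask ops "When did the backup last run?"` against the same daemon.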

What Zenii Is

  • A local daemon with a REST and WebSocket API at localhost:18981
  • A shared AI backend for the desktop app, CLI, TUI, scripts, and MCP clients
  • Persistent memory, provider routing, and tool execution in one local service
  • A native Rust/Tauri stack instead of an Electron wrapper

Good Fit

  • Local automations that need shared memory across scripts, bots, and tools
  • Developer tooling that wants one AI backend behind HTTP or MCP
  • Self-hosted workflows where privacy and local control matter
  • Projects that want a desktop UI and a scriptable backend without maintaining both separately

Current Product Boundaries

  • Zenii is not a hosted SaaS product
  • Zenii is not a drop-in OpenAI-compatible server today
  • Mobile is planned, but not shipped in this repository

What Ships Today

  • zenii-daemon: local API server
  • zenii: CLI client
  • zenii-tui: terminal UI
  • zenii-desktop: Tauri desktop app
  • zenii-mcp-server: MCP server for Claude Code, Cursor, and similar clients
  • 15 base tools, with channels, scheduler, and workflows tools available behind feature flags
  • 114 total API routes: 86 base routes and 28 feature-gated routes
  • MIT license

Install

Use the install script on Linux or macOS:

curl -fsSL https://raw.githubusercontent.com/sprklai/zenii/main/install.sh | bash
zenii-daemon &

Or download platform binaries and desktop packages from GitHub Releases.

See the repository for full platform notes, package names, and source builds.

Interfaces

Each surface and what it is best for:

  • zenii-daemon: Local API server for scripts, automations, and services
  • zenii: Quick prompts, shell pipelines, and terminal workflows
  • zenii-tui: Terminal-native interactive use
  • zenii-desktop: Native desktop UI on top of the same backend
  • zenii-mcp-server: Exposing Zenii tools to external coding agents
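For pipeline use, the `zenii` CLI can sit at the end of a shell pipe. The exact flags are not documented in this README, so the sketch below assumes the CLI takes the prompt as an argument and reads stdin as context; treat it as an illustration, not the documented interface.

```shell
# Hypothetical pipeline helper: assumes `zenii PROMPT` reads stdin as context.
summarize_diff() {
  git diff | zenii "Summarize this diff in two sentences"
}
```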

MCP Example

Add Zenii to .mcp.json:

{
  "mcpServers": {
    "zenii": {
      "command": "zenii-mcp-server",
      "args": ["--transport", "stdio"]
    }
  }
}
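Before pointing a client at the entry above, it is worth confirming the binary is actually on PATH (we assume here that the install script also installs `zenii-mcp-server`; the helper name below is ours):

```shell
# Report whether the MCP server binary from the config above is installed.
check_zenii_mcp() {
  if command -v zenii-mcp-server >/dev/null 2>&1; then
    echo "zenii-mcp-server: installed"
  else
    echo "zenii-mcp-server: not found on PATH"
  fi
}
check_zenii_mcp
```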

More integration detail lives in AGENT.md.


Contributing

Small documentation fixes, typo fixes, tests, and focused bug fixes can go straight to a PR. Larger feature work should start with CONTRIBUTING.md.

If Zenii is useful to you, star the repo: https://github.com/sprklai/zenii

License

MIT

Release History

app-v0.1.12 (urgency: High, 4/16/2026)
  See assets below to download and install.
  Full Changelog: https://github.com/sprklai/zenii/compare/app-v0.1.11...app-v0.1.12

app-v0.1.11 (urgency: High, 4/14/2026)
  See assets below to download and install.
  What's Changed:
    • chore(deps): bump rustls from 0.23.37 to 0.23.38 by @dependabot[bot] in https://github.com/sprklai/zenii/pull/54
    • chore(deps): bump openssl from 0.10.76 to 0.10.77 by @dependabot[bot] in https://github.com/sprklai/zenii/pull/53
    • chore(deps): bump softprops/action-gh-release from 2 to 3 by @dependabot[bot] in https://github.com/sprklai/zenii/pull/50
  Full Changelog: https://github.com/sprklai/zenii/compare/app-v0.1.10...app-v0.

app-v0.1.9 (urgency: High, 4/2/2026)
  What's New:
  MCP Server (Model Context Protocol): Zenii now ships a standalone `zenii-mcp-server` binary that exposes all registered tools to Claude Code, Cursor, Windsurf, and any MCP-compatible client via stdio transport. Quick setup for Claude Code, add to `.mcp.json`:

    {
      "mcpServers": {
        "zenii": {
          "command": "zenii-mcp-server",
          "args": ["--transport", "stdio"]
        }
      }
    }

  AGENT.md (AI Agent Integration Guide): New `AGENT.md` documents how an


Similar Packages

  • openyak (开源牛子): Rust-first local coding-agent CLI with a local /v1/threads server, plugins/skills, and Python/TypeScript SDK alphas. (main@2026-04-11)
  • git-mcp-rs: 🔍 Enable real-time exploration of GitHub repositories with this high-performance Model Context Protocol (MCP) server built in Rust. (main@2026-04-21)
  • ORBEXA: 🛒 Enable AI agents to shop for users seamlessly, protecting your brand and data while streamlining commerce across any platform. (main@2026-04-21)
  • kagglerun: 🚀 Run Python on Kaggle's free GPUs directly from your terminal without the need for a browser, streamlining your data science workflow. (master@2026-04-21)
  • Ollama-Terminal-Agent: Automate shell tasks using a local Ollama model that plans, executes, and fixes commands without cloud or API dependencies. (main@2026-04-21)