Soul-driven AI agent with permission-hardened tools, token budgets, and multi-channel access.
Runs 24/7 from CLI or Telegram. 21 built-in tools. Extensible skills. Asks before it acts.
```
npx @cosmicstack/mercury-agent
```

Or install globally:

```
npm i -g @cosmicstack/mercury-agent
mercury
```

First run triggers the setup wizard – enter your name, an API key, and optionally a Telegram bot token. Takes 30 seconds.

To reconfigure later (change keys, name, settings):

```
mercury doctor
```

Every AI agent can read files, run commands, and fetch URLs. Most do it silently. Mercury asks first.
- Permission-hardened – Shell blocklist (`sudo`, `rm -rf /`, etc. never execute). Folder-level read/write scoping. Pending approval flow. Skill elevation with granular `allowed-tools`. No surprises.
- Soul-driven – Personality defined by markdown files you own (`soul.md`, `persona.md`, `taste.md`, `heartbeat.md`). No corporate wrapper.
- Token-aware – Daily budget enforcement. Auto-concise when over 70%. `/budget` command to check, reset, or override.
- Multi-channel – CLI with real-time streaming. Telegram with HTML formatting, file uploads, and typing indicators.
- Always on – Run as a background daemon on any OS. Auto-restarts on crash. Starts on boot. Cron scheduling, heartbeat monitoring, and proactive notifications.
- Extensible – Install community skills with a single command. Schedule skills as recurring tasks. Based on the Agent Skills specification.
One command to make Mercury persistent:

```
mercury up
```

This installs the system service (if not installed), starts the background daemon, and ensures Mercury is running. Use this as your go-to command.

If Mercury is already running, `mercury up` just confirms it and shows the PID.
```
mercury restart   # Restart the background process
mercury stop      # Stop the background process
mercury start -d  # Start in background (without service install)
mercury logs      # View recent daemon logs
mercury status    # Show if daemon is running
```

Daemon mode includes built-in crash recovery – if the process crashes, it restarts automatically with exponential backoff (up to 10 restarts per minute).
`mercury up` installs this automatically. You can also manage it directly:

```
mercury service install
```

| Platform | Method | Requires Admin |
|---|---|---|
| macOS | LaunchAgent (`~/Library/LaunchAgents/`) | No |
| Linux | systemd user unit (`~/.config/systemd/user/`) | No (linger for boot) |
| Windows | Task Scheduler (`schtasks`) | No |

```
mercury service status     # Check if service is running
mercury service uninstall  # Remove the system service
```

In daemon mode, Telegram becomes your primary channel – the CLI is log-only since there's no terminal for input.
| Command | Description |
|---|---|
| `mercury up` | Recommended. Install service + start daemon + ensure running |
| `mercury` | Start the agent (same as `mercury start`) |
| `mercury start` | Start in foreground |
| `mercury start -d` | Start in background (daemon mode) |
| `mercury restart` | Restart the background process |
| `mercury stop` | Stop the background process |
| `mercury logs` | View recent daemon logs |
| `mercury doctor` | Reconfigure (Enter keeps current values) |
| `mercury setup` | Re-run the setup wizard |
| `mercury status` | Show config and daemon status |
| `mercury help` | Show the full manual |
| `mercury service install` | Install as system service (auto-start on boot) |
| `mercury service uninstall` | Uninstall the system service |
| `mercury service status` | Show system service status |
| `mercury --verbose` | Start with debug logging |
Type these during a conversation – they don't consume API tokens and work on both CLI and Telegram.
| Command | Description |
|---|---|
| `/help` | Show the full manual |
| `/status` | Show agent config, budget, and usage |
| `/tools` | List all loaded tools |
| `/skills` | List installed skills |
| `/stream` | Toggle Telegram text streaming |
| `/stream off` | Disable streaming (single message) |
| `/budget` | Show token budget status |
| `/budget override` | Override the budget for one request |
| `/budget reset` | Reset usage to zero |
| `/budget set <n>` | Change the daily token budget |
| Category | Tools |
|---|---|
| Filesystem | `read_file`, `write_file`, `create_file`, `edit_file`, `list_dir`, `delete_file`, `send_file` |
| Shell | `run_command`, `approve_command` |
| Git | `git_status`, `git_diff`, `git_log`, `git_add`, `git_commit`, `git_push` |
| Web | `fetch_url` |
| Skills | `install_skill`, `list_skills`, `use_skill` |
| Scheduler | `schedule_task`, `list_scheduled_tasks`, `cancel_scheduled_task` |
| System | `budget_status` |
| Channel | Features |
|---|---|
| CLI | Readline prompt, real-time text streaming, markdown rendering, file display |
| Telegram | HTML formatting, file uploads (photos, audio, video, documents), typing indicators, /budget commands |
- Recurring: `schedule_task` with cron expressions (`0 9 * * *` for daily at 9am)
- One-shot: `schedule_task` with `delay_seconds` (e.g. 15 seconds)
- Tasks persist to `~/.mercury/schedules.yaml` and restore on restart
- Responses route back to the channel where the task was created
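Persisted tasks live in `~/.mercury/schedules.yaml`. The real schema isn't documented here, so this sketch only illustrates the two task shapes (recurring and one-shot) – all field names are assumptions:

```yaml
# Hypothetical layout of ~/.mercury/schedules.yaml.
# Field names are illustrative, not the actual schema.
tasks:
  - id: morning-briefing
    prompt: "Summarize overnight notifications"
    cron: "0 9 * * *"      # recurring: every day at 9am
    channel: telegram       # responses route back to this channel
  - id: deploy-reminder
    prompt: "Remind me to check the deploy"
    delay_seconds: 15       # one-shot: fires once after 15 seconds
    channel: cli
```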
All runtime data lives in `~/.mercury/` – not in your project directory.
| Path | Purpose |
|---|---|
| `~/.mercury/mercury.yaml` | Main config (providers, channels, budget) |
| `~/.mercury/soul/*.md` | Agent personality (soul, persona, taste, heartbeat) |
| `~/.mercury/permissions.yaml` | Capabilities and approval rules |
| `~/.mercury/skills/` | Installed skills |
| `~/.mercury/schedules.yaml` | Scheduled tasks |
| `~/.mercury/token-usage.json` | Daily token usage tracking |
| `~/.mercury/memory/` | Short-term, long-term, episodic memory |
| `~/.mercury/daemon.pid` | Background process PID |
| `~/.mercury/daemon.log` | Daemon mode logs |
Configure multiple LLM providers. Mercury tries them in order and falls back automatically:

- DeepSeek – default, cost-effective
- OpenAI – GPT-4o-mini and others
- Anthropic – Claude and others
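Provider order and the daily token budget are set in `~/.mercury/mercury.yaml` (or via `mercury doctor`). The keys below are illustrative assumptions about that file's shape, not its actual schema:

```yaml
# Hypothetical excerpt of ~/.mercury/mercury.yaml.
# Key names are illustrative; run `mercury doctor` to edit the real config.
providers:                 # tried in order; next one used on failure
  - name: deepseek
    api_key: YOUR_KEY_HERE
  - name: openai
    model: gpt-4o-mini
    api_key: YOUR_KEY_HERE
budget:
  daily_tokens: 200000     # enforced daily; auto-concise past 70%
```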
- TypeScript + Node.js 20+ – ESM, tsup build, zero native dependencies
- Vercel AI SDK v4 – `generateText` + `streamText`, 10-step agentic loop, provider fallback
- grammY – Telegram bot with typing indicators and file uploads
- Flat-file persistence – no database, just YAML + JSON in `~/.mercury/`
- Daemon manager – background spawn + PID file + watchdog crash recovery
- System services – macOS LaunchAgent, Linux systemd, Windows Task Scheduler
MIT © Cosmic Stack
This is an AI agent – it can make mistakes and break things, so use it at your own risk.
For suggestions, contributions, or any inquiries, please reach out to us at support@cosmicstack.org.

