
soulshack

soulshack, an IRC chatbot supporting the OpenAI, Ollama, Gemini, and Anthropic APIs, with basic shell tools and MCP server support.


README

Soulshack User Guide


Soulshack is an advanced IRC chatbot powered by LLMs, designed to bridge traditional chat with modern AI capabilities.

Features

  • Multi-Provider Support: Works with OpenAI, Anthropic, Google Gemini, and Ollama.
  • Unified Tool System: Supports shell scripts, MCP servers, and native IRC tools.
  • Secure: Full SSL/TLS and SASL authentication support.
  • Session Management: Configurable history, context window, and session TTL.
  • Streaming: Real-time responses with IRC-appropriate chunking.
  • Passive Mode: Optional URL watching and analysis.
  • Runtime Configuration: Manage settings via IRC commands.

Quickstart

Option 1: Docker

docker build . -t soulshack:dev

Option 2: Build from Source

Prerequisites: Go 1.23+

  1. Clone and Build:

    git clone https://github.com/pkdindustries/soulshack.git
    cd soulshack
    go build -o soulshack cmd/soulshack/main.go
  2. Run:

    Configuration File (Recommended)

    Local Binary:

    ./soulshack --config examples/chatbot.yml

    Docker:

    # Mount config file to container
    docker run -v $(pwd)/examples/chatbot.yml:/config.yml soulshack:dev \
      --config /config.yml

    All Flags (Kitchen Sink)

    Local Binary:

    ./soulshack \
      --nick soulshack \
      --server irc.example.com \
      --port 6697 \
      --tls \
      --channel '#soulshack' \
      --saslnick mybot \
      --saslpass mypassword \
      --admins "admin!*@*" \
      --model openai/gpt-5.1 \
      --openaikey "sk-..." \
      --maxtokens 4096 \
      --temperature 1 \
      --apitimeout 5m \
      --tool "examples/tools/datetime.sh" \
      --tool "irc__op" \
      --thinkingeffort off \
      --urlwatcher \
      --verbose

    Docker:

    docker run soulshack:dev \
      --nick soulshack \
      --server irc.example.com \
      --port 6697 \
      --tls \
      --channel '#soulshack' \
      --saslnick mybot \
      --saslpass mypassword \
      --admins "admin!*@*" \
      --model openai/gpt-5.1 \
      --openaikey "sk-..." \
      --maxtokens 4096 \
      --temperature 1 \
      --apitimeout 5m \
      --thinkingeffort off \
      --urlwatcher \
      --verbose
    # Note: Local file tools/scripts require volume mounts to work in Docker

    Ollama (Local)

    Local Binary:

    ./soulshack \
      --server irc.example.com \
      --channel '#soulshack' \
      --model ollama/qwen3:30b \
      --ollamaurl "http://localhost:11434"

    Docker:

    # Use --network host to access Ollama on localhost
    docker run --network host soulshack:dev \
      --server irc.example.com \
      --channel '#soulshack' \
      --model ollama/qwen3:30b \
      --ollamaurl "http://localhost:11434"

    Anthropic

    Local Binary:

    ./soulshack \
      --server irc.example.com \
      --channel '#soulshack' \
      --model anthropic/claude-opus-4.5 \
      --anthropickey "sk-ant-..."

    Docker:

    docker run soulshack:dev \
      --server irc.example.com \
      --channel '#soulshack' \
      --model anthropic/claude-opus-4.5 \
      --anthropickey "sk-ant-..."

Configuration Flags

| Flag | Default | Description |
|------|---------|-------------|
| `-n, --nick` | `soulshack` | Bot nickname |
| `-s, --server` | `localhost` | IRC server address |
| `-p, --port` | `6667` | IRC server port |
| `-c, --channel` | | Channel to join |
| `-e, --tls` | `false` | Enable TLS |
| `--tlsinsecure` | `false` | Skip TLS certificate verification |
| `--saslnick` | | SASL username |
| `--saslpass` | | SASL password |
| `-b, --config` | | Path to YAML config file |
| `-A, --admins` | | Comma-separated admin hostmasks |
| `-V, --verbose` | `false` | Enable debug logging |
| `--model` | `ollama/llama3.2` | LLM model (`provider/name`) |
| `--maxtokens` | `4096` | Max tokens per response |
| `--temperature` | `0.7` | Sampling temperature |
| `-t, --apitimeout` | `5m` | API request timeout |
| `--openaikey` | | OpenAI API key |
| `--anthropickey` | | Anthropic API key |
| `--geminikey` | | Google Gemini API key |
| `--ollamaurl` | `http://localhost:11434` | Ollama API endpoint |
| `--tool` | | Path to tool definition (repeatable) |
| `--thinkingeffort` | `off` | Reasoning effort level: `off`, `low`, `medium`, `high` |
| `--urlwatcher` | `false` | Enable passive URL watching |

YAML Configuration

Create a config.yml file:

server:
  nick: "soulshack"
  server: "irc.example.com"
  port: 6697
  channel: "#soulshack"
  tls: true

bot:
  admins: ["nick!user@host"]
  tools:
    - "examples/tools/datetime.sh"
    - "examples/mcp/filesystem.json"

Run with: ./soulshack --config config.yml
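
Shell-script tools such as the `examples/tools/datetime.sh` referenced above are ordinary executables. The exact argument/JSON contract soulshack's tool loader expects is not documented in this guide, so the following is only a minimal hypothetical sketch of such a script's core job: printing the current UTC time for the model to use.

```shell
#!/bin/sh
# Hypothetical sketch of a datetime tool body. The real
# examples/tools/datetime.sh must also satisfy soulshack's tool
# loader protocol, which this guide does not describe.
date -u +"%Y-%m-%dT%H:%M:%SZ"
```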

Commands

| Command | Admin? | Description |
|---------|--------|-------------|
| `/help` | No | Show available commands |
| `/version` | No | Show bot version |
| `/tools` | No | List loaded tools |
| `/tools add <spec>` | Yes | Add a tool at runtime |
| `/tools remove <pattern>` | Yes | Remove a tool |
| `/admins` | Yes | List admins |
| `/admins add <hostmask>` | Yes | Add an admin |
| `/set <key> <value>` | Yes | Set config parameter |
| `/get <key>` | No | Get config parameter |
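
For illustration, a runtime configuration session might look like this (the command names come from the table above; the bot's reply wording here is hypothetical):

```
<admin> /set temperature 0.5
<soulshack> temperature set to 0.5
<user> /get temperature
<soulshack> temperature: 0.5
```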

Built-in Tools

Soulshack comes with native IRC management tools (permissions apply):

  • irc_op, irc_deop: Grant/revoke operator status.
  • irc_kick, irc_ban, irc_unban: User management.
  • irc_topic: Set channel topic.
  • irc_invite: Invite users to channel.
  • irc_mode_set, irc_mode_query: Manage channel modes.
  • irc_names, irc_whois: User information.
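
As an illustrative (not verbatim) exchange, assuming the bot holds channel operator status and has the topic tool loaded, the model can invoke a tool from a conversational request:

```
<admin> soulshack: set the topic to release day
* soulshack has changed the topic to: release day
```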

Documentation


Named as a tribute to my old friend dayv, sp0t, who I think of often.

Release History

  • v0.93 (12/13/2025, urgency Low): New command `/stats` displays comprehensive session statistics: token usage (input/output), context capacity percentage, message counts by role (user, assistant, tool), participant count, and session TTL with expiry countdown. `/tools load` and `/tools rm` added as aliases for `add` and `remove`. New flags: `--maxcontext` sets maximum context window tokens (enables capacity warnings); `--channelkey` joins password-protected channels; `--stream` toggl…
  • v0.92 (12/3/2025, urgency Low): Docs: cleaned up readme.md; added docs/architecture.md, docs/contributing.md, and an architecture diagram. Internal: abstracted tooling from the girc implementation. Bugfixes: fixed the OpenAI endpoint URL.
  • v0.91 (12/2/2025, urgency Low): Uses the latest [polly](https://github.com/alexschlessinger/pollytool) for tool-use improvements, including parallel tool calls. New flag: `--urlwatcher` (respond to URLs without being addressed, for use with e.g. a browser tool). New commands: `/admins` (list current admins), `/admins add <hostmask>`, `/admins remove <hostmask>`, `/tools` (list loaded tools), `/tools add <path>`, `/tools remove <pattern>` (unload by name or wildc…
  • v0.89 (8/19/2025, urgency Low): Refactored to use the [polly](https://github.com/alexschlessinger/pollytool) package for LLM handling, gaining real streaming and a net -1800 LOC for the bot. New: `--thinking` to enable reasoning; `--showthinkingactions` to announce delays due to reasoning after 5s; `--showtoolactions` to show tool calls as an IRC action. Breaking renames: `--mcptools` -> `--mcptool`; `--shelltools` -> `--shelltool`; `--irctools` -> `--irctool`. Full changelog: https://github.com/pkdindustries/soulshack/co…
  • v0.87 (8/10/2025, urgency Low): Resolved an issue where multiple tool calls in a single assistant response would cause "tool_call_id did not have response messages" errors.
  • v0.86 (8/10/2025, urgency Low): New features: full MCP (Model Context Protocol) server integration; soulshack can now connect to and use MCP servers as tool providers; dynamic tool loading automatically discovers and loads tools from MCP servers at startup; new `mcpservers` configuration block in YAML config files; new `--mcpservers` flag to specify MCP servers directly. Bug fixes: fixed array property handling for all L…
  • v0.82 (8/8/2025, urgency Low): Multi-provider LLM support: Anthropic Claude, Google Gemini, and Ollama alongside OpenAI. Models are now specified with a provider prefix (`provider/model`), e.g. `openai/gpt-4o`, `anthropic/claude-3-7-sonnet`, `gemini/gemini-2.5-flash`, `ollama/llama3.2`. What's new: Anthropic Claude support with tool calling; Google Gemini integration with tools; Ollama for local model hosting; refactored tool system with provider-specific converters; increas…
  • v0.74 (10/21/2024, urgency Low): Features: tool use now supported on Ollama-served models (see examples/ollamabot.yml); added examples/tools/documents.py, an example tool for fetching documents or news articles. Config: `--openaikey` renamed `--apikey`; `--openapiurl` renamed `--apiurl`; `--stream <bool>` introduced to toggle API streaming support; `--admins` now uses hostmasks. Streamlined shell tool API: the `--name` and `--description` command interfaces have been removed from…
  • v0.72 (10/13/2024, urgency Low): Release v0.7 highlights: cleaner, more streamlined configuration options; support for OpenAI tool use (function calling); sample tools for weather, Unix system info, datetime, and more; built-in IRC tools for /op, /topic, and /kick; various bug fixes and stability enhancements. New configuration options: `--top_p` (Top P value for the completion, default 1); `--tools` (enable tool use, default false); `--toolsdir` (directory to load tools from, default "examples/tools…
  • v0.42.17 (8/8/2024, urgency Low): Added SASL PLAIN auth by @terminaldweller (https://github.com/pkdindustries/soulshack/pull/10); dependabot updates for GitHub Actions by @tvalenta (pull/12); a run through the linter by @tvalenta (pull/15); gh CLI for releases by @tvalenta (pull/28); allow an alternate OpenAI API URL with `--openaiurl` by @lstarnes1024…
  • v0.42.16-alpha (4/17/2023, urgency Low): Release v0.42.16-alpha.
  • v0.42.15-alpha (4/14/2023, urgency Low): Release v0.42.15-alpha.
  • v0.42.9-alpha (4/11/2023, urgency Low): Release v0.42.9-alpha.

