
gollem

Go framework for agentic AI apps with MCP and built-in tools


README

🤖 gollem

GO for Large LanguagE Model (GOLLEM)

gollem provides:

  • Common interface for sending prompts to Large Language Model (LLM) services
    • Generate / Stream: Generate text content from a prompt (with per-call option overrides)
    • GenerateEmbedding: Generate embedding vectors from text (OpenAI and Gemini)
  • Framework for building agentic LLM applications with
    • Tools by MCP (Model Context Protocol) server and your built-in tools
    • Automatic session management for continuous conversations
    • Portable conversational memory with history for stateless/distributed applications
    • Intelligent memory management with automatic history compaction
    • Middleware system for monitoring, logging, and controlling agent behavior

Supported LLMs

Install

go get github.com/m-mizutani/gollem

Quick Start

package main

import (
	"context"
	"fmt"
	"os"

	"github.com/m-mizutani/gollem"
	"github.com/m-mizutani/gollem/llm/openai"
)

func main() {
	ctx := context.Background()

	// Create LLM client
	client, err := openai.New(ctx, os.Getenv("OPENAI_API_KEY"))
	if err != nil {
		panic(err)
	}

	// Create session for one-time query
	session, err := client.NewSession(ctx)
	if err != nil {
		panic(err)
	}

	// Generate content
	result, err := session.Generate(ctx, []gollem.Input{gollem.Text("Hello, how are you?")})
	if err != nil {
		panic(err)
	}

	fmt.Println(result.Texts)
}

Features

Agent Framework

Build conversational agents with automatic session management and tool integration. Learn more →

agent := gollem.New(client,
	gollem.WithTools(&GreetingTool{}),
	gollem.WithSystemPrompt("You are a helpful assistant."),
)

// Session is managed automatically across calls
agent.Execute(ctx, "Hello!")
agent.Execute(ctx, "What did I just say?") // remembers context

Tool Integration

Define custom tools for LLMs to call, or connect external tools via MCP. Tools → | MCP →

// Custom tool - implement Spec() and Run()
type SearchTool struct{}

func (t *SearchTool) Spec() gollem.ToolSpec {
	return gollem.ToolSpec{
		Name:        "search",
		Description: "Search the database",
		Parameters:  map[string]*gollem.Parameter{
			"query": {Type: gollem.TypeString, Description: "Search query"},
		},
	}
}

func (t *SearchTool) Run(ctx context.Context, args map[string]any) (map[string]any, error) {
	// doSearch is your application's own search function, not part of gollem
	return map[string]any{"results": doSearch(args["query"].(string))}, nil
}

// MCP server - connect external tool servers
mcpClient, _ := mcp.NewStdio(ctx, "./mcp-server", []string{})
agent := gollem.New(client,
	gollem.WithTools(&SearchTool{}),
	gollem.WithToolSets(mcpClient),
)
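One practical note for Run implementations: when tool arguments arrive through JSON decoding into map[string]any, every JSON number becomes a float64, so asserting args["limit"].(int) will fail. That is standard encoding/json behavior, illustrated by this self-contained sketch (the argument names here are made up for the example):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// decodeArgs mimics how tool arguments typically reach Run:
// a JSON payload decoded into map[string]any.
func decodeArgs(payload []byte) (map[string]any, error) {
	var args map[string]any
	if err := json.Unmarshal(payload, &args); err != nil {
		return nil, err
	}
	return args, nil
}

// intArg safely extracts an integer argument: JSON numbers
// decode as float64, so assert float64 first, then convert.
func intArg(args map[string]any, key string) (int, bool) {
	f, ok := args[key].(float64)
	if !ok {
		return 0, false
	}
	return int(f), true
}

func main() {
	args, err := decodeArgs([]byte(`{"query": "golang", "limit": 10}`))
	if err != nil {
		panic(err)
	}
	query, _ := args["query"].(string) // JSON strings decode as string
	limit, _ := intArg(args, "limit")  // JSON numbers decode as float64
	fmt.Println(query, limit)
}
```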

Multimodal Input

Send images and PDFs alongside text prompts. Learn more →

img, _ := gollem.NewImage(imageBytes)
pdf, _ := gollem.NewPDFFromReader(file)

result, _ := session.Generate(ctx, []gollem.Input{img, pdf, gollem.Text("Describe these.")})

Structured Output

Constrain LLM responses to a JSON Schema. Learn more →

schema, _ := gollem.ToSchema(UserProfile{})
session, _ := client.NewSession(ctx,
	gollem.WithSessionContentType(gollem.ContentTypeJSON),
	gollem.WithSessionResponseSchema(schema),
)
resp, _ := session.Generate(ctx, []gollem.Input{gollem.Text("Extract: John, 30, john@example.com")})
// resp.Texts[0] is valid JSON matching the schema

For one-shot queries, Query[T]() combines schema generation, session creation, LLM call, and JSON parsing into a single generic function call with automatic retry on parse failures:

type UserProfile struct {
	Name  string `json:"name" description:"User's full name"`
	Age   int    `json:"age" description:"Age in years"`
	Email string `json:"email" description:"Email address"`
}

result, _ := gollem.Query[UserProfile](ctx, client, "Extract: John, 30, john@example.com",
	gollem.WithQuerySystemPrompt("You are a data extractor."),
)
// result.Data is *UserProfile — type-safe, already parsed

To run a structured query on an existing session (preserving conversation history), use SessionQuery[T]():

// session already has conversation context from prior Generate calls
resp, _ := gollem.SessionQuery[UserProfile](ctx, session, "Who am I?")
// resp.Data is *UserProfile, parsed from the LLM's JSON response
// The session's history (including this exchange) is preserved
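The description tags on UserProfile suggest that ToSchema reflects over struct tags to build the schema. As an illustration of that underlying mechanism only (not gollem's actual implementation), here is a stdlib sketch that collects each field's JSON name and description:

```go
package main

import (
	"fmt"
	"reflect"
	"strings"
)

// fieldDocs walks a struct type and pairs each field's JSON name
// with its description tag: the raw material a schema builder needs.
func fieldDocs(v any) map[string]string {
	docs := make(map[string]string)
	t := reflect.TypeOf(v)
	for i := 0; i < t.NumField(); i++ {
		f := t.Field(i)
		// "name,omitempty" -> "name"
		name := strings.Split(f.Tag.Get("json"), ",")[0]
		if name == "" || name == "-" {
			continue
		}
		docs[name] = f.Tag.Get("description")
	}
	return docs
}

type UserProfile struct {
	Name  string `json:"name" description:"User's full name"`
	Age   int    `json:"age" description:"Age in years"`
	Email string `json:"email" description:"Email address"`
}

func main() {
	for name, desc := range fieldDocs(UserProfile{}) {
		fmt.Printf("%s: %s\n", name, desc)
	}
}
```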

Middleware

Monitor, log, and control agent behavior with composable middleware. Learn more →

agent := gollem.New(client,
	gollem.WithToolMiddleware(func(next gollem.ToolHandler) gollem.ToolHandler {
		return func(ctx context.Context, req *gollem.ToolExecRequest) (*gollem.ToolExecResponse, error) {
			log.Printf("Tool called: %s", req.Tool.Name)
			return next(ctx, req)
		}
	}),
)
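The func(next Handler) Handler shape composes naturally: wrapping order determines execution order, with the first-applied middleware running outermost. Below is a self-contained sketch of that pattern using a stand-in handler type (the real gollem types differ; see its docs):

```go
package main

import "fmt"

// Handler is a stand-in for a tool-execution handler.
type Handler func(input string) string

// Middleware wraps a Handler, matching the func(next) Handler
// shape used by middleware options like WithToolMiddleware.
type Middleware func(next Handler) Handler

// chain applies middlewares so the first listed runs outermost.
func chain(h Handler, mws ...Middleware) Handler {
	for i := len(mws) - 1; i >= 0; i-- {
		h = mws[i](h)
	}
	return h
}

func main() {
	logging := func(next Handler) Handler {
		return func(input string) string {
			fmt.Println("before:", input)
			out := next(input)
			fmt.Println("after:", out)
			return out
		}
	}
	exclaim := func(next Handler) Handler {
		return func(input string) string {
			return next(input) + "!"
		}
	}
	h := chain(func(s string) string { return "ran " + s }, logging, exclaim)
	fmt.Println(h("search")) // logging wraps exclaim, which wraps the base handler
}
```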

Strategy Pattern

Swap execution strategies: simple, ReAct, or Plan & Execute. Learn more →

import "github.com/m-mizutani/gollem/strategy/planexec"

agent := gollem.New(client,
	gollem.WithStrategy(planexec.New(client)),
	gollem.WithTools(&SearchTool{}, &AnalysisTool{}),
)

Tracing

Observe agent execution with pluggable backends (in-memory, OpenTelemetry). Learn more →

import "github.com/m-mizutani/gollem/trace"

rec := trace.New(trace.WithRepository(trace.NewFileRepository("./traces")))
agent := gollem.New(client, gollem.WithTrace(rec))

History Management

Portable conversation history for stateless/distributed applications. Learn more →

// Export history for persistence
history := agent.Session().History()
data, _ := json.Marshal(history)

// Restore in another process
var restored gollem.History
json.Unmarshal(data, &restored)
agent := gollem.New(client, gollem.WithHistory(&restored))

For automatic persistence, implement HistoryRepository and pass it via WithHistoryRepository. gollem then loads history at the start of a session and saves it after every LLM round-trip — no manual marshaling required.

agent := gollem.New(client,
    gollem.WithHistoryRepository(repo, "session-id"),
)

// History is loaded automatically on first Execute, and saved after each round-trip
err := agent.Execute(ctx, gollem.Text("Hello!"))

Examples

See the examples directory for complete working examples:

  • Simple: Minimal example for getting started
  • Query: Type-safe structured query with Query[T]()
  • Basic: Simple agent with custom tools
  • Chat: Interactive chat application
  • MCP: Integration with MCP servers
  • Tools: Custom tool development
  • JSON Schema: Structured output with JSON Schema validation
  • Embedding: Text embedding generation
  • Tracing: Agent execution tracing with file persistence

Documentation

License

Apache 2.0 License. See LICENSE for details.

Release History

  • v0.24.2 (urgency: High, 4/12/2026)
    • fix(trace): populate request messages in LLM call trace data by @m-mizutani in https://github.com/m-mizutani/gollem/pull/134
    • Full Changelog: https://github.com/m-mizutani/gollem/compare/v0.24.1...v0.24.2
  • v0.24.1 (urgency: High, 4/10/2026)
    • feat(frontend): display token counts in span tree by @m-mizutani in https://github.com/m-mizutani/gollem/pull/130
    • fix(frontend): upgrade recharts to v3 to resolve lodash vulnerabilities by @m-mizutani in https://github.com/m-mizutani/gollem/pull/131
    • chore(deps): update Go 1.26, dependencies, and CI actions by @m-mizutani in https://github.com/m-mizutani/gollem/pull/133
    • Full Changelog: https://github.com/m-mizutani/gollem/compare/v0.24.0...v0.24.1


Similar Packages

  • ralphglasses (v0.2.0): Multi-LLM agent orchestration TUI with parallel Claude/Gemini/Codex sessions and 126 MCP tools
  • mcp-firewall (main@2026-04-21): 🛡 Enforce security policies, redact data, sandbox processes, and verify integrity for Model Context Protocol (MCP) server communication.
  • tweetsave-mcp (main@2026-04-21): 📝 Fetch Twitter/X content and convert it into blog posts using the MCP server for seamless integration and easy content management.
  • walmart-mcp (main@2026-04-21): 🛒 Connect AI agents to Walmart's ecosystem using the Model Context Protocol for real-time data access and enhanced product search capabilities.
  • tekmetric-mcp (main@2026-04-21): 🔍 Ask questions about your shop data in natural language and get instant answers about appointments, customers, and repair orders with Tekmetric MCP.