Gormes

OPEN SOURCE · MIT LICENSE · EARLY SCOUT RELEASE

Run Agents From One Go Binary.

Gormes runs AI agents as a single static binary.

No Python runtime. No virtualenv repair. No backend service just to open the UI.

Start offline, prove the machine works, then add provider and gateway credentials.

Scout release. Useful today, still early.

Offline TUI, doctor diagnostics, provider one-shots, Goncho memory, dashboard, and configured Telegram/Discord/Slack paths are covered. Full Hermes parity is still hardening.

~22.2 MB static binary · Source build recommended · MIT License · Scout release

INSTALL

Build it. Prove it offline.

Start with the inspectable source path. The first proof does not need credentials, a model call, Python, Docker, or Hermes. No runtime Node or npm is required.

1. BUILD FROM SOURCE

git clone https://github.com/TrebuchetDynamics/gormes-agent.git
cd gormes-agent
make build

2. OFFLINE TUI

./bin/gormes --offline

3. LOCAL DOCTOR

./bin/gormes doctor --offline

Provider setup, gateway setup, and convenience installers come after the offline proof. Read the install docs ->

What works today

  • Run a local agent UI with zero runtime dependencies on the offline path
  • Send one-shot prompts to a provider-compatible endpoint
  • Validate your environment before spending tokens
  • Operate configured Telegram, Discord, or Slack agents from one binary
  • Inspect and debug agent memory locally with Goncho
  • Browse sessions, config, skills, and logs in the local dashboard

WHY GORMES

Python-stack agents fail for boring reasons.

The model is not usually the fragile part. Operations are:

  • dev, staging, and prod stop matching
  • virtualenvs and package wheels drift across hosts
  • long turns die on dropped streams
  • tool wiring fails after tokens are already burning

Gormes cuts out that failure class.

Single Static Binary

CGO_ENABLED=0 release builds produce a ~22.2 MB artifact for the runtime surface.

Offline Proof

./bin/gormes --offline starts the native TUI without credentials, network calls, Python, Node, Docker, or Hermes.

Built-In Doctor

./bin/gormes doctor --offline checks local readiness before provider calls or token spend.

Provider Turns

One-shots and the TUI use configured provider-compatible endpoints from the same binary.
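"Provider-compatible" here means an endpoint that accepts the common chat-completions request shape. A representative payload (the model name and field values are illustrative, not Gormes defaults):

```json
{
  "model": "example-model",
  "messages": [
    {"role": "user", "content": "Summarize today's open sessions."}
  ]
}
```

Any endpoint that speaks this shape, hosted or local, can serve one-shots and TUI turns from the same binary.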

Local Goncho Memory

Sessions, durable context, diagnostics, and queue state stay in local SQLite.
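Because the store is plain local SQLite, it can be inspected with the stock `sqlite3` shell. The table and column names below are illustrative assumptions, not the real Goncho schema:

```shell
# Illustrative only: schema names are assumptions, not Goncho's actual layout.
# The point is that agent memory is an ordinary local SQLite file.
db=$(mktemp /tmp/goncho-demo.XXXXXX)
sqlite3 "$db" <<'SQL'
CREATE TABLE sessions (id TEXT PRIMARY KEY, channel TEXT, started_at TEXT);
INSERT INTO sessions VALUES ('s1', 'tui', '2025-01-01T00:00:00Z');
SQL
sqlite3 "$db" 'SELECT id, channel FROM sessions;'
```

No server process, no ORM: one file on disk that standard tooling can query, back up, or diff.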

Visible Limits

Full Hermes parity, broad channel parity, voice/TTS, MCP/plugin parity, and release hardening remain in progress.

BUILD STATE

Useful today, still early.

Current focus

  • Offline TUI, doctor diagnostics, provider one-shots, dashboard, and Goncho memory
  • Configured Telegram and Discord gateways; Slack with complete Socket Mode credentials
  • Go-native tool registry, web/browser tools, and subagent safety

Next milestone

Production-stable Go-native runtime with signed releases and broader Hermes parity

Full phase-by-phase checklist

Phase 1 — The Dashboard

SHIPPED · 4/4
Done 1.A Core TUI
Done 1.B Wire Doctor
Done 1.C Automation Reliability
Done 1.D Skill-Driven Control Plane

Phase 2 — The Gateway

SHIPPED · 21/21
Done 2.A Tool Registry
Done 2.B.1 Telegram Scout
Done 2.B.2 Gateway Chassis + Discord
Done 2.B.3 Slack on Shared Chassis
Done 2.B.4 WhatsApp Adapter
Done 2.B.5 Session Context + Delivery Routing
Done 2.B.10 WeChat Adapter
Done 2.B.11 Discord Forum Channels
Done 2.B.12 Channel-Neutral Native Runtime Adapter
Done 2.C Thin Mapping Persistence
Done 2.D Cron / Scheduled Automations
Done 2.E.0 OS-AI Spine: Deterministic Subagent Runtime
Done 2.E.1 OS-AI Spine: Delegation Policy + Child Execution
Done 2.E.2 OS-AI Spine: Concurrent-Tool Cancellation
Done 2.E.3 OS-AI Spine: Durable Job Resilience
Done 2.F.1 Slash Command Registry + Gateway Dispatch
Done 2.F.2 Hook Registry + BOOT.md
Done 2.F.3 Restart / Pairing / Status
Done 2.F.4 Home Channel + Operator Surfaces
Done 2.F.5 Gateway Mid-Run Steering + Active-Turn Policy
Done 2.G OS-AI Spine: Skills Runtime

Phase 3 — The Black Box (Memory)

SHIPPED · 15/15
Done 3.A SQLite + FTS5 Lattice
Done 3.B Ontological Graph + LLM Extractor
Done 3.C Neural Recall + Context Injection
Done 3.D Semantic Fusion + Local Embeddings
Done 3.D.5 Memory Mirror (USER.md sync)
Done 3.E.1 Session Index Mirror
Done 3.E.2 Tool Execution Audit Log
Done 3.E.3 Transcript Export Command
Done 3.E.4 Extraction State Visibility
Done 3.E.5 Insights Audit Log
Done 3.E.6 Memory Decay
Done 3.E.7 Cross-Chat Synthesis
Done 3.E.8 Session Lineage + Cross-Source Search
Done 3.F Goncho Honcho Memory Parity
Done 3.G Goncho Drop-In Compatibility Closure

Phase 4 — The Brain Transplant

IN PROGRESS · 7/13
Now 4.A Provider Adapters
Now 4.B Context Engine + Compression
Done 4.C Native Prompt Builder
Now 4.D Smart Model Routing
Now 4.E Trajectory + Insights
Done 4.F Title Generation
Done 4.G Credentials + OAuth
Done 4.H Rate / Retry / Caching
Done 4.I Native Agent Turn Closure
Done 4.J Permission-Hardened Tool Execution
Done 4.K Provider Fallback Chain
Next 4.L Safety-Anchored Turn Loop (MOSAIC)
Next 4.M Advanced Provider Routing

Phase 5 — The Final Purge

IN PROGRESS · 5/22
Now 5.A Tool Surface Port
Now 5.B Sandboxing Backends
Now 5.C Browser Automation
Now 5.D Vision + Image Generation
Now 5.E TTS / Voice / Transcription
Now 5.F Skills System (Remaining)
Done 5.G MCP Integration
Now 5.H ACP Integration
Now 5.I Plugins Architecture
Now 5.J Approval / Security Guards
Done 5.K Code Execution
Now 5.L File Ops + Patches
Done 5.M Mixture of Agents
Now 5.N Misc Operator Tools
Now 5.O Hermes CLI Parity
Now 5.P Docker / Packaging
Now 5.Q API Server + TUI Gateway Streaming
Next 5.R Code Execution Mode Policy
Done 5.S Loop Detection
Done 5.T Browser Harness Doctor
Next 5.U Fault-Tolerant Sandbox Execution
Next 5.V Unified Event Bus

Phase 6 — The Learning Loop (Soul)

IN PROGRESS · 6/12
Next 6.A Complexity Detector
Done 6.B Skill Extractor
Done 6.C Skill Storage Format
Done 6.D Skill Retrieval + Matching
Next 6.E Feedback Loop
Now 6.F Skill Surface
Done 6.G Structured Memory Types
Done 6.H Skill Metadata Placement
Done 6.I Zero-LLM Knowledge Graph
Next 6.J Agentic Memory Lifecycle (AgeMem)
Next 6.K Self-Evolution Engine (GEPA)
Next 6.L Composable Skill Execution (Voyager)

Phase 7 — Paused Channel Backlog

IN PROGRESS · 2/5
Now 7.A Signal Adapter
Done 7.B Email + SMS Adapters
Now 7.C Matrix + Mattermost Adapters
Done 7.D Webhook + Trigger Ingress
Now 7.E Regional + Device Adapter Backlog

Start offline. Add credentials later.

The offline path proves the runtime before provider calls, gateway traffic, or token spend.