
2.3.68 Satellite OpenCode

av edited this page Jan 17, 2026 · 2 revisions

Handle: opencode
URL: http://localhost:34711

OpenCode Screenshot

OpenCode is an AI-powered coding assistant that provides a server API, terminal UI, desktop app, and IDE extensions. It supports multiple LLM providers (Anthropic, OpenAI, Google, OpenRouter, etc.), offers advanced agentic workflows with built-in tools (file operations, web search, LSP integration), and enables multi-session management with context awareness.

Key Features:

  • Server API with OpenAPI documentation at /doc
  • Terminal UI (TUI) via opencode attach
  • Multiple LLM provider support (local via Ollama or cloud)
  • Built-in tools: file operations, web search, code formatting, LSP integration
  • Session management with context compaction
  • MCP (Model Context Protocol) server integration
  • Plugin system for custom tools and agents
  • Workspace-aware file operations

Starting

# Pull the image
harbor pull opencode

# Start the service and open its WebUI in the browser
harbor up opencode --open

See harbor pull and harbor up for more options.

  • Harbor will connect opencode to ollama, llamacpp, vllm, and other internal inference engines automatically
  • Harbor will automatically discover available models and pre-populate them for OpenCode

Usage

Server API

Access the OpenAPI documentation at http://localhost:34711/doc.

The server provides REST and WebSocket endpoints for:

  • Session management
  • Message streaming
  • Agent execution
  • Tool invocation
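As a quick smoke test, the OpenAPI spec can be fetched with curl (a sketch; only the /doc path from the defaults above is assumed):

```shell
# Base URL of the OpenCode server (Harbor's default host port)
OPENCODE_URL="http://localhost:34711"

# Fetch the OpenAPI specification; prints a fallback message
# when the service is not running
curl -sf "$OPENCODE_URL/doc" || echo "OpenCode is not reachable at $OPENCODE_URL"
```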

CLI Subcommand

Harbor provides an opencode subcommand to manage workspaces:

# Add a workspace (updates HARBOR_OPENCODE_WORKSPACES)
harbor opencode workspaces add ~/projects/my-app

# Remove a workspace
harbor opencode workspaces rm ~/projects/my-app

# List configured workspaces
harbor opencode workspaces ls

Basic Authentication

Optional basic auth can be configured for the server:

harbor config set opencode.username "myuser"
harbor config set opencode.password "mypassword"

After setting credentials, restart the service for changes to take effect. See harbor config and harbor restart for details.
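With credentials set, API requests must carry HTTP basic auth. A sketch using the placeholder credentials from above:

```shell
# Same placeholder credentials as configured via harbor config
OC_USER="myuser"
OC_PASS="mypassword"

# curl builds the Authorization header from -u; fails fast if the
# service is down or the credentials are rejected
curl -sf -u "$OC_USER:$OC_PASS" "http://localhost:34711/doc" || echo "request failed"

# Equivalent raw header value is "Basic " + base64("user:password"):
printf '%s' "$OC_USER:$OC_PASS" | base64
# → bXl1c2VyOm15cGFzc3dvcmQ=
```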

Configuration

The following options can be set via harbor config:

# Main server port
HARBOR_OPENCODE_HOST_PORT          34711

# Persistent storage
HARBOR_OPENCODE_DATA               ./opencode/data
HARBOR_OPENCODE_CONFIG             ./opencode/config

# Basic authentication (optional)
HARBOR_OPENCODE_USERNAME           ""
HARBOR_OPENCODE_PASSWORD           ""

# Workspace mounts (semicolon-separated paths)
HARBOR_OPENCODE_WORKSPACES         ""

Volumes

  • HARBOR_OPENCODE_DATA (./opencode/data) → /root/.local/share/opencode - Session data, logs, and auth tokens
  • HARBOR_OPENCODE_CONFIG (./opencode/config) → /root/.config/opencode - Configuration files
  • Workspace directories (configured via HARBOR_OPENCODE_WORKSPACES) → /root/<name> - Project files for coding tasks

Workspace Management

OpenCode supports mounting multiple workspace directories for file operations.

Manual Configuration

Edit the HARBOR_OPENCODE_WORKSPACES variable in your profile using harbor config:

# Semicolon-separated list of paths
harbor config set opencode.workspaces "~/projects/app1;~/projects/app2;~/code/lib"

Workspace paths can be:

  • Absolute paths: /home/user/projects/my-app
  • Relative paths: ./projects/my-app (relative to Harbor home)
  • Home-relative: ~/projects/my-app

Each workspace is mounted as /root/<basename> in the container.
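The mount point is derived from the path's basename, which plain shell can reproduce:

```shell
# A workspace path as it would be configured on the host
ws=~/projects/my-app

# Harbor mounts it inside the container at /root/<basename>
echo "/root/$(basename "$ws")"
# → /root/my-app
```

A consequence of this mapping is that two workspaces sharing a basename (e.g. `~/a/app` and `~/b/app`) would target the same mount point, so keep basenames unique.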

Using the CLI Subcommand

The harbor opencode workspaces subcommand (see CLI Subcommand above) provides the same add, rm, and ls operations without editing the variable directly.

Backend Integration

OpenCode supports automatic model discovery from Harbor inference backends. When starting OpenCode alongside compatible backends, the service automatically detects available models and configures them for use.

Supported Backends

The following Harbor backends are supported for auto-discovery:

  • ollama - Ollama inference server
  • llamacpp - llama.cpp server
  • vllm - vLLM inference engine
  • tabbyapi - TabbyAPI (ExLlamaV2)
  • mistralrs - mistral.rs inference
  • sglang - SGLang inference
  • lmdeploy - LMDeploy inference

Model Auto-Discovery

When OpenCode starts with one or more compatible backends, the entrypoint script:

  1. Waits for each backend to become available (30-second timeout per backend)
  2. Queries the /v1/models endpoint on each backend
  3. Generates an opencode.json configuration with the discovered providers and models
  4. Registers each backend as "BackendName (Harbor)" in the model selector
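The query step above can be sketched in shell (illustrative only; the real logic lives in the container entrypoint, and the backend address below is a hypothetical service name that resolves only inside the Harbor network):

```shell
# Example backend address; inside the Harbor network, backends are
# reached by service name (here: an assumed ollama endpoint)
BACKEND_URL="http://ollama:11434"

# Query the backend's OpenAI-compatible model list, falling back to an
# empty list when the backend is unreachable (curl gives up after 5s)
models=$(curl -sf -m 5 "$BACKEND_URL/v1/models" || echo '{"data":[]}')
echo "$models"
```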

This happens at container startup. Models added to backends after startup require a restart:

harbor restart opencode

Usage

Start OpenCode with one or more backends:

# With Ollama
harbor up ollama opencode

# With llama.cpp
harbor up llamacpp opencode

# With multiple backends
harbor up ollama vllm opencode

See harbor up for more startup options.

Manual Configuration

If auto-discovery is not desired, or for additional providers (Anthropic, OpenAI, etc.), configure models manually in opencode.json located in HARBOR_OPENCODE_CONFIG.
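A minimal sketch of such a provider entry (field names follow the upstream opencode.json schema; the provider id, display names, and baseURL here are illustrative, so verify against the upstream OpenCode documentation):

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "local-llamacpp": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Local llama.cpp",
      "options": {
        "baseURL": "http://llamacpp:8080/v1"
      },
      "models": {
        "local-model": {
          "name": "Local model"
        }
      }
    }
  }
}
```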

Configure via environment variables:

  • OPENCODE_AUTO_SHARE - Auto-share sessions
  • OPENCODE_ENABLE_EXPERIMENTAL_MODELS - Enable experimental model support

Environment Variables

OpenCode supports extensive configuration via environment variables:

# Configuration paths
OPENCODE_CONFIG                    # Path to custom opencode.json
OPENCODE_CONFIG_DIR                # Custom config directory
OPENCODE_CONFIG_CONTENT            # Inline JSON configuration

# Feature flags
OPENCODE_AUTO_SHARE                # Auto-share sessions
OPENCODE_DISABLE_AUTOUPDATE        # Disable update checks
OPENCODE_DISABLE_PRUNE             # Disable data pruning
OPENCODE_DISABLE_DEFAULT_PLUGINS   # Disable default plugins
OPENCODE_DISABLE_LSP_DOWNLOAD      # Disable LSP server downloads
OPENCODE_ENABLE_EXPERIMENTAL_MODELS # Enable experimental models
OPENCODE_DISABLE_AUTOCOMPACT       # Disable context compaction
OPENCODE_ENABLE_EXA                # Enable Exa web search

# Authentication
OPENCODE_SERVER_USERNAME           # Server basic auth username
OPENCODE_SERVER_PASSWORD           # Server basic auth password

# Permissions (inline JSON)
OPENCODE_PERMISSION                # Custom permission configuration

Set these via the service's override.env file:

# Edit opencode/override.env
OPENCODE_ENABLE_EXPERIMENTAL_MODELS=true
OPENCODE_AUTO_SHARE=true

Notes

  • Server listens on 0.0.0.0:4096 inside the container (mapped to host port 34711)
  • Session data and auth tokens are persisted in HARBOR_OPENCODE_DATA
  • Configuration files are persisted in HARBOR_OPENCODE_CONFIG
  • OpenAPI documentation available at http://localhost:34711/doc
  • WebSocket endpoint for message streaming: ws://localhost:34711/api/sessions/{session_id}/stream
  • Requires workspace mounts for file operations (use harbor opencode workspaces add)

Troubleshooting

See the Harbor Troubleshooting Guide for general help.

Common Issues

Service won't start:

  • Check the service logs: harbor logs opencode
  • Verify host port 34711 is not already in use

Can't access files in workspace:

  • Ensure workspace is added via harbor opencode workspaces add
  • Verify workspace path exists and is readable
  • Check volume mounts: docker inspect harbor.opencode

TUI won't attach:

  • Verify the server is running and reachable at http://localhost:34711
  • Check the service logs for errors: harbor logs opencode

Authentication not working:

  • Confirm the configured credentials: harbor config get opencode.username
  • Restart the service after changing credentials: harbor restart opencode
