2.3.68 Satellite OpenCode
Handle: `opencode`
URL: http://localhost:34711

OpenCode is an AI-powered coding assistant that provides a server API, terminal UI, desktop app, and IDE extensions. It supports multiple LLM providers (Anthropic, OpenAI, Google, OpenRouter, etc.), offers advanced agentic workflows with built-in tools (file operations, web search, LSP integration), and enables multi-session management with context awareness.
Key Features:
- Server API with OpenAPI documentation at `/doc`
- Terminal UI (TUI) via `opencode attach`
- Multiple LLM provider support (local via Ollama or cloud)
- Built-in tools: file operations, web search, code formatting, LSP integration
- Session management with context compaction
- MCP (Model Context Protocol) server integration
- Plugin system for custom tools and agents
- Workspace-aware file operations
```bash
# Pull the image
harbor pull opencode

# Start the service and open its WebUI in the browser
harbor up opencode --open
```
See `harbor pull` and `harbor up` for more options.
- Harbor will connect `opencode` to `ollama`, `llamacpp`, `vllm`, and other internal inference engines automatically
- Harbor will automatically discover available models and pre-populate them for `opencode`
Access the OpenAPI documentation at http://localhost:34711/doc.
The server provides REST and WebSocket endpoints for:
- Session management
- Message streaming
- Agent execution
- Tool invocation
Harbor provides an opencode subcommand to manage workspaces:
```bash
# Add a workspace (updates HARBOR_OPENCODE_WORKSPACES)
harbor opencode workspaces add ~/projects/my-app

# Remove a workspace
harbor opencode workspaces rm ~/projects/my-app

# List configured workspaces
harbor opencode workspaces ls
```
Optional basic auth can be configured for the server:
```bash
harbor config set opencode.username "myuser"
harbor config set opencode.password "mypassword"
```
After setting credentials, restart the service for changes to take effect. See `harbor config` and `harbor restart` for details.
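For reference, a client authenticating against the server sends a standard HTTP basic-auth header. The sketch below only shows how that header is built from the example credentials above; it does not contact the server, and tools like `curl -u` do this for you:

```bash
# Build the Authorization header a client would send with basic auth.
# Credentials match the example above.
user="myuser"
pass="mypassword"
token=$(printf '%s:%s' "$user" "$pass" | base64)
echo "Authorization: Basic $token"
```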
The following options can be set via `harbor config`:
```bash
# Main server port
HARBOR_OPENCODE_HOST_PORT   34711

# Persistent storage
HARBOR_OPENCODE_DATA        ./opencode/data
HARBOR_OPENCODE_CONFIG      ./opencode/config

# Basic authentication (optional)
HARBOR_OPENCODE_USERNAME    ""
HARBOR_OPENCODE_PASSWORD    ""

# Workspace mounts (semicolon-separated paths)
HARBOR_OPENCODE_WORKSPACES  ""
```
- `HARBOR_OPENCODE_DATA` (`./opencode/data`) → `/root/.local/share/opencode` - session data, logs, and auth tokens
- `HARBOR_OPENCODE_CONFIG` (`./opencode/config`) → `/root/.config/opencode` - configuration files
- Workspace directories (configured via `HARBOR_OPENCODE_WORKSPACES`) → `/root/<name>` - project files for coding tasks
OpenCode supports mounting multiple workspace directories for file operations.
Edit the `HARBOR_OPENCODE_WORKSPACES` variable in your profile using `harbor config`:
```bash
# Semicolon-separated list of paths
harbor config set opencode.workspaces "~/projects/app1;~/projects/app2;~/code/lib"
```
Workspace paths can be:
- Absolute paths: `/home/user/projects/my-app`
- Relative paths: `./projects/my-app` (relative to Harbor home)
- Home-relative: `~/projects/my-app`
Each workspace is mounted as `/root/<basename>` in the container.
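As a sketch of that mapping (the paths are illustrative, not required), splitting the semicolon-separated list and taking each path's basename mirrors how the mount targets are derived:

```bash
# Example workspace list; paths are illustrative only.
workspaces="~/projects/app1;~/projects/app2;~/code/lib"

# Split on semicolons and print each host path with the container
# mount point derived from its basename, as described above.
IFS=';' read -ra paths <<< "$workspaces"
for p in "${paths[@]}"; do
  echo "$p -> /root/$(basename "$p")"
done
```

Note that two workspaces sharing a basename would map to the same container path, so keep the final path components distinct.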
The harbor opencode workspaces command provides a convenient interface:
```bash
# Add workspace
harbor opencode workspaces add ~/projects/my-app

# Remove workspace
harbor opencode workspaces rm ~/projects/my-app

# List workspaces
harbor opencode workspaces ls
```
OpenCode supports automatic model discovery from Harbor inference backends. When starting OpenCode alongside compatible backends, the service automatically detects available models and configures them for use.
The following Harbor backends are supported for auto-discovery:
- `ollama` - Ollama inference server
- `llamacpp` - llama.cpp server
- `vllm` - vLLM inference engine
- `tabbyapi` - TabbyAPI (ExLlamaV2)
- `mistralrs` - mistral.rs inference
- `sglang` - SGLang inference
- `lmdeploy` - LMDeploy inference
When OpenCode starts with one or more compatible backends, the entrypoint script:
- Waits for each backend to become available (30 second timeout per backend)
- Queries the `/v1/models` endpoint on each backend
- Generates an `opencode.json` configuration with discovered providers and models

The backends appear as "BackendName (Harbor)" in the model selector.
This happens at container startup. Models added to backends after startup require a restart:
```bash
harbor restart opencode
```
Start OpenCode with one or more backends:
```bash
# With Ollama
harbor up ollama opencode

# With llama.cpp
harbor up llamacpp opencode

# With multiple backends
harbor up ollama vllm opencode
```
See `harbor up` for more startup options.
If auto-discovery is not desired, or for additional providers (Anthropic, OpenAI, etc.), configure models manually in `opencode.json` located in `HARBOR_OPENCODE_CONFIG`.
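As an illustration only, a manual provider entry might look like the sketch below. The field names and the model ID are assumptions about OpenCode's configuration layout, so verify them against the current schema before relying on this:

```json
{
  "provider": {
    "anthropic": {
      "models": {
        "claude-sonnet-4": {}
      }
    }
  }
}
```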
Configure via environment variables:
- `OPENCODE_AUTO_SHARE` - Auto-share sessions
- `OPENCODE_ENABLE_EXPERIMENTAL_MODELS` - Enable experimental model support
OpenCode supports extensive configuration via environment variables:
```bash
# Configuration paths
OPENCODE_CONFIG                     # Path to custom opencode.json
OPENCODE_CONFIG_DIR                 # Custom config directory
OPENCODE_CONFIG_CONTENT             # Inline JSON configuration

# Feature flags
OPENCODE_AUTO_SHARE                 # Auto-share sessions
OPENCODE_DISABLE_AUTOUPDATE         # Disable update checks
OPENCODE_DISABLE_PRUNE              # Disable data pruning
OPENCODE_DISABLE_DEFAULT_PLUGINS    # Disable default plugins
OPENCODE_DISABLE_LSP_DOWNLOAD       # Disable LSP server downloads
OPENCODE_ENABLE_EXPERIMENTAL_MODELS # Enable experimental models
OPENCODE_DISABLE_AUTOCOMPACT        # Disable context compaction
OPENCODE_ENABLE_EXA                 # Enable Exa web search

# Authentication
OPENCODE_SERVER_USERNAME            # Server basic auth username
OPENCODE_SERVER_PASSWORD            # Server basic auth password

# Permissions (inline JSON)
OPENCODE_PERMISSION                 # Custom permission configuration
```
Set these via the service's `override.env` file:
```bash
# Edit opencode/override.env
OPENCODE_ENABLE_EXPERIMENTAL_MODELS=true
OPENCODE_AUTO_SHARE=true
```
- Server listens on `0.0.0.0:4096` inside the container (mapped to host port 34711)
- Session data and auth tokens are persisted in `HARBOR_OPENCODE_DATA`
- Configuration files are persisted in `HARBOR_OPENCODE_CONFIG`
- OpenAPI documentation available at http://localhost:34711/doc
- WebSocket endpoint for message streaming: `ws://localhost:34711/api/sessions/{session_id}/stream`
- Requires workspace mounts for file operations (use `harbor opencode workspaces add`)
See the Harbor Troubleshooting Guide for general help.
Service won't start:
- Check logs: `harbor logs opencode`
- Verify port 34711 is not in use: `lsof -i :34711`
Can't access files in workspace:
- Ensure workspace is added via `harbor opencode workspaces add`
- Verify workspace path exists and is readable
- Check volume mounts: `docker inspect harbor.opencode`
TUI won't attach:
- Ensure server is running: `harbor ps opencode`
- Check container logs for errors: `harbor logs opencode`
Authentication not working:
- Verify credentials are set: `harbor config get opencode.username`
- Restart service after changing credentials: `harbor restart opencode`