Agents (ACP + HTTP)
Interoperability without compromise. Lock-in kills trust.
Hash speaks two protocols so you can:
- Run cloud agents (Claude, GPT, Gemini)
- Run local models (Ollama, llama.cpp)
- Switch without changing your workflow
ACP (Agent Client Protocol)
For agents that run as local processes. Uses JSON-RPC 2.0 over stdin/stdout.
~/.config/hash/config.toml
[agent]
command = "claude-code-acp"
command = "claude-code-acp"
Claude Code Setup
To use Anthropic's Claude Code with Hash, first install the Claude CLI using the native installer:
Terminal
curl -fsSL https://claude.ai/install.sh | bash
Then, install the ACP adapter which allows Hash to communicate with Claude:
Terminal
npm install -g @zed-industries/claude-code-acp
Note: Ensure you have authenticated by running claude once, then configure Hash to use claude-code-acp as the agent command.
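Both commands from the steps above can be sanity-checked before editing the config. This loop only inspects PATH and makes no network calls:

```shell
# Check that the Claude CLI and the ACP adapter installed above are both
# on PATH. Only inspects PATH; contacts no service.
for cmd in claude claude-code-acp; do
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "$cmd: installed"
  else
    echo "$cmd: not found"
  fi
done
```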
When to use ACP
- Claude Code, Gemini CLI, or any tool with ACP support
- You want the agent to have full context (files, git, etc.)
- Session persistence matters
Behavior
- Lazy connect: connects on first ?? use
- One session per shell instance
- Closed on shell exit
HTTP transport
For HTTP-based model servers. Works with Ollama, vLLM, or any OpenAI-compatible endpoint.
~/.config/hash/config.toml
[agent]
transport = "http"
url = "http://localhost:11434/api/generate"
model = "codellama:13b"
transport = "http"
url = "http://localhost:11434/api/generate"
model = "codellama:13b"
When to use HTTP
- Ollama or local model servers
- Self-hosted inference
- You want full control over the model
Agent config options
~/.config/hash/config.toml
[agent]
command = "claude-code-acp" # ACP agent command
timeout = "120s" # Max response wait (default: 120s)
# HTTP transport (alternative)
# transport = "http"
# url = "http://localhost:11434/api/generate"
# model = "codellama:13b"
command = "claude-code-acp" # ACP agent command
timeout = "120s" # Max response wait (default: 120s)
# HTTP transport (alternative)
# transport = "http"
# url = "http://localhost:11434/api/generate"
# model = "codellama:13b"
Debug checklist
| Symptom | Check |
|---|---|
| "command not found" | Is command in your PATH? |
| Timeout | Increase timeout, check model is loaded |
| Empty response | Check model name matches server |
| Auth errors | Check API keys are set in environment |
| Slow startup | Normal — ACP connects lazily on first ?? |
Check the agent command works directly in your terminal before configuring it in Hash.
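For the HTTP transport, the "model name matches server" check in the table above can be done against Ollama's model-listing endpoint, /api/tags, assuming a default local install:

```shell
# List the models a local Ollama server knows about via /api/tags, its
# model-listing endpoint. Assumes the default port; fails soft otherwise.
status=$(curl -sf http://localhost:11434/api/tags || echo "unreachable")
echo "$status"
```

If the configured model does not appear in this listing, pull it first or fix the model line in config.toml.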