Harbangan
A multi-user Rust proxy gateway. Point OpenAI or Anthropic client libraries at Kiro, Anthropic, OpenAI Codex, GitHub Copilot, or custom providers. All providers work in both Proxy-Only Mode and Full Deployment; a single container is enough to get started. Per-user auth, content guardrails, and real-time streaming.
The Name
In Batak Toba culture, the harbangan is the gate of the traditional house — a threshold between the ordered world of family and community, and the open world beyond. In Batak cosmology, the universe is divided into three realms, and every threshold mirrors that cosmic boundary. To cross a harbangan is to move between states of being.
This gateway embodies the same philosophy:
- Cosmic boundary — The harbangan separates the three realms of Batak cosmology. This gateway sits at the boundary between your client code and multiple provider backends (Kiro, Anthropic, OpenAI Codex, Copilot, Custom), translating between OpenAI and Anthropic formats on either side.
- Guardian of social order — The Batak gate enforces Dalihan Na Tolu, the three-pillar kinship system that governs who may enter and how. Harbangan enforces multi-user RBAC: Google SSO, per-user API keys, admin/user roles, and domain allowlisting decide what passes through.
- Ritual transition — Crossing a harbangan signals a shift in status. Requests crossing this gateway undergo their own transformation: format conversion, content guardrails (CEL rules + AWS Bedrock), and provider routing before reaching the other side.
- Openness as moral virtue — In Batak ethics, a gate that is always open signals generosity and communal spirit. This one is open source, and in proxy-only mode, a single container is all you need to open the gate.
Further reading on Batak Toba philosophy: Form and Meaning of Batak Toba House · Dalihan Na Tolu: Vision of Integrity · Batak Cultural Values
How It Works
Harbangan sits between your existing AI client code and multiple provider backends. Send requests in OpenAI or Anthropic format – the gateway translates them on the fly, handles per-user authentication with role-based access control, applies content guardrails, and streams responses back in the format your client expects.
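Concretely, an OpenAI-format request needs nothing but a base URL pointed at the gateway and a per-user API key. A minimal sketch using only the Python standard library; the gateway URL, key, and model name below are placeholders for your deployment, not values shipped with Harbangan:

```python
import json
import urllib.request

GATEWAY_URL = "http://localhost:8080"  # placeholder: your gateway address
API_KEY = "your-per-user-api-key"      # placeholder: issued via the web UI

# Standard OpenAI chat-completions payload; the gateway translates it
# to whichever provider backend the model routes to.
payload = {
    "model": "claude-sonnet",  # any model id the gateway exposes
    "messages": [{"role": "user", "content": "Say hello in Batak Toba."}],
}
request = urllib.request.Request(
    f"{GATEWAY_URL}/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)
# With a running gateway, this returns an OpenAI-style completion:
# response = json.load(urllib.request.urlopen(request))
# print(response["choices"][0]["message"]["content"])
```

Any off-the-shelf OpenAI client library works the same way: set its base URL to the gateway and pass your per-user key as the API key.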
```mermaid
flowchart TD
    subgraph Clients
        OAI["OpenAI Client"]
        ANT["Anthropic Client"]
    end
    subgraph Docker["Docker Compose"]
        subgraph GW["Backend"]
            MW["Middleware\n(CORS, Auth)"]
            GUARD["Guardrails\n(CEL + Bedrock)"]
            CONV["Format Converters"]
            STREAM["Stream Parser"]
        end
    end
    subgraph Providers["Provider Backends"]
        KIRO["Kiro API\n(CodeWhisperer)"]
        ANTHROPIC["Anthropic API"]
        OPENAI["OpenAI API"]
        COPILOT["GitHub Copilot"]
        CUSTOM["Custom Endpoint"]
    end
    subgraph Auth["Authentication"]
        SSO["Google SSO / Password+2FA"]
    end
    OAI --> MW
    ANT --> MW
    MW --> GUARD
    GUARD --> CONV
    CONV --> KIRO
    CONV --> ANTHROPIC
    CONV --> OPENAI
    CONV --> COPILOT
    CONV --> CUSTOM
    KIRO --> STREAM
    ANTHROPIC --> STREAM
    OPENAI --> STREAM
    COPILOT --> STREAM
    CUSTOM --> STREAM
    STREAM --> OAI
    STREAM --> ANT
    GW -.-> SSO
```
Features
OpenAI Compatible
Drop-in replacement for the OpenAI API. Use any OpenAI client library: just point it at the gateway.
Anthropic Compatible
Full support for the Anthropic Messages API, including system prompts, tool use, and content blocks.
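A sketch of the Anthropic Messages shape the gateway accepts, with a top-level system prompt and content blocks. The URL, key, and model are placeholders, and the `x-api-key` header follows the Anthropic convention; check your deployment for the exact auth header it expects:

```python
import json
import urllib.request

GATEWAY_URL = "http://localhost:8080"  # placeholder: your gateway address
API_KEY = "your-per-user-api-key"      # placeholder

# Anthropic Messages API shape: system prompt at the top level,
# message content expressed as typed content blocks.
payload = {
    "model": "claude-sonnet",  # any model id the gateway exposes
    "max_tokens": 256,
    "system": "You are a concise assistant.",
    "messages": [
        {
            "role": "user",
            "content": [{"type": "text", "text": "Name one Batak Toba symbol."}],
        }
    ],
}
request = urllib.request.Request(
    f"{GATEWAY_URL}/v1/messages",
    data=json.dumps(payload).encode("utf-8"),
    headers={"x-api-key": API_KEY, "Content-Type": "application/json"},
)
# With a running gateway:
# response = json.load(urllib.request.urlopen(request))
# print(response["content"][0]["text"])
```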
Multi-Provider
Connect to Kiro (AWS CodeWhisperer), Anthropic, OpenAI Codex, GitHub Copilot, and custom endpoints with per-user credentials and automatic token refresh. All providers work in both deployment modes.
Real-time Streaming
Parses provider-specific binary formats and converts to standard SSE in real time. Supports AWS Event Stream and chunked transfer.
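Whatever binary framing the provider uses, the client sees standard SSE: `data:` lines carrying JSON deltas, terminated by `[DONE]`. A minimal client-side parser for that format; the sample events below are illustrative, not captured gateway output:

```python
import json

def parse_sse(lines):
    """Yield the JSON payload of each `data:` event, stopping at [DONE]."""
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines and SSE comments
        data = line[len("data:"):].strip()
        if data == "[DONE]":
            break
        yield json.loads(data)

# Illustrative OpenAI-style stream chunks (not real gateway output).
sample = [
    'data: {"choices": [{"delta": {"content": "Hor"}}]}',
    '',
    'data: {"choices": [{"delta": {"content": "as!"}}]}',
    '',
    'data: [DONE]',
]
text = "".join(
    chunk["choices"][0]["delta"].get("content", "") for chunk in parse_sse(sample)
)
print(text)  # Horas!
```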
Multi-User RBAC
Google SSO or password + TOTP 2FA for web UI, per-user API keys for programmatic access. Admin and User roles with domain allowlisting.
Extended Thinking
Extracts reasoning blocks from model responses and maps them to native thinking/reasoning content fields.
Content Guardrails
AWS Bedrock-powered content validation with CEL rule engine. Validate input and output with configurable sampling and fail-open design.
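For a feel of what a CEL rule can express, here is a hypothetical sketch; the variable bindings (`request` and its fields) are assumptions for illustration, not the gateway's documented schema:

```cel
// Hypothetical bindings: reject requests that embed key material
// or exceed a message-count budget. Consult the gateway's guardrail
// configuration for the real variable names.
request.messages.all(m, !m.content.contains("BEGIN PRIVATE KEY"))
  && size(request.messages) <= 50
```

With fail-open design, a guardrail backend error lets the request through rather than blocking traffic.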
Quick Start
```bash
# Clone and configure
git clone https://github.com/if414013/harbangan.git
cd harbangan
cp .env.example .env
# Edit .env — set POSTGRES_PASSWORD (and optionally INITIAL_ADMIN_* for password auth)

# Start all services
docker compose up -d --build
```
Then open https://your-domain.com/_ui/ to complete setup via Google SSO.
For proxy-only mode (single container, no database):
```bash
cp .env.proxy.example .env.proxy
# Edit .env.proxy and set PROXY_API_KEY (min 16 chars)
docker compose -f docker-compose.gateway.yml --env-file .env.proxy up -d
```
On first start, the container prints an AWS device code URL — open it in a browser to authenticate with Kiro. Credentials are cached to a Docker volume for automatic restarts.
Documentation
API Endpoints
| Endpoint | Method | Description |
|---|---|---|
| `/v1/chat/completions` | POST | OpenAI-compatible chat completions |
| `/v1/messages` | POST | Anthropic-compatible messages |
| `/v1/models` | GET | List available models |
| `/health` | GET | Health check |
| `/_ui/` | GET | Web dashboard |