Developer Documentation

API Reference

OpenAI-compatible endpoints. Bring your own keys. PII-shielded by default.

POST /api/v1/chat/completions

OpenAI-compatible chat completion with streaming. Your messages are PII-shielded before being forwarded to the upstream provider, and PII tokens are restored in the response.

curl -X POST /api/v1/chat/completions \
  -H "Authorization: Bearer YOUR_KEY" \
  -H "X-Upstream-Key: sk-openai-..." \
  -H "Content-Type: application/json" \
  -d '{"model":"gpt-4o","messages":[{"role":"user","content":"Hello"}]}'

POST /api/shield/protect

Shield-only endpoint. Tokenise PII without sending it to an LLM. Returns protected text with [PII:TYPE:N] tokens.

curl -X POST /api/shield/protect \
  -H "Authorization: Bearer YOUR_KEY" \
  -H "Content-Type: application/json" \
  -d '{"text":"My SSN is 123-45-6789"}'
# → {"protected":"My SSN is [PII:SSN:1]","tokens":1}

POST /api/shield/restore

Restore PII tokens to their original values. Requires the same session ID that created the tokens.
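A sketch of a restore call. The JSON field names (`session_id`, `text`) are illustrative assumptions, not confirmed by this reference:

```shell
# Restore [PII:TYPE:N] tokens to their original values.
# Field names "session_id" and "text" are assumptions about the request body.
curl -X POST /api/shield/restore \
  -H "Authorization: Bearer YOUR_KEY" \
  -H "Content-Type: application/json" \
  -d '{"session_id":"SESSION_ID","text":"My SSN is [PII:SSN:1]"}'
```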

WS /ws/v1/stream

Real-time WebSocket streaming for chat completions. Connect with a one-time token from GET /ws-token.
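A connection sketch: fetch a one-time token from GET /ws-token, then open the socket. The shape of the token response, the `token` query parameter, and the use of wscat are all illustrative assumptions:

```shell
# 1. Fetch a one-time connection token (response shape is an assumption).
curl -H "Authorization: Bearer YOUR_KEY" /ws-token

# 2. Connect with the token. The "token" query parameter name is illustrative;
#    wscat is just one convenient WebSocket client.
wscat -c "wss://HOST/ws/v1/stream?token=ONE_TIME_TOKEN"
```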

GET /api/v1/graph/nodes

Query your knowledge graph. Returns nodes with their connections, types, and metadata. Responses are paginated.
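A query sketch; the `limit` and `cursor` pagination parameters shown here are assumptions about how the pagination is exposed:

```shell
# List knowledge-graph nodes one page at a time.
# "limit" and "cursor" parameter names are illustrative assumptions.
curl -H "Authorization: Bearer YOUR_KEY" \
  "/api/v1/graph/nodes?limit=50&cursor=CURSOR"
```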

Authentication

All API requests require a Bearer token in the Authorization header. Your access key is generated when you create a workspace.

For LLM inference, include your upstream provider key in the X-Upstream-Key header, or configure it in your workspace settings.