Integration with Apache APISIX
SafeLLM is designed to work seamlessly as a sidecar for Apache APISIX, providing a transparent security layer for all your LLM traffic.
Deep Content Inspection (Lua + resty.http)
The integration uses APISIX's serverless-pre-function plugin with the resty.http Lua library. This allows APISIX to:
- Read the full request body from Nginx memory
- POST it to SafeLLM's `/auth` endpoint as a standard HTTP request
- Block or allow the request based on the response
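The round trip this performs can be sketched in plain Python. `check_with_safellm` is an illustrative helper, not part of SafeLLM; the sidecar URL, five-second timeout, and fail-open flag mirror the values used in the route configuration shown later on this page:

```python
import urllib.request
import urllib.error

SIDECAR_URL = "http://sidecar:8000/auth"  # sidecar address used in the route config

def check_with_safellm(body: bytes, url: str = SIDECAR_URL, fail_open: bool = False) -> bool:
    """POST the raw request body to SafeLLM's /auth endpoint.

    Returns True if the request should be forwarded upstream,
    False if it should be blocked (HTTP 403 from the sidecar).
    """
    req = urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    try:
        with urllib.request.urlopen(req, timeout=5) as res:
            return res.status == 200
    except urllib.error.HTTPError as e:
        if e.code == 403:
            return False   # SafeLLM flagged the prompt
        return fail_open   # other HTTP errors: honor the fail-open policy
    except OSError:
        return fail_open   # sidecar unreachable or timed out
```

Note that a transport failure is distinct from a `403`: only the former is subject to the fail-open policy, so an explicit block is never silently allowed.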
Benefits of This Approach
| Feature | forward-auth | Lua + resty.http |
|---|---|---|
| Access to request body | ❌ No | ✅ Yes |
| Header size limits | ⚠️ Problematic for large prompts | ✅ No limits (body is POST) |
| Fail-open control | Limited | ✅ Granular |
| Body stored in memory | N/A | ✅ Direct Nginx access |
How It Works
```
┌────────┐    ┌──────────────────────────────────────┐    ┌─────┐
│ Client │───▶│ APISIX (Lua serverless-pre-function) │───▶│ LLM │
└────────┘    └──────────────────┬───────────────────┘    └─────┘
                                 │ POST /auth (body)
                                 ▼
                          ┌─────────────┐
                          │   SafeLLM   │
                          │   Sidecar   │
                          └──────┬──────┘
                     ┌───────────┴───────────┐
                     ▼                       ▼
                  200 OK               403 Forbidden
                (Continue)                (Block)
```
Flow Details
- Request Arrival: A client sends an OpenAI-compatible request to APISIX.
- Security Check (Pre-function): The Lua script reads the body and POSTs it to SafeLLM's `/auth` endpoint.
- Validation: SafeLLM processes the request through its Waterfall Pipeline (Cache → Keywords → PII).
- Decision:
  - If safe: the sidecar returns `200 OK` and APISIX forwards the request upstream.
  - If unsafe: the sidecar returns `403 Forbidden` and APISIX blocks the request.
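To make the pipeline ordering concrete, here is a toy sketch of how a waterfall short-circuits: a cache hit skips the later stages, and each stage only runs if the previous one passed. The keyword list and the SSN-shaped regex are made-up placeholders for illustration, not SafeLLM's actual rules:

```python
import re

def waterfall_check(prompt: str, cache: dict) -> str:
    """Toy illustration of the Cache -> Keywords -> PII waterfall.

    Returns "allow" or "block". Only the short-circuit ordering is
    the point; the real SafeLLM stages are more sophisticated.
    """
    # Stage 1: cache -- a previously seen prompt skips the later stages
    if prompt in cache:
        return cache[prompt]

    verdict = "allow"
    # Stage 2: keyword screen (hypothetical blocklist)
    if "ignore previous instructions" in prompt.lower():
        verdict = "block"
    # Stage 3: PII screen (naive SSN-shaped pattern, purely illustrative)
    elif re.search(r"\b\d{3}-\d{2}-\d{4}\b", prompt):
        verdict = "block"

    cache[prompt] = verdict  # remember the verdict for next time
    return verdict
```

The cheap checks run first, so most traffic never reaches the more expensive stages.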
Route Configuration Example
```yaml
routes:
  - uri: /api/*
    plugins:
      serverless-pre-function:
        phase: rewrite
        functions:
          - |
            return function(conf, ctx)
              local core = require("apisix.core")
              local http = require("resty.http")

              -- Configuration
              local FAIL_OPEN = false
              local TIMEOUT_MS = 5000

              -- Read body
              ngx.req.read_body()
              local body = ngx.req.get_body_data()

              if not body or #body == 0 then
                return -- No body to scan
              end

              -- Send to SafeLLM
              local httpc = http.new()
              httpc:set_timeout(TIMEOUT_MS)

              local res, err = httpc:request_uri("http://sidecar:8000/auth", {
                method = "POST",
                body = body,
                headers = {
                  ["Content-Type"] = "application/json"
                }
              })

              -- Handle failures
              if not res then
                if FAIL_OPEN then
                  return -- Allow traffic
                else
                  core.response.exit(503, "Security service unavailable")
                end
              end

              -- Check result
              if res.status == 403 then
                core.response.exit(403, "Blocked by security")
              end
            end
```
Performance Considerations
By using SafeLLM as a sidecar:
- Low Latency: Communication happens over localhost (~0.1ms network overhead).
- Scalability: SafeLLM instances scale horizontally alongside APISIX.
- Security: Prompts are checked before leaving your infrastructure.
Alternative: Proxy Mode
If Lua integration is too complex for your setup, you can run SafeLLM as a reverse proxy in front of APISIX:
```
Client → SafeLLM (proxy) → APISIX → LLM
```
This gives SafeLLM direct access to the body without Lua, but you lose some APISIX benefits (such as early rate limiting).
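A minimal Compose sketch of this topology might look as follows. The image names, ports, and the `UPSTREAM_URL` variable are assumptions for illustration, not documented SafeLLM settings:

```yaml
# Illustrative only: service names, images, ports, and the
# UPSTREAM_URL variable are assumptions, not documented settings.
services:
  safellm:
    image: safellm/safellm:latest         # hypothetical image name
    ports:
      - "8000:8000"                       # clients connect here first
    environment:
      UPSTREAM_URL: "http://apisix:9080"  # hypothetical: forward clean traffic to APISIX

  apisix:
    image: apache/apisix:latest
    # intentionally not published to clients; reachable only via the SafeLLM proxy
```

The key property is that APISIX is not directly reachable from outside, so every prompt passes through SafeLLM before it touches the gateway.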
Reference Deployment
For a ready-to-run APISIX evaluation stack:
- Docs: `/deployments/apisix-reference/`
- Repo folder: `safellm-oss/examples/apisix-reference/`
Long-form companion guides:
- /articles/how-to-run-apisix-reference-with-safellm/
- /articles/why-apisix-for-ai-gateway/