Using Octelium as a scalable, secure AI gateway providing zero trust client-based as well as client-less BeyondCorp access for both HUMAN and WORKLOAD Users (read more here) to any AI LLM provider is usually seamless. This is a simple example where you can have a Gemini API Service that is publicly exposed (read more about the public client-less BeyondCorp mode here).
First, we need to create a Secret for the Gemini API key as follows:
octeliumctl create secret gemini-api-key
Now we create the Service for the Gemini API as follows:
```yaml
kind: Service
metadata:
  name: gemini
spec:
  mode: HTTP
  isPublic: true
  config:
    upstream:
      url: https://generativelanguage.googleapis.com
    http:
      path:
        addPrefix: /v1beta/openai
      auth:
        bearer:
          fromSecret: gemini-api-key
```
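With `addPrefix`, a request to the Service's public hostname is forwarded to the upstream with the prefix prepended to its path, so a call to `https://gemini.<DOMAIN>/chat/completions` reaches the upstream at `/v1beta/openai/chat/completions`. Here is a minimal TypeScript sketch of that path rewrite (the helper name is ours for illustration; Octelium performs this rewrite internally):

```typescript
// Illustrative helper modeling how `addPrefix` rewrites the request
// path before proxying to the upstream. This is not part of Octelium;
// it only models the path mapping.
function addPrefix(prefix: string, requestPath: string): string {
  // Join the prefix and path without producing a double slash.
  const base = prefix.endsWith("/") ? prefix.slice(0, -1) : prefix;
  return `${base}${requestPath.startsWith("/") ? "" : "/"}${requestPath}`;
}

// A request to the Service path /chat/completions is proxied to the
// Gemini OpenAI-compatible endpoint under /v1beta/openai.
console.log(addPrefix("/v1beta/openai", "/chat/completions"));
// → /v1beta/openai/chat/completions
```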
You can now apply the creation of the Service as follows (read more here):
octeliumctl apply /PATH/TO/SERVICE.YAML
Octelium enables authorized Users (read more about access control here) to access the Service both via the client-based mode and publicly via the client-less BeyondCorp mode (read more here). In this guide, we are going to use the client-less mode to access the Service via the standard OAuth2 client credentials flow, so that your workloads, written in any programming language, can access the Service without having to use any special SDKs or external clients. All you need is to create an OAuth2 Credential as illustrated here. Now, here is an example written in TypeScript:
```typescript
import OpenAI from "openai";

import { OAuth2Client } from "@badgateway/oauth2-client";

async function main() {
  const oauth2Client = new OAuth2Client({
    server: "https://<DOMAIN>/",
    clientId: "spxg-cdyx",
    clientSecret: "AQpAzNmdEcPIfWYR2l2zLjMJm....",
    tokenEndpoint: "/oauth2/token",
    authenticationMethod: "client_secret_post",
  });

  const oauth2Creds = await oauth2Client.clientCredentials();

  const client = new OpenAI({
    apiKey: oauth2Creds.accessToken,
    baseURL: "https://gemini.<DOMAIN>",
  });

  const chatCompletion = await client.chat.completions.create({
    messages: [
      { role: "user", content: "How do I write a Golang HTTP reverse proxy?" },
    ],
    model: "gemini-2.0-flash",
  });

  console.log("Result", chatCompletion);
}

main();
```
You can also route to a certain LLM provider based on the content of the request body (read more about dynamic configuration here). Here is an example:
```yaml
kind: Service
metadata:
  name: total-ai
spec:
  mode: HTTP
  isPublic: true
  dynamicConfig:
    configs:
      - name: gemini
        upstream:
          url: https://generativelanguage.googleapis.com
        http:
          path:
            addPrefix: /v1beta/openai
          auth:
            bearer:
              fromSecret: gemini-api-key
      - name: openai
        upstream:
          url: https://api.openai.com
        http:
          path:
            addPrefix: /v1
          auth:
            bearer:
              fromSecret: openai-api-key
      - name: deepseek
        upstream:
          url: https://api.deepseek.com
        http:
          path:
            addPrefix: /v1
          auth:
            bearer:
              fromSecret: deepseek-api-key
    rules:
      - condition:
          match: ctx.request.http.bodyMap.model == "gpt-4o-mini"
        configName: openai
      - condition:
          match: ctx.request.http.bodyMap.model == "deepseek-chat"
        configName: deepseek
      - condition:
          matchAny: true
        configName: gemini
```
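The rules above are evaluated in order and the first matching condition selects the config, with `matchAny: true` acting as the fallback. To make the decision order explicit, here is the same logic as a plain TypeScript sketch (illustrative only; in the Service, Octelium itself evaluates the CEL conditions against `ctx.request.http.bodyMap`):

```typescript
// Illustrative only: mirrors the `rules` section of the Service above
// as plain code to show the first-match evaluation order.
type ChatBody = { model?: string };

function selectConfig(body: ChatBody): "openai" | "deepseek" | "gemini" {
  if (body.model === "gpt-4o-mini") return "openai"; // first rule
  if (body.model === "deepseek-chat") return "deepseek"; // second rule
  return "gemini"; // matchAny: true fallback
}

console.log(selectConfig({ model: "deepseek-chat" })); // → deepseek
console.log(selectConfig({ model: "gemini-2.0-flash" })); // → gemini
```

A workload then talks to the single `total-ai` Service endpoint and simply sets `model` in the request body; Octelium routes the request to the matching upstream.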
For more complex and dynamic routing rules (e.g. message-based routing), you can use the full power of Open Policy Agent (OPA) (read more here).
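As a concrete illustration of what message-based routing could decide, a policy might inspect the `messages` array rather than the `model` field, for instance sending short prompts to a cheaper provider. The sketch below is plain TypeScript, not OPA Rego or Octelium syntax, and the threshold and config names are invented purely for illustration:

```typescript
// Hypothetical example of the kind of decision a message-based routing
// policy could encode. This is NOT Octelium or OPA syntax; it only
// models the routing logic in plain TypeScript.
type Message = { role: string; content: string };

function routeByMessages(messages: Message[]): string {
  // Find the most recent user message.
  const lastUser = [...messages].reverse().find((m) => m.role === "user");
  // Invented rule: prompts under 200 characters go to a cheaper provider.
  if (lastUser && lastUser.content.length < 200) return "gemini";
  return "openai";
}
```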
Here are a few more features that you might be interested in:
- Routing not just by request paths, but also by header keys and values, and request body content, including JSON (read more here).
- Request/response header manipulation (read more here).
- Cross-Origin Resource Sharing (CORS) (read more here).
- Secret-less access to upstreams and injecting bearer, basic, or custom authentication header credentials (read more here).
- Application layer-aware ABAC access control via policy-as-code using CEL and Open Policy Agent (read more here).