Bring aiproxy into the loop for any SDK or automation that talks to LLM providers. These guides focus on first-party libraries so you can observe and govern traffic without rewriting every integration.
aiproxy offers two integration patterns:
- Forward proxy mode – point `HTTP(S)_PROXY` at `http://localhost:8080` so every TLS call flows through aiproxy (see the first sketch after this list)
- Router endpoints – skip proxy plumbing and change the SDK base URL to a specific endpoint such as `/vertex`, `/openai`, or `/anthropic` (see the second sketch after this list)
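
For forward proxy mode, a minimal sketch with the OpenAI Python SDK is below. It assumes the SDK's HTTP layer (httpx) honors the standard `HTTPS_PROXY` and `SSL_CERT_FILE` environment variables, and that the variables are set before the client is constructed; the model name is a placeholder.

```python
import os

# Route HTTPS traffic through the local aiproxy instance
# (host/port are assumptions; match your own setup).
os.environ["HTTPS_PROXY"] = "http://localhost:8080"
# Trust the aiproxy CA so intercepted TLS passes verification;
# httpx reads SSL_CERT_FILE when trust_env is enabled (the default).
os.environ["SSL_CERT_FILE"] = os.path.expanduser("~/.aiproxy/aiproxy-ca-cert.pem")

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment as usual
response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model
    messages=[{"role": "user", "content": "Hello through aiproxy"}],
)
print(response.choices[0].message.content)
```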
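
For router endpoints, only the base URL changes. A sketch with the same SDK follows; whether the router expects `/openai` or `/openai/v1` as the base path is an assumption, so check the endpoint reference for the exact shape.

```python
from openai import OpenAI

# Talk plain HTTP to aiproxy's router endpoint; aiproxy originates the
# TLS connection to the provider, so no proxy environment variables or
# CA install are needed on the client side.
# The exact base path (/openai vs. /openai/v1) is an assumption.
client = OpenAI(base_url="http://localhost:8080/openai/v1")

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model
    messages=[{"role": "user", "content": "Hello via the router endpoint"}],
)
print(response.choices[0].message.content)
```

Because the client-to-proxy hop stays on localhost over plain HTTP, router endpoints let you skip the certificate-authority setup entirely.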
- Run aiproxy locally (examples use `http://localhost:8080`)
- Install and trust the aiproxy certificate authority from `~/.aiproxy/aiproxy-ca-cert.pem` (see Quick Start → TLS certificate; the smoke test after this list verifies both the proxy path and the CA trust)
- Note the proxy host/port plus any authentication you configured
- Keep provider-specific credentials handy (`gcloud auth application-default login`, OpenAI keys, etc.)
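
A quick way to confirm these prerequisites is a single HTTPS request through the proxy with the CA bundle pinned. The target URL below is only an example; even a 401 response (no API key sent) proves the proxy path and certificate trust work.

```python
import os
import requests

ca_bundle = os.path.expanduser("~/.aiproxy/aiproxy-ca-cert.pem")
proxies = {"https": "http://localhost:8080"}

# Any HTTPS endpoint works as a connectivity check; a clean TLS
# handshake confirms the aiproxy CA is trusted by this process.
resp = requests.get(
    "https://api.openai.com/v1/models",
    proxies=proxies,
    verify=ca_bundle,
    timeout=10,
)
print(resp.status_code)  # 401 without a key is fine: the path works
```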
- Vertex AI SDK – capture Gemini traffic as well as Anthropic-on-Vertex requests using the official Python clients (a proxy-mode sketch follows below)
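
Pending the full guide, here is a minimal forward-proxy sketch for the Vertex AI Python SDK. It assumes REST transport (`api_transport="rest"`), on the expectation that the requests-based REST stack honors `HTTPS_PROXY` and `REQUESTS_CA_BUNDLE`; the default gRPC transport may not respect these settings. Project ID, location, and model name are placeholders.

```python
import os

# Same proxy/CA environment as forward proxy mode above.
os.environ["HTTPS_PROXY"] = "http://localhost:8080"
os.environ["REQUESTS_CA_BUNDLE"] = os.path.expanduser("~/.aiproxy/aiproxy-ca-cert.pem")

import vertexai
from vertexai.generative_models import GenerativeModel

# REST transport so requests flow over HTTPS the proxy can intercept;
# credentials come from `gcloud auth application-default login`.
vertexai.init(
    project="your-gcp-project",  # placeholder
    location="us-central1",      # placeholder
    api_transport="rest",
)

model = GenerativeModel("gemini-1.5-flash")  # placeholder model
print(model.generate_content("Hello through aiproxy").text)
```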
Share feedback in the repo if a guide is missing—the same proxy primitives apply to most HTTP or gRPC-based SDKs.