SDK setup

Bring aiproxy into the loop for any SDK or automation that talks to LLM providers. These guides focus on first-party libraries so you can observe and govern traffic without rewriting every integration.

aiproxy offers two integration patterns, both shown in the sketch after this list:

  • Forward proxy mode – point HTTP(S)_PROXY at http://localhost:8080 so every TLS call flows through aiproxy
  • Router endpoints – skip proxy plumbing and change the SDK base URL to a specific endpoint such as /vertex, /openai, or /anthropic
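
As a rough illustration of both patterns with the OpenAI Python SDK (the model name and the exact router path shape are assumptions; check your aiproxy deployment for the precise base URL):

```python
import os
from openai import OpenAI

# Pattern 1: forward proxy mode. The SDK's HTTP stack (httpx) honors the
# standard proxy environment variables, so every HTTPS call flows through
# aiproxy at localhost:8080.
os.environ["HTTPS_PROXY"] = "http://localhost:8080"
proxied_client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Pattern 2: router endpoint. Skip the proxy plumbing and point the SDK's
# base URL at aiproxy's /openai route instead.
routed_client = OpenAI(base_url="http://localhost:8080/openai")

# Either client is then used exactly like a direct OpenAI client.
response = routed_client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[{"role": "user", "content": "Hello through aiproxy"}],
)
print(response.choices[0].message.content)
```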

Before you start

  • Run aiproxy locally (examples use http://localhost:8080)
  • Install and trust the aiproxy certificate authority from ~/.aiproxy/aiproxy-ca-cert.pem (see Quick Start → TLS certificate)
  • Note the proxy host/port plus any authentication you configured
  • Keep provider-specific credentials handy (gcloud auth application-default login, OpenAI keys, etc.)
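
If your SDK's HTTP stack does not read the system trust store, you can point it at the aiproxy CA explicitly. A minimal sketch using the standard environment variables honored by httpx- and requests-based clients (set them before constructing any client):

```python
import os
from pathlib import Path

# CA path from Quick Start -> TLS certificate.
ca_cert = str(Path.home() / ".aiproxy" / "aiproxy-ca-cert.pem")

# httpx-based SDKs honor SSL_CERT_FILE; requests-based clients honor
# REQUESTS_CA_BUNDLE.
os.environ["SSL_CERT_FILE"] = ca_cert
os.environ["REQUESTS_CA_BUNDLE"] = ca_cert
```

In forward proxy mode every request terminates at aiproxy, so trusting only its CA is usually enough; if the same process also calls hosts that bypass the proxy, use a combined bundle instead.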

SDK guides

  • Vertex AI SDK – Capture Gemini traffic as well as Anthropic-on-Vertex requests using the official Python clients (a minimal sketch follows this list)
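
As a preview of what that guide covers, here is a minimal sketch that routes Gemini requests through aiproxy in forward proxy mode. The project ID, region, and model name are placeholders, and the REST transport is chosen because gRPC clients handle proxies differently:

```python
import os

# Route all HTTPS traffic through aiproxy before initializing the SDK,
# so the first request already flows through the proxy.
os.environ["HTTPS_PROXY"] = "http://localhost:8080"

import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(
    project="my-project",    # placeholder project ID
    location="us-central1",  # placeholder region
    api_transport="rest",    # REST keeps proxy behavior predictable
)

model = GenerativeModel("gemini-1.5-flash")  # placeholder model
print(model.generate_content("Ping through aiproxy").text)
```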

Share feedback in the repo if a guide is missing; the same proxy primitives apply to most HTTP- or gRPC-based SDKs.
