LLM Proxy


LLM Proxy is Archestra's security layer that sits between AI agents and LLM providers (OpenAI, Anthropic, Google, etc.). It intercepts, analyzes, and modifies LLM requests and responses to enforce security policies, prevent data leakage, and ensure compliance with organizational guidelines.

To use LLM Proxy:

Go to "Profiles" -> click the Connect icon -> follow the connection instructions shown.

External Agent Identification

When multiple applications share the same Profile, you can use the X-Archestra-Agent-Id header to identify which application each request originates from. This allows you to:

  • Reuse a single Profile across multiple applications while maintaining distinct tracking
  • Filter logs by application in the LLM Proxy Logs viewer
  • Segment metrics by application in your observability dashboards (Prometheus, Grafana, etc.)

Usage

Include the header in your LLM requests:

curl -X POST "https://your-archestra-instance/v1/openai/chat/completions" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "X-Archestra-Agent-Id: my-chatbot-prod" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
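If you call the proxy from Python instead of curl, the same request can be assembled with the standard library. The helper below is illustrative, not part of any Archestra SDK; the base URL path and header names are taken from the curl example above.

```python
import json
import urllib.request

# Replace with the URL of your Archestra instance.
ARCHESTRA_BASE = "https://your-archestra-instance/v1/openai"

def archestra_headers(api_key: str, agent_id: str) -> dict:
    """Headers for a proxied request, tagged with an external agent ID."""
    return {
        "Authorization": f"Bearer {api_key}",
        "X-Archestra-Agent-Id": agent_id,
        "Content-Type": "application/json",
    }

body = json.dumps({
    "model": "gpt-4",
    "messages": [{"role": "user", "content": "Hello!"}],
}).encode()

req = urllib.request.Request(
    f"{ARCHESTRA_BASE}/chat/completions",
    data=body,
    headers=archestra_headers("sk-example", "my-chatbot-prod"),
    method="POST",
)
# urllib.request.urlopen(req) would send it; omitted so the sketch stays offline.
```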

The external agent ID will be:

  • Stored with each interaction in the database
  • Displayed in the LLM Proxy Logs table (filterable)
  • Included in Prometheus metrics as the agent_id label
  • Available in the interaction detail page

Example Use Cases

| Scenario | Profile | X-Archestra-Agent-Id |
| --- | --- | --- |
| Multiple environments | customer-support | customer-support-prod, customer-support-staging |
| Multiple applications | shared-assistant | mobile-app, web-app, slack-bot |
| Per-customer tracking | multi-tenant-bot | customer-123, customer-456 |

This approach lets you maintain centralized security policies through Profiles while still having granular visibility into which applications are generating traffic.
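For the per-customer scenario, the agent ID can be derived at request time. A minimal sketch, where the customer-<id> naming scheme comes from the table above and the function name is hypothetical:

```python
def agent_id_for_customer(customer_id: str) -> str:
    # One shared Profile ("multi-tenant-bot"), one agent ID per customer,
    # following the customer-<id> convention from the table above.
    return f"customer-{customer_id}"

# Each request then carries its own header value:
headers = {"X-Archestra-Agent-Id": agent_id_for_customer("123")}
```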

User Identification

You can use the X-Archestra-User-Id header to associate LLM requests with a specific Archestra user. This is particularly useful for:

  • Tracking user activity in the LLM Proxy Logs viewer
  • Identifying which user made a request from the Archestra Chat
  • Auditing and compliance purposes

Usage

Include the header in your LLM requests with the Archestra user's UUID:

curl -X POST "https://your-archestra-instance/v1/openai/chat/completions" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "X-Archestra-User-Id: 123e4567-e89b-12d3-a456-426614174000" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
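Because the header carries the Archestra user's UUID, it can help to validate the value client-side before sending. The check below is a suggestion for callers, not something the proxy itself requires; the function name is hypothetical.

```python
import uuid

def user_id_header(user_id: str) -> dict:
    # Fail fast if the value is not a well-formed UUID.
    uuid.UUID(user_id)  # raises ValueError on malformed input
    return {"X-Archestra-User-Id": user_id}

print(user_id_header("123e4567-e89b-12d3-a456-426614174000"))
```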

Archestra Chat Integration

When using the built-in Archestra Chat, the X-Archestra-User-Id header is automatically included in all requests, allowing you to see which team member initiated each conversation in the logs.