feat: add headers field to ModelConfigParam #321

Open
shubh24 wants to merge 1 commit into main from feat/add-headers-to-model-config
Conversation


@shubh24 shubh24 commented Mar 21, 2026

Summary

  • Adds optional headers: Dict[str, str] field to ModelConfigParam in the Python SDK
  • Enables passing custom HTTP headers to the LLM provider on per-operation model configs (act, extract, observe, execute)
  • Aligns the Python SDK with the core TypeScript SDK, which already supports headers on both ClientOptions and ModelConfigObjectSchema

Context

The TypeScript core supports custom headers on LLM requests — ClientOptions in model.ts has headers?: Record<string, string>, and ModelConfigObjectSchema in api.ts has headers: z.record(z.string(), z.string()).optional(). The Python SDK's ModelConfigParam was missing this field, preventing users who route LLM calls through a proxy (via baseURL) from passing required auth headers.
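The shape of the change can be sketched as a TypedDict. This is an illustrative sketch based on the field names in this description, not the SDK's actual source:

```python
# Hedged sketch of ModelConfigParam's shape after this PR
# (field names taken from the PR description, not the real definition).
from typing import Dict, TypedDict

class ModelConfigParam(TypedDict, total=False):
    modelName: str
    apiKey: str
    baseURL: str
    headers: Dict[str, str]  # the optional field this PR adds

# Mirrors the Usage example below: headers ride alongside baseURL/apiKey.
config: ModelConfigParam = {
    "modelName": "openai/gpt-4o",
    "baseURL": "https://my-proxy.example.com/v1",
    "headers": {"X-Auth-Token": "token"},
}
```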

What was tested

  1. baseURL passthrough (Python SDK → SEA server): Verified baseURL + apiKey on ModelConfigParam works end-to-end — act() and extract() both succeeded
  2. headers passthrough (local SEA server): Spun up a local SEA server (packages/server) and tested act() with headers in model config — confirmed headers flow through resolveLlmClient() → getClient() → getAISDKLanguageModel() → createOpenAI(clientOptions), where @ai-sdk/openai natively accepts headers
  3. Code path verification: Traced the full flow from SDK → server route → V3 resolveLlmClient() → LLMProvider.getAISDKLanguageModel() → AI SDK provider creation. The ...rest spread in resolveLlmClient() passes all ClientOptions fields (including headers) directly through to the provider creator
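The `...rest` spread mentioned in step 3 can be illustrated in Python terms. This is a simplified analogue, not the server's actual code; `create_provider` is a stand-in for the AI SDK provider factory:

```python
def create_provider(**client_options):
    # Stand-in for the AI SDK provider factory (e.g. createOpenAI);
    # it simply records whatever options it receives.
    return client_options

def resolve_llm_client(model_config):
    # Python analogue of the TypeScript `...rest` spread in resolveLlmClient():
    # strip the model name, forward every remaining field unchanged —
    # so a headers entry reaches the provider creator untouched.
    rest = {k: v for k, v in model_config.items() if k != "modelName"}
    return create_provider(**rest)

client = resolve_llm_client({
    "modelName": "openai/gpt-4o",
    "baseURL": "https://my-proxy.example.com/v1",
    "headers": {"X-Auth-Token": "secret"},
})
```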

Usage

session.act(
    input="click the login button",
    options={
        "model": {
            "modelName": "openai/gpt-4o",
            "apiKey": "...",
            "baseURL": "https://my-proxy.example.com/v1",
            "headers": {
                "X-Auth-Token": "...",
                "X-Custom-Tag": "my-app"
            }
        }
    }
)

Works on act(), extract(), observe(), and execute() — all param types import ModelConfigParam.

Test plan

  • Verify headers field is accepted without error on act(), extract(), observe(), execute()
  • Verify headers are forwarded to the LLM provider endpoint (observable via proxy logs)
  • Verify backward compatibility — omitting headers continues to work as before
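The second test-plan item ("observable via proxy logs") can be checked without a real proxy by pointing baseURL at a tiny local HTTP sink and inspecting what it receives. A minimal sketch — the raw request below simulates the SDK's outbound call, it does not invoke the SDK itself:

```python
# Hedged sketch: record forwarded headers with a throwaway local HTTP sink.
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

seen_headers = {}

class Sink(BaseHTTPRequestHandler):
    def do_POST(self):
        # Normalize to lowercase: urllib rewrites header-name casing.
        seen_headers.update({k.lower(): v for k, v in self.headers.items()})
        self.send_response(200)
        self.end_headers()

    def log_message(self, *args):  # keep the sketch quiet
        pass

server = HTTPServer(("127.0.0.1", 0), Sink)
threading.Thread(target=server.serve_forever, daemon=True).start()

# In a real run, baseURL would point the SDK at this sink; here we
# simulate the outbound LLM call with the custom headers attached.
req = urllib.request.Request(
    f"http://127.0.0.1:{server.server_port}/v1/chat/completions",
    data=b"{}",
    headers={"X-Auth-Token": "secret", "X-Custom-Tag": "my-app"},
)
urllib.request.urlopen(req)
server.shutdown()
```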

🤖 Generated with Claude Code

Adds optional `headers` field to `ModelConfigParam`, allowing users to
pass custom HTTP headers with every request to their model provider.
This aligns the Python SDK with the core TypeScript SDK which already
supports `headers` on `ClientOptions` and `ModelConfigObjectSchema`.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@shubh24 shubh24 requested a review from miguelg719 March 21, 2026 06:44
@cubic-dev-ai cubic-dev-ai bot left a comment

No issues found across 1 file

Confidence score: 5/5

  • Automated review surfaced no issues in the provided summaries.
  • No files require special attention.