BrowserWire uses a vision-capable LLM to understand what’s on the page as you explore a site. Without a configured provider, discovery will not run.

Supported providers

Provider    Default model               Requires API key
openai      gpt-4o                      Yes
anthropic   claude-sonnet-4-20250514    Yes
gemini      gemini-2.5-flash            Yes
ollama      llama3                      No

Set your provider

You can configure the LLM through environment variables, a .env file, or a config file. Environment variables and values from a .env file take precedence over the config file.
Set these before starting the server:
export BROWSERWIRE_LLM_PROVIDER=openai
export BROWSERWIRE_LLM_API_KEY=sk-...
You can also place them in a .env file in your working directory — BrowserWire loads it automatically on startup.
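Equivalently, a .env file in the working directory would mirror the export example above (same keys, no export keyword; the key value shown is a placeholder):

```shell
BROWSERWIRE_LLM_PROVIDER=openai
BROWSERWIRE_LLM_API_KEY=sk-...
```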

Per-provider setup

OpenAI:
export BROWSERWIRE_LLM_PROVIDER=openai
export BROWSERWIRE_LLM_API_KEY=sk-...
The default model is gpt-4o. You can override it with BROWSERWIRE_LLM_MODEL.
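Only OpenAI is shown above. Based on the provider names and default models in the table, the setup for the remaining providers would look like the following sketch (the API-key placeholders and the Ollama port are illustrative assumptions, not values confirmed by this page):

```shell
# Anthropic (default model: claude-sonnet-4-20250514)
export BROWSERWIRE_LLM_PROVIDER=anthropic
export BROWSERWIRE_LLM_API_KEY=sk-ant-...   # placeholder key

# Gemini (default model: gemini-2.5-flash)
export BROWSERWIRE_LLM_PROVIDER=gemini
export BROWSERWIRE_LLM_API_KEY=...          # placeholder key

# Ollama (default model: llama3), no API key required.
# Ollama typically listens on http://localhost:11434; see the
# base-URL section below if yours runs elsewhere.
export BROWSERWIRE_LLM_PROVIDER=ollama
```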

Override the model

To use a model other than the provider’s default, set BROWSERWIRE_LLM_MODEL:
export BROWSERWIRE_LLM_PROVIDER=openai
export BROWSERWIRE_LLM_API_KEY=sk-...
export BROWSERWIRE_LLM_MODEL=gpt-4o-mini
Or in ~/.browserwire/config.json:
{
  "llmProvider": "openai",
  "llmApiKey": "sk-...",
  "llmModel": "gpt-4o-mini"
}
BrowserWire requires a model with vision capabilities. Using a text-only model will cause discovery to fail.

Use a custom base URL

BROWSERWIRE_LLM_BASE_URL overrides the API endpoint for the selected provider. This is useful for Ollama with a non-default address, or when routing through a proxy:
# Ollama on a different host or port
export BROWSERWIRE_LLM_PROVIDER=ollama
export BROWSERWIRE_LLM_BASE_URL=http://192.168.1.50:11434

# OpenAI-compatible proxy
export BROWSERWIRE_LLM_PROVIDER=openai
export BROWSERWIRE_LLM_API_KEY=sk-...
export BROWSERWIRE_LLM_BASE_URL=https://my-proxy.example.com/v1
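Before starting BrowserWire, you can sanity-check that a custom endpoint is reachable. This is a quick illustrative check using the providers' standard model-listing routes and the placeholder addresses from above; it is not part of BrowserWire itself:

```shell
# Ollama: list installed models
curl http://192.168.1.50:11434/api/tags

# OpenAI-compatible proxy: list available models
curl -H "Authorization: Bearer sk-..." https://my-proxy.example.com/v1/models
```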