Version: 1.x

Interface: AiProviderConfig

Defined in: src/ai/types.ts:152

Configuration for an AI provider connection.

Intent

Supply connection details to LLM service factories.

Capability

Selects the AI provider and routes requests to the target model.

Example

```ts
const config: AiProviderConfig = {
  provider: 'azure-openai',
  model: 'gpt-4o',
  endpoint: 'https://my-resource.openai.azure.com/',
  temperature: 0.2,
  maxTokens: 4096,
};
```

Properties

anthropicApiKey?

readonly optional anthropicApiKey: string

Defined in: src/ai/types.ts:158

API key for Anthropic.


apiKey?

readonly optional apiKey: string

Defined in: src/ai/types.ts:156

API key for OpenAI / Azure OpenAI.


endpoint?

readonly optional endpoint: string

Defined in: src/ai/types.ts:162

Base endpoint URL (required for Azure OpenAI deployments).


maxTokens?

readonly optional maxTokens: number

Defined in: src/ai/types.ts:166

Maximum tokens for the completion (provider default if omitted).


model

readonly model: string

Defined in: src/ai/types.ts:160

Target model name (e.g. gpt-4o, claude-3-5-sonnet-20241022).


provider

readonly provider: "openai" | "azure-openai" | "anthropic"

Defined in: src/ai/types.ts:154

LLM provider identifier.
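Because provider is a closed string union, consumers can switch on it exhaustively, e.g. to pick the matching credential field. The sketch below is a hypothetical helper, not part of the documented API; resolveApiKey and its key-selection rules are assumptions:

```ts
// Hypothetical helper (not part of this library): selects the credential
// that matches the configured provider. Assumes OpenAI and Azure OpenAI
// share apiKey, while Anthropic uses anthropicApiKey, as documented above.
type Provider = 'openai' | 'azure-openai' | 'anthropic';

interface Credentials {
  readonly apiKey?: string;
  readonly anthropicApiKey?: string;
}

function resolveApiKey(provider: Provider, creds: Credentials): string {
  switch (provider) {
    case 'openai':
    case 'azure-openai':
      if (!creds.apiKey) throw new Error(`apiKey is required for ${provider}`);
      return creds.apiKey;
    case 'anthropic':
      if (!creds.anthropicApiKey) throw new Error('anthropicApiKey is required for anthropic');
      return creds.anthropicApiKey;
  }
}
```

The switch has no default branch: if a new provider is added to the union, the compiler flags every routing site that fails to handle it.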


temperature

readonly temperature: number

Defined in: src/ai/types.ts:164

Sampling temperature (0.0–2.0). Lower = more deterministic.
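Since the documented range is 0.0–2.0, callers may want to normalize user-supplied values before building a config. This guard is a hypothetical sketch; whether the library clamps, throws, or forwards out-of-range values is not specified here:

```ts
// Hypothetical guard (an assumption, not documented behaviour):
// clamps a sampling temperature into the documented 0.0–2.0 range.
function clampTemperature(t: number): number {
  if (Number.isNaN(t)) throw new Error('temperature must be a number');
  return Math.min(2.0, Math.max(0.0, t));
}
```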