Interface: AiProviderConfig
Defined in: src/ai/types.ts:152
Configuration for an AI provider connection.
Intent
Supply connection details to LLM service factories.
Capability
AI provider selection, model routing.
Example
const config: AiProviderConfig = {
  provider: 'azure-openai',
  model: 'gpt-4o',
  endpoint: 'https://my-resource.openai.azure.com/',
  temperature: 0.2,
  maxTokens: 4096,
};
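The example above targets Azure OpenAI. For the 'anthropic' provider, a minimal sketch might look like the following; the interface is re-declared locally here only to keep the snippet self-contained (the real definition lives in src/ai/types.ts), and the key value is a hypothetical placeholder:

```typescript
// Illustrative re-declaration of the documented interface shape.
interface AiProviderConfig {
  readonly provider: 'openai' | 'azure-openai' | 'anthropic';
  readonly apiKey?: string;
  readonly anthropicApiKey?: string;
  readonly model: string;
  readonly endpoint?: string;
  readonly temperature: number;
  readonly maxTokens?: number;
}

// Anthropic configs use anthropicApiKey rather than apiKey,
// and no endpoint is required.
const anthropicConfig: AiProviderConfig = {
  provider: 'anthropic',
  model: 'claude-3-5-sonnet-20241022',
  anthropicApiKey: '<your-anthropic-api-key>', // hypothetical placeholder
  temperature: 0.2,
};
```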
Properties
anthropicApiKey?
readonly optional anthropicApiKey: string
Defined in: src/ai/types.ts:158
API key for Anthropic.
apiKey?
readonly optional apiKey: string
Defined in: src/ai/types.ts:156
API key for OpenAI / Azure OpenAI.
endpoint?
readonly optional endpoint: string
Defined in: src/ai/types.ts:162
Base endpoint URL (required for Azure OpenAI deployments).
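Since endpoint is optional in the type but required for Azure OpenAI deployments, a factory might fail fast on a misconfigured object. A minimal sketch, assuming only the provider and endpoint fields of the documented shape (the assertEndpoint helper is hypothetical, not part of the library):

```typescript
type Provider = 'openai' | 'azure-openai' | 'anthropic';

// Narrow slice of AiProviderConfig needed for this check (illustrative).
interface ProviderEndpoint {
  readonly provider: Provider;
  readonly endpoint?: string;
}

// Throws when an Azure OpenAI config omits its base endpoint URL.
function assertEndpoint(config: ProviderEndpoint): void {
  if (config.provider === 'azure-openai' && config.endpoint === undefined) {
    throw new Error('Azure OpenAI deployments require an endpoint URL');
  }
}
```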
maxTokens?
readonly optional maxTokens: number
Defined in: src/ai/types.ts:166
Maximum tokens for the completion (provider default if omitted).
model
readonly model: string
Defined in: src/ai/types.ts:160
Target model name (e.g. gpt-4o, claude-3-5-sonnet-20241022).
provider
readonly provider: "openai" | "azure-openai" | "anthropic"
Defined in: src/ai/types.ts:154
LLM provider identifier.
temperature
readonly temperature: number
Defined in: src/ai/types.ts:164
Sampling temperature (0.0–2.0). Lower values produce more deterministic output.
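The documented 0.0–2.0 range can be checked before a config is handed to a factory. A small sketch (the isValidTemperature helper is an assumption for illustration, not part of the library):

```typescript
// Returns true when the value lies in the documented sampling range [0.0, 2.0].
function isValidTemperature(temperature: number): boolean {
  return Number.isFinite(temperature) && temperature >= 0.0 && temperature <= 2.0;
}
```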