Version: 1.x

Interface: AiProviderConfig

Configuration for an AI provider connection.

Intent

Supply connection details to LLM service factories.

Capability

pramanAI.llm

Example

const config: AiProviderConfig = {
  provider: 'azure-openai',
  model: 'gpt-4o',
  endpoint: 'https://my-resource.openai.azure.com/',
  temperature: 0.2,
  maxTokens: 4096,
};
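For an Anthropic connection, the credential goes in anthropicApiKey rather than apiKey, and endpoint is omitted since only Azure OpenAI requires one. A minimal sketch; the interface is restated locally so the snippet compiles standalone, and the key value is a placeholder:

```typescript
// Interface restated from this page so the example is self-contained.
interface AiProviderConfig {
  readonly provider: 'openai' | 'azure-openai' | 'anthropic';
  readonly model: string;
  readonly temperature: number;
  readonly apiKey?: string;
  readonly anthropicApiKey?: string;
  readonly endpoint?: string;
  readonly maxTokens?: number;
}

// Anthropic connection: anthropicApiKey carries the credential;
// endpoint is omitted because it is only needed for Azure OpenAI.
const anthropicConfig: AiProviderConfig = {
  provider: 'anthropic',
  model: 'claude-3-5-sonnet-20241022',
  anthropicApiKey: 'sk-ant-placeholder', // placeholder, not a real key
  temperature: 0.2,
  maxTokens: 4096,
};
```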

Properties

anthropicApiKey?

readonly optional anthropicApiKey?: string

API key for Anthropic.


apiKey?

readonly optional apiKey?: string

API key for OpenAI / Azure OpenAI.


endpoint?

readonly optional endpoint?: string

Base endpoint URL. Required for Azure OpenAI deployments; other providers use their default endpoints.


maxTokens?

readonly optional maxTokens?: number

Maximum number of tokens in the completion. Falls back to the provider's default if omitted.


model

readonly model: string

Target model name (e.g. gpt-4o, claude-3-5-sonnet-20241022).


provider

readonly provider: "openai" | "azure-openai" | "anthropic"

LLM provider identifier.


temperature

readonly temperature: number

Sampling temperature in the range 0.0–2.0. Lower values produce more deterministic output.
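The per-provider constraints above (endpoint required for Azure OpenAI, temperature within 0.0–2.0) can be checked before handing the config to a factory. A hedged sketch: validateProviderConfig is a hypothetical helper, not part of the library, and the anthropicApiKey check is an assumption rather than a documented requirement:

```typescript
// Interface restated from this page so the example is self-contained.
interface AiProviderConfig {
  readonly provider: 'openai' | 'azure-openai' | 'anthropic';
  readonly model: string;
  readonly temperature: number;
  readonly apiKey?: string;
  readonly anthropicApiKey?: string;
  readonly endpoint?: string;
  readonly maxTokens?: number;
}

// Hypothetical helper: returns a list of problems, empty when the
// config satisfies the constraints documented on this page.
function validateProviderConfig(config: AiProviderConfig): string[] {
  const errors: string[] = [];
  if (config.temperature < 0 || config.temperature > 2) {
    errors.push('temperature must be within 0.0–2.0');
  }
  if (config.provider === 'azure-openai' && !config.endpoint) {
    errors.push('endpoint is required for Azure OpenAI deployments');
  }
  // Assumption: Anthropic calls need anthropicApiKey at connection time.
  if (config.provider === 'anthropic' && !config.anthropicApiKey) {
    errors.push('anthropicApiKey is required for the anthropic provider');
  }
  return errors;
}
```

Running the helper on a config that both exceeds the temperature range and omits the Azure endpoint yields two errors; a well-formed OpenAI config yields none.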