Interface: LlmService
Defined in: src/ai/llm-service.ts:63
LLM provider service interface.
Remarks
All methods return AiResponse envelopes; they never throw on API errors.
Only createLlmService() throws (when the provider is not configured).
Intent
A uniform interface over the Azure OpenAI, OpenAI, and Anthropic providers.
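As a rough sketch of the contract this page describes (the shapes below are inferred from the documentation, not copied from src/ai/llm-service.ts; AiResponse and the schema type are illustrative stand-ins for the real types):

```typescript
// Stand-in for the real AiResponse envelope: a validated value or an
// error description, never a thrown exception.
type AiResponse<T> =
  | { ok: true; value: T }
  | { ok: false; error: string };

// Minimal stand-in for zod's ZodType.
interface SchemaLike<T> {
  safeParse(input: unknown): { success: true; data: T } | { success: false };
}

interface LlmService {
  complete(prompt: string, schema: SchemaLike<unknown>): Promise<AiResponse<unknown>>;
  chat(messages: object[], schema: SchemaLike<unknown>): Promise<AiResponse<unknown>>;
  isConfigured(): boolean;
  close(): Promise<void>;
}

// A trivial in-memory implementation illustrating the never-throw contract:
// a validation failure comes back as an error envelope, not an exception.
const stubService: LlmService = {
  async complete(prompt, schema) {
    return stubService.chat([{ role: "user", content: prompt }], schema);
  },
  async chat(_messages, schema) {
    const parsed = schema.safeParse("not JSON"); // canned raw reply
    return parsed.success
      ? { ok: true, value: parsed.data }
      : { ok: false, error: "schema validation failed" };
  },
  isConfigured: () => false,
  close: async () => {},
};
```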
Methods
chat()
chat(messages, schema): Promise<AiResponse<unknown>>
Defined in: src/ai/llm-service.ts:83
Send a multi-turn conversation and receive a structured response.
Parameters
messages (object[]): Ordered list of conversation turns.
schema (ZodType): Zod schema used to validate the JSON response.
Returns
Promise<AiResponse<unknown>>
The validated response on success, or an error envelope on failure.
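A hedged usage sketch of the pattern described above. The chat() function, message shape, and schema here are self-contained stand-ins, not the real pramanAI API; a real provider call replaces the canned reply:

```typescript
// Stand-ins for this sketch only.
type AiResponse<T> = { ok: true; value: T } | { ok: false; error: string };

const AnswerSchema = {
  safeParse(x: unknown) {
    return typeof x === "object" && x !== null && "answer" in x
      ? { success: true as const, data: x as { answer: string } }
      : { success: false as const };
  },
};

// Fake chat(): validates a canned reply against the schema and wraps the
// result in an envelope instead of throwing.
async function chat(
  messages: { role: "system" | "user" | "assistant"; content: string }[],
  schema: typeof AnswerSchema,
): Promise<AiResponse<{ answer: string }>> {
  const raw = { answer: "e4" }; // a real provider call would go here
  const parsed = schema.safeParse(raw);
  return parsed.success
    ? { ok: true, value: parsed.data }
    : { ok: false, error: "schema validation failed" };
}

const reply = await chat(
  [
    { role: "system", content: "You are a chess assistant." },
    { role: "user", content: "Suggest an opening move as JSON." },
  ],
  AnswerSchema,
);
// On success, reply.ok is true and reply.value matches the schema.
```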
close()
close(): Promise<void>
Defined in: src/ai/llm-service.ts:101
Close the LLM connection and release resources.
Returns
Promise<void>
Remarks
Called automatically during pramanAI fixture teardown. Safe to call
multiple times; close() is idempotent.
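A minimal sketch of the idempotent-close contract, using a hypothetical class rather than the real implementation:

```typescript
// Sketch of an idempotent close(): the second and later calls are no-ops.
class FakeLlmService {
  private closed = false;
  closeCount = 0; // instrumentation for this sketch only

  async close(): Promise<void> {
    this.closeCount += 1;
    if (this.closed) return; // already closed: no-op, no error
    this.closed = true;
    // a real implementation would release sockets and abort in-flight
    // requests here
  }
}

const svc = new FakeLlmService();
await svc.close();
await svc.close(); // safe: resources were already released
```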
complete()
complete(prompt, schema): Promise<AiResponse<unknown>>
Defined in: src/ai/llm-service.ts:74
Send a single prompt and receive a structured response.
Parameters
prompt (string): Natural-language instruction for the LLM.
schema (ZodType): Zod schema used to validate the JSON response.
Returns
Promise<AiResponse<unknown>>
The validated response on success, or an error envelope on failure.
Remarks
Internally constructs a single user message and delegates to chat().
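The delegation described above can be sketched as follows. The message shape and placeholder chat() are illustrative, and schema validation is omitted for brevity:

```typescript
type Message = { role: "system" | "user" | "assistant"; content: string };

// Placeholder chat(): echoes the last message instead of calling a provider.
async function chat(messages: Message[]): Promise<string> {
  return messages.at(-1)?.content ?? "";
}

// complete() wraps the prompt in a single user turn and forwards to chat().
async function complete(prompt: string): Promise<string> {
  return chat([{ role: "user", content: prompt }]);
}

const out = await complete("Summarize the test run.");
```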
isConfigured()
isConfigured(): boolean
Defined in: src/ai/llm-service.ts:92
Return whether the AI provider is configured.
Returns
boolean
Remarks
Returns false when config.ai is undefined and never throws.
Use this check to degrade gracefully when AI is not available.
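A sketch of the graceful-degradation pattern this enables. The Config shape and helper are hypothetical stand-ins for the real configuration:

```typescript
// Hypothetical config shape: the ai section may be absent entirely.
interface Config {
  ai?: { apiKey: string };
}

function makeIsConfigured(config: Config): () => boolean {
  // Returns false when config.ai is undefined; never throws.
  return () => config.ai !== undefined;
}

const isConfigured = makeIsConfigured({}); // no ai section configured

// Callers branch on the check instead of catching errors:
const label = isConfigured()
  ? "AI-generated summary"
  : "AI unavailable: falling back to static summary";
```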