Interface: LLM

Unified language model interface

Implemented by

Properties

hasStreaming

hasStreaming: boolean

Defined in

packages/core/src/llm/LLM.ts:68


metadata

metadata: LLMMetadata

Defined in

packages/core/src/llm/LLM.ts:66
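
Together, `metadata` and `hasStreaming` let callers adapt to a backend's capabilities before issuing a request. A minimal sketch (the `ToyLLM` class and the `model` field on `LLMMetadata` are illustrative assumptions, not part of the documented interface):

```typescript
// Assumed shape for illustration; the real LLMMetadata is defined in LLM.ts.
interface LLMMetadata {
  model: string;
}

// Hypothetical implementation used only to demonstrate the two properties.
class ToyLLM {
  hasStreaming: boolean = false;
  metadata: LLMMetadata = { model: "toy" };
}

const llm = new ToyLLM();
// Guard streaming requests on the capability flag rather than assuming support.
const streaming = llm.hasStreaming ? true : undefined;
console.log(`model=${llm.metadata.model} streaming=${streaming}`);
```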

Methods

chat

chat<T, R>(messages, parentEvent?, streaming?): Promise<R>

Get a chat response from the LLM

Type parameters

| Name | Type |
| :--- | :--- |
| `T` | extends `undefined` \| `boolean` = `undefined` |
| `R` | `T` extends `true` ? `AsyncGenerator<string, void, unknown>` : `ChatResponse` |

Parameters

| Name | Type | Description |
| :--- | :--- | :--- |
| `messages` | `ChatMessage[]` | - |
| `parentEvent?` | `Event` | - |
| `streaming?` | `T` | The return type of `chat()` and `complete()` is determined by this parameter: when set to `true`, an `AsyncGenerator` is returned instead of a `ChatResponse`. |

Returns

Promise<R>

Defined in

packages/core/src/llm/LLM.ts:75
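
The conditional return type above means a single `chat()` call resolves to either a full `ChatResponse` or an `AsyncGenerator` of string deltas, depending on `streaming`. A self-contained sketch, assuming stand-in shapes for `ChatMessage`, `ChatResponse`, `Event`, and an `EchoLLM` implementation (none of these bodies are from the library):

```typescript
// Illustrative stand-ins; the real types live in packages/core/src/llm.
type Event = { id: string };
interface ChatMessage { role: string; content: string; }
interface ChatResponse { message: ChatMessage; }

class EchoLLM {
  // streaming === true resolves to an AsyncGenerator of string deltas;
  // otherwise a full ChatResponse is returned.
  async chat(
    messages: ChatMessage[],
    parentEvent?: Event,
    streaming?: boolean,
  ): Promise<ChatResponse | AsyncGenerator<string, void, unknown>> {
    const text = messages[messages.length - 1].content;
    if (streaming) {
      async function* deltas(): AsyncGenerator<string, void, unknown> {
        for (const word of text.split(" ")) yield word;
      }
      return deltas();
    }
    return { message: { role: "assistant", content: text } };
  }
}

async function demo(): Promise<void> {
  const llm = new EchoLLM();

  // Non-streaming call: R is ChatResponse.
  const res = (await llm.chat([{ role: "user", content: "hello" }])) as ChatResponse;
  console.log(res.message.content);

  // Streaming call: R is AsyncGenerator<string, void, unknown>.
  const stream = (await llm.chat(
    [{ role: "user", content: "hello world" }],
    undefined,
    true,
  )) as AsyncGenerator<string, void, unknown>;
  for await (const delta of stream) console.log(delta);
}
demo();
```

In real code, narrowing the union via the generic parameter (rather than casts) is what the `T`/`R` type parameters provide.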


complete

complete<T, R>(prompt, parentEvent?, streaming?): Promise<R>

Get a prompt completion from the LLM

Type parameters

| Name | Type |
| :--- | :--- |
| `T` | extends `undefined` \| `boolean` = `undefined` |
| `R` | `T` extends `true` ? `AsyncGenerator<string, void, unknown>` : `ChatResponse` |

Parameters

| Name | Type | Description |
| :--- | :--- | :--- |
| `prompt` | `string` | The prompt to complete. |
| `parentEvent?` | `Event` | - |
| `streaming?` | `T` | When set to `true`, an `AsyncGenerator` is returned instead of a `ChatResponse`. |

Returns

Promise<R>

Defined in

packages/core/src/llm/LLM.ts:88
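
`complete()` is the single-prompt counterpart of `chat()`. A common pattern, shown here as an assumption rather than the library's actual implementation, is to wrap the prompt in a user message and delegate to `chat()`:

```typescript
// Stand-in types for illustration only.
interface ChatMessage { role: string; content: string; }
interface ChatResponse { message: ChatMessage; }

class ToyLLM {
  async chat(messages: ChatMessage[]): Promise<ChatResponse> {
    // Hypothetical backend call: echo the first message back.
    return { message: { role: "assistant", content: `echo: ${messages[0].content}` } };
  }

  // Sketch: complete() delegating to chat() by wrapping the prompt as a
  // single user message. Real implementations may differ.
  async complete(prompt: string): Promise<ChatResponse> {
    return this.chat([{ role: "user", content: prompt }]);
  }
}

new ToyLLM().complete("hi").then((r) => console.log(r.message.content)); // "echo: hi"
```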


tokens

tokens(messages): number

Calculates the number of tokens needed for the given chat messages

Parameters

| Name | Type |
| :--- | :--- |
| `messages` | `ChatMessage[]` |

Returns

number

Defined in

packages/core/src/llm/LLM.ts:100
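
Token counts depend on the model's tokenizer, so real implementations delegate to one (e.g. a BPE tokenizer for OpenAI models). A naive whitespace-based sketch, for illustration only:

```typescript
interface ChatMessage { role: string; content: string; }

// Naive illustration of tokens(): whitespace splitting is a stand-in for a
// real tokenizer and will undercount or overcount for actual models.
function tokens(messages: ChatMessage[]): number {
  return messages.reduce(
    (sum, m) => sum + m.content.split(/\s+/).filter(Boolean).length,
    0,
  );
}

console.log(tokens([{ role: "user", content: "how many tokens is this" }])); // 5 with this naive splitter
```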