# Interface: LLM<AdditionalChatOptions, AdditionalMessageOptions>

Unified language model interface.

## Type parameters

| Name | Type |
| --- | --- |
| `AdditionalChatOptions` | extends `object` = `object` |
| `AdditionalMessageOptions` | extends `object` = `object` |
## Hierarchy

- `LLMChat<AdditionalChatOptions>`

  ↳ **`LLM`**

## Implemented by
## Properties

### metadata

• **metadata**: `LLMMetadata`

#### Defined in

packages/core/src/llm/types.ts:49
## Methods

### chat

▸ **chat**(`params`): `Promise<AsyncIterable<ChatResponseChunk<object>>>`

Get a chat response from the LLM.

#### Parameters

| Name | Type |
| --- | --- |
| `params` | `LLMChatParamsStreaming<AdditionalChatOptions, AdditionalMessageOptions>` |

#### Returns

`Promise<AsyncIterable<ChatResponseChunk<object>>>`

#### Overrides

LLMChat.chat

#### Defined in

packages/core/src/llm/types.ts:53
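Because the streaming overload resolves to an `AsyncIterable`, callers consume it with `for await`. A minimal sketch of that pattern, using a hypothetical in-memory mock in place of a real model — the type shapes below are simplified from the signatures above, and the echo behaviour is invented for illustration:

```typescript
// Simplified stand-ins for the documented types (not the real declarations).
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };
type ChatResponseChunk = { delta: string };
type LLMChatParamsStreaming = { messages: ChatMessage[]; stream: true };

// Hypothetical toy model: streams the last user message back word by word.
const mockLLM = {
  async chat(
    params: LLMChatParamsStreaming,
  ): Promise<AsyncIterable<ChatResponseChunk>> {
    async function* generate() {
      const last = params.messages[params.messages.length - 1];
      for (const word of last.content.split(" ")) {
        yield { delta: word + " " };
      }
    }
    return generate();
  },
};

async function streamChat(): Promise<string> {
  const stream = await mockLLM.chat({
    messages: [{ role: "user", content: "hello world" }],
    stream: true,
  });
  let text = "";
  for await (const chunk of stream) {
    text += chunk.delta; // accumulate deltas as they arrive
  }
  return text.trim();
}

streamChat().then((t) => console.log(t)); // "hello world"
```

The key design point is that the promise resolves as soon as the stream starts, so the caller can render partial output before the full response exists.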
▸ **chat**(`params`): `Promise<ChatResponse<AdditionalMessageOptions>>`

#### Parameters

| Name | Type |
| --- | --- |
| `params` | `LLMChatParamsNonStreaming<AdditionalChatOptions, AdditionalMessageOptions>` |

#### Returns

`Promise<ChatResponse<AdditionalMessageOptions>>`

#### Overrides

LLMChat.chat

#### Defined in

packages/core/src/llm/types.ts:59
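The non-streaming overload instead resolves to a single `ChatResponse` once generation finishes. A sketch with the same kind of hypothetical mock (simplified types, invented echo behaviour — not the real implementation):

```typescript
// Simplified stand-ins for the documented types (not the real declarations).
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };
type ChatResponse = { message: ChatMessage };
type LLMChatParamsNonStreaming = { messages: ChatMessage[] };

// Hypothetical toy model: returns one complete assistant message.
const mockLLM = {
  async chat(params: LLMChatParamsNonStreaming): Promise<ChatResponse> {
    const last = params.messages[params.messages.length - 1];
    return {
      message: { role: "assistant", content: `You said: ${last.content}` },
    };
  },
};

async function runChat(): Promise<string> {
  const response = await mockLLM.chat({
    messages: [{ role: "user", content: "ping" }],
  });
  return response.message.content; // the full reply, available all at once
}

runChat().then((c) => console.log(c)); // "You said: ping"
```

In the real interface, overload resolution between the two `chat` signatures is driven by whether the params object carries the streaming flag, so one method name serves both call styles.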
### complete

▸ **complete**(`params`): `Promise<AsyncIterable<CompletionResponse>>`

Get a prompt completion from the LLM.

#### Parameters

| Name | Type |
| --- | --- |
| `params` | `LLMCompletionParamsStreaming` |

#### Returns

`Promise<AsyncIterable<CompletionResponse>>`

#### Defined in

packages/core/src/llm/types.ts:69
▸ **complete**(`params`): `Promise<CompletionResponse>`

#### Parameters

| Name | Type |
| --- | --- |
| `params` | `LLMCompletionParamsNonStreaming` |

#### Returns

`Promise<CompletionResponse>`