Class: OpenAI
Hierarchy
- ToolCallLLM<OpenAIAdditionalChatOptions>
  ↳ OpenAI
    ↳ FireworksLLM
    ↳ Groq
    ↳ TogetherLLM
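Because every class in the hierarchy above derives from ToolCallLLM, an OpenAI instance (or one of its subclasses) can be passed wherever that base type is expected. A minimal sketch, assuming the classes and the OpenAIAdditionalChatOptions type are re-exported from the llamaindex package:

```ts
import { OpenAI, ToolCallLLM, OpenAIAdditionalChatOptions } from "llamaindex";

// OpenAI extends ToolCallLLM<OpenAIAdditionalChatOptions>,
// so the assignment type-checks without a cast.
const llm: ToolCallLLM<OpenAIAdditionalChatOptions> = new OpenAI();
```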
Constructors
constructor
• new OpenAI(init?): OpenAI
Parameters
Name | Type |
---|---|
init? | Partial<OpenAI> & { azure?: AzureOpenAIConfig } |
Returns
OpenAI
Overrides
ToolCallLLM<OpenAIAdditionalChatOptions>.constructor
Defined in
packages/core/src/llm/openai.ts:179
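A minimal construction sketch. The import path is an assumption; since init is a Partial<OpenAI>, any of the documented properties below may be supplied, and unspecified fields fall back to the class defaults.

```ts
import { OpenAI } from "llamaindex";

// All field values here are illustrative only.
const llm = new OpenAI({
  model: "gpt-4-turbo",
  temperature: 0.1,
  maxTokens: 512,
  apiKey: process.env.OPENAI_API_KEY, // may also be resolved from the environment (assumption)
});
```

The optional azure field of init takes an AzureOpenAIConfig when targeting an Azure OpenAI deployment.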
Properties
additionalChatOptions
• Optional additionalChatOptions: OpenAIAdditionalChatOptions
Defined in
packages/core/src/llm/openai.ts:167
additionalSessionOptions
• Optional additionalSessionOptions: Omit<Partial<ClientOptions>, "apiKey" | "timeout" | "maxRetries">
Defined in
packages/core/src/llm/openai.ts:174
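These options are presumably forwarded to the underlying OpenAI SDK client; apiKey, timeout and maxRetries are excluded from the type because they are first-class properties of this class. A sketch, assuming the SDK's ClientOptions fields baseURL and organization:

```ts
import { OpenAI } from "llamaindex";

const llm = new OpenAI({
  additionalSessionOptions: {
    baseURL: "https://my-proxy.example.com/v1", // hypothetical proxy endpoint
    organization: "org-xxxxxxxx",               // hypothetical organization id
  },
});
```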
apiKey
• Optional apiKey: string = undefined
Defined in
packages/core/src/llm/openai.ts:170
maxRetries
• maxRetries: number
Defined in
packages/core/src/llm/openai.ts:171
maxTokens
• Optional maxTokens: number
Defined in
packages/core/src/llm/openai.ts:166
model
• model: string
Defined in
packages/core/src/llm/openai.ts:163
session
• session: OpenAISession
Defined in
packages/core/src/llm/openai.ts:173
temperature
• temperature: number
Defined in
packages/core/src/llm/openai.ts:164
timeout
• Optional timeout: number
Defined in
packages/core/src/llm/openai.ts:172
topP
• topP: number
Defined in
packages/core/src/llm/openai.ts:165
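The numeric properties above are public, so they can be set at construction time or adjusted on an existing instance. A sketch with illustrative values only:

```ts
import { OpenAI } from "llamaindex";

const llm = new OpenAI({ model: "gpt-3.5-turbo" });

// Tune sampling and client behaviour after construction.
llm.temperature = 0.7;
llm.topP = 0.95;
llm.maxTokens = 256;
llm.maxRetries = 5;
llm.timeout = 60_000; // assumed to be milliseconds, matching the OpenAI SDK convention
```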
Accessors
metadata
• get metadata(): LLMMetadata
Returns
LLMMetadata
Overrides
ToolCallLLM.metadata
Defined in
packages/core/src/llm/openai.ts:236
supportToolCall
• get supportToolCall(): boolean
Returns
boolean
Overrides
ToolCallLLM.supportToolCall
Defined in
packages/core/src/llm/openai.ts:232
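A short sketch of reading both accessors; the fields available on LLMMetadata beyond what this page lists are assumptions.

```ts
import { OpenAI } from "llamaindex";

const llm = new OpenAI({ model: "gpt-4-turbo" });

if (llm.supportToolCall) {
  // metadata is assumed to expose at least the configured model name.
  console.log(`Tool calling is supported by ${llm.metadata.model}`);
}
```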