Agents 1.4.0
Edge AI Agents SDK
agents::llms::OpenAILLM Class Reference

Implementation of LLMInterface for OpenAI models.

#include <openai_llm.h>

Inheritance diagram for agents::llms::OpenAILLM:
agents::LLMInterface

Public Member Functions

 OpenAILLM (const std::string &api_key="", const std::string &model="gpt-4.1")
 Constructor.
 ~OpenAILLM () override=default
 Destructor.
std::vector< std::string > getAvailableModels () override
 Get available models from OpenAI.
void setModel (const std::string &model) override
 Set the model to use.
std::string getModel () const override
 Get current model.
void setApiKey (const std::string &api_key) override
 Set API key.
void setApiBase (const std::string &api_base) override
 Set API base URL (for self-hosted or proxied endpoints).
void setOptions (const LLMOptions &options) override
 Set options for API calls.
LLMOptions getOptions () const override
 Get current options.
LLMResponse chat (const std::string &prompt) override
 Generate completion from a prompt.
LLMResponse chat (const std::vector< Message > &messages) override
 Generate completion from a list of messages.
LLMResponse chatWithTools (const std::vector< Message > &messages, const std::vector< std::shared_ptr< Tool > > &tools) override
 Generate completion with available tools.
void streamChat (const std::vector< Message > &messages, std::function< void(const std::string &, bool)> callback) override
 Stream results with callback.
AsyncGenerator< std::string > streamChatAsync (const std::vector< Message > &messages, const std::vector< std::shared_ptr< Tool > > &tools) override
 Stream chat with AsyncGenerator.
virtual Task< LLMResponse > chatAsync (const std::vector< Message > &messages)
 Async chat from a list of messages.
virtual Task< LLMResponse > chatWithToolsAsync (const std::vector< Message > &messages, const std::vector< std::shared_ptr< Tool > > &tools)
 Async chat with tools.
virtual AsyncGenerator< std::pair< std::string, ToolCalls > > streamChatAsyncWithTools (const std::vector< Message > &messages, const std::vector< std::shared_ptr< Tool > > &tools)
 Stream chat with tools using AsyncGenerator.
virtual std::optional< JsonObject > uploadMediaFile (const std::string &local_path, const std::string &mime, const std::string &binary="")
 Provider-optional: Upload a local media file to the provider's file storage and return a canonical media envelope (e.g., with fileUri). Default: not supported.

Detailed Description

Implementation of LLMInterface for OpenAI models.

Constructor & Destructor Documentation

◆ OpenAILLM()

agents::llms::OpenAILLM::OpenAILLM ( const std::string & api_key = "",
const std::string & model = "gpt-4.1" )

Constructor.

Parameters
api_key: The API key
model: The model to use
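A minimal construction sketch (hypothetical usage; reading the key from an `OPENAI_API_KEY` environment variable is a convention of this example, not documented SDK behavior):

```cpp
#include <cstdlib>
#include <string>
#include "openai_llm.h"  // agents::llms::OpenAILLM

int main() {
    // Prefer an environment variable over hard-coding the key.
    const char* key = std::getenv("OPENAI_API_KEY");
    agents::llms::OpenAILLM llm(key ? std::string(key) : "", "gpt-4.1");
    return 0;
}
```

Both arguments are defaulted, so `OpenAILLM llm;` also compiles and can be configured afterwards via setApiKey() and setModel().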

Member Function Documentation

◆ chat() [1/2]

LLMResponse agents::llms::OpenAILLM::chat ( const std::string & prompt)
override, virtual

Generate completion from a prompt.

Parameters
prompt: The prompt
Returns
The completion

Implements agents::LLMInterface.
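A sketch of a single-prompt call. The members of LLMResponse are not documented on this page, so reading the reply text appears only as a comment with an assumed field name:

```cpp
agents::llms::OpenAILLM llm("your-api-key", "gpt-4.1");
auto resp = llm.chat("Explain RAII in one sentence.");
// resp carries the completion; a hypothetical field such as resp.content
// would hold the generated text (field name is an assumption).
```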

◆ chat() [2/2]

LLMResponse agents::llms::OpenAILLM::chat ( const std::vector< Message > & messages)
override, virtual

Generate completion from a list of messages.

Parameters
messages: The messages
Returns
The completion

Implements agents::LLMInterface.
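A multi-turn sketch. The Message construction shown here (role/content pairs) is an assumption for illustration; consult the Message reference for the actual API:

```cpp
std::vector<agents::Message> messages;
// Hypothetical construction; the real Message API may differ:
// messages.push_back(agents::Message("system", "Answer tersely."));
// messages.push_back(agents::Message("user", "What is a coroutine?"));
auto resp = llm.chat(messages);
```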

◆ chatAsync()

virtual Task< LLMResponse > agents::LLMInterface::chatAsync ( const std::vector< Message > & messages)
virtual, inherited

Async chat from a list of messages.

Parameters
messages: The messages to generate completion from
Returns
The LLM response
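A coroutine sketch, assuming Task follows the usual C++20 awaitable pattern (including a `Task<void>` specialization); this is illustration, not a documented guarantee:

```cpp
Task<void> ask(agents::LLMInterface& llm,
               const std::vector<agents::Message>& messages) {
    // Suspends until the provider responds, then resumes with the result.
    auto resp = co_await llm.chatAsync(messages);
    // ... use resp ...
}
```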

◆ chatWithTools()

LLMResponse agents::llms::OpenAILLM::chatWithTools ( const std::vector< Message > & messages,
const std::vector< std::shared_ptr< Tool > > & tools )
override, virtual

Generate completion with available tools.

Parameters
messages: The messages
tools: The tools
Returns
The completion

Implements agents::LLMInterface.
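A tool-calling sketch. Tool construction is not documented on this page, so `weather_tool` stands in for a hypothetical `std::shared_ptr<Tool>`:

```cpp
std::vector<std::shared_ptr<agents::Tool>> tools = { weather_tool };
auto resp = llm.chatWithTools(messages, tools);
// Typical loop: inspect the response for tool-call requests, execute them,
// append the results as messages, and call chatWithTools again.
```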

◆ chatWithToolsAsync()

virtual Task< LLMResponse > agents::LLMInterface::chatWithToolsAsync ( const std::vector< Message > & messages,
const std::vector< std::shared_ptr< Tool > > & tools )
virtual, inherited

Async chat with tools.

Parameters
messages: The messages to generate completion from
tools: The tools to use
Returns
The LLM response

◆ getAvailableModels()

std::vector< std::string > agents::llms::OpenAILLM::getAvailableModels ( )
override, virtual

Get available models from OpenAI.

Returns
The available models

Implements agents::LLMInterface.

◆ getModel()

std::string agents::llms::OpenAILLM::getModel ( ) const
override, virtual

Get current model.

Returns
The current model

Implements agents::LLMInterface.

◆ getOptions()

LLMOptions agents::llms::OpenAILLM::getOptions ( ) const
override, virtual

Get current options.

Returns
The current options

Implements agents::LLMInterface.

◆ setApiBase()

void agents::llms::OpenAILLM::setApiBase ( const std::string & api_base)
override, virtual

Set API base URL (for self-hosted or proxied endpoints).

Parameters
api_base: The API base URL

Implements agents::LLMInterface.
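A sketch for pointing the client at an OpenAI-compatible gateway; the URL and key are placeholders, not real endpoints:

```cpp
agents::llms::OpenAILLM llm;  // defaults from the constructor
llm.setApiBase("https://llm-gateway.example.com/v1");  // hypothetical proxy
llm.setApiKey("gateway-key");                          // placeholder
```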

◆ setApiKey()

void agents::llms::OpenAILLM::setApiKey ( const std::string & api_key)
override, virtual

Set API key.

Parameters
api_key: The API key

Implements agents::LLMInterface.

◆ setModel()

void agents::llms::OpenAILLM::setModel ( const std::string & model)
override, virtual

Set the model to use.

Parameters
model: The model to use

Implements agents::LLMInterface.

◆ setOptions()

void agents::llms::OpenAILLM::setOptions ( const LLMOptions & options)
override, virtual

Set options for API calls.

Parameters
options: The options

Implements agents::LLMInterface.
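A read-modify-write sketch. The members of LLMOptions are not listed on this page, so the tunables appear only as comments with assumed names:

```cpp
agents::LLMOptions opts = llm.getOptions();
// opts.temperature = 0.2;   // assumed field name
// opts.max_tokens  = 512;   // assumed field name
llm.setOptions(opts);
```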

◆ streamChat()

void agents::llms::OpenAILLM::streamChat ( const std::vector< Message > & messages,
std::function< void(const std::string &, bool)> callback )
override, virtual

Stream results with callback.

Parameters
messages: The messages
callback: The callback

Implements agents::LLMInterface.
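A streaming sketch. The signature shows the callback receiving each text chunk plus a bool; treating that bool as an end-of-stream flag is an assumption of this example:

```cpp
llm.streamChat(messages, [](const std::string& chunk, bool done) {
    std::cout << chunk << std::flush;  // print tokens as they arrive
    if (done) std::cout << std::endl;  // `done` as final-chunk flag: assumed
});
```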

◆ streamChatAsync()

AsyncGenerator< std::string > agents::llms::OpenAILLM::streamChatAsync ( const std::vector< Message > & messages,
const std::vector< std::shared_ptr< Tool > > & tools )
override, virtual

Stream chat with AsyncGenerator.

Parameters
messages: The messages to generate completion from
tools: The tools to use
Returns
The AsyncGenerator of response chunks

Reimplemented from agents::LLMInterface.
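A consumption sketch. The AsyncGenerator interface is documented separately; assuming it exposes a co_await-able step such as `next()` returning an optional chunk (an assumption, not a documented contract). Note the tools vector is required even for the no-tool case, where an empty vector can be passed:

```cpp
Task<void> stream(agents::llms::OpenAILLM& llm,
                  const std::vector<agents::Message>& messages) {
    auto gen = llm.streamChatAsync(messages, {});  // no tools
    // next() returning an optional chunk is an assumed interface.
    while (auto chunk = co_await gen.next()) {
        std::cout << *chunk << std::flush;
    }
}
```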

◆ streamChatAsyncWithTools()

virtual AsyncGenerator< std::pair< std::string, ToolCalls > > agents::LLMInterface::streamChatAsyncWithTools ( const std::vector< Message > & messages,
const std::vector< std::shared_ptr< Tool > > & tools )
virtual, inherited

Stream chat with tools using AsyncGenerator.

Parameters
messages: The messages to generate completion from
tools: The tools to use
Returns
The AsyncGenerator of response chunks and tool calls

Reimplemented in agents::llms::GoogleLLM, and agents::llms::OllamaLLM.

◆ uploadMediaFile()

virtual std::optional< JsonObject > agents::LLMInterface::uploadMediaFile ( const std::string & local_path,
const std::string & mime,
const std::string & binary = "" )
virtual, inherited

Provider-optional: Upload a local media file to the provider's file storage and return a canonical media envelope (e.g., with fileUri). Default: not supported.

Parameters
local_path: Local filesystem path
mime: The MIME type of the media file
binary: Optional binary content of the media file
Returns
Optional envelope; std::nullopt if unsupported

Reimplemented in agents::llms::AnthropicLLM, and agents::llms::GoogleLLM.
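A fallback sketch. OpenAILLM does not override this method (only AnthropicLLM and GoogleLLM do), so the inherited default returns std::nullopt; callers should handle that case:

```cpp
auto envelope = llm.uploadMediaFile("/tmp/photo.png", "image/png");
if (!envelope) {
    // Provider-side file storage unsupported here; fall back to another
    // strategy (e.g., inlining the media content in the message).
}
```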