Agents 1.4.0
Edge AI Agents SDK
agents::llms::OpenAILLM Class Reference

Implementation of LLMInterface for OpenAI models.

#include <openai_llm.h>
Public Member Functions

  OpenAILLM (const std::string &api_key="", const std::string &model="gpt-4.1")
      Constructor.
  ~OpenAILLM () override=default
      Destructor.
  std::vector< std::string > getAvailableModels () override
      Get available models from OpenAI.
  void setModel (const std::string &model) override
      Set the model to use.
  std::string getModel () const override
      Get current model.
  void setApiKey (const std::string &api_key) override
      Set API key.
  void setApiBase (const std::string &api_base) override
      Set API base URL (for self-hosted or proxied endpoints).
  void setOptions (const LLMOptions &options) override
      Set options for API calls.
  LLMOptions getOptions () const override
      Get current options.
  LLMResponse chat (const std::string &prompt) override
      Generate completion from a prompt.
  LLMResponse chat (const std::vector< Message > &messages) override
      Generate completion from a list of messages.
  LLMResponse chatWithTools (const std::vector< Message > &messages, const std::vector< std::shared_ptr< Tool > > &tools) override
      Generate completion with available tools.
  void streamChat (const std::vector< Message > &messages, std::function< void(const std::string &, bool)> callback) override
      Stream results with callback.
  AsyncGenerator< std::string > streamChatAsync (const std::vector< Message > &messages, const std::vector< std::shared_ptr< Tool > > &tools) override
      Stream chat with AsyncGenerator.
  virtual Task< LLMResponse > chatAsync (const std::vector< Message > &messages)
      Async chat from a list of messages.
  virtual Task< LLMResponse > chatWithToolsAsync (const std::vector< Message > &messages, const std::vector< std::shared_ptr< Tool > > &tools)
      Async chat with tools.
  virtual AsyncGenerator< std::pair< std::string, ToolCalls > > streamChatAsyncWithTools (const std::vector< Message > &messages, const std::vector< std::shared_ptr< Tool > > &tools)
      Stream chat with tools using AsyncGenerator.
  virtual std::optional< JsonObject > uploadMediaFile (const std::string &local_path, const std::string &mime, const std::string &binary="")
      Provider-optional: upload a local media file to the provider's file storage and return a canonical media envelope (e.g., with fileUri). Default: not supported.
Detailed Description

Implementation of LLMInterface for OpenAI models.
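As a quick orientation, here is a minimal usage sketch. It assumes only what this page shows (the `agents::llms` namespace and `openai_llm.h`); the placeholder key and the API base URL are illustrative values, and the fields of `LLMResponse` are documented on its own page.

```cpp
#include <openai_llm.h>

int main() {
    // Construct with an explicit API key; the model defaults to "gpt-4.1".
    agents::llms::OpenAILLM llm("sk-your-key");

    // Optional configuration before the first call.
    llm.setModel("gpt-4.1");
    llm.setApiBase("https://api.openai.com/v1");  // e.g. a proxied endpoint

    // Single-prompt completion.
    auto response = llm.chat("Say hello in one word.");
    // Inspect `response` (LLMResponse); its fields are documented separately.
    return 0;
}
```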
OpenAILLM()

agents::llms::OpenAILLM::OpenAILLM (const std::string &api_key = "", const std::string &model = "gpt-4.1")

Constructor.

Parameters
    api_key  The API key
    model    The model to use
chat() [1/2]

LLMResponse agents::llms::OpenAILLM::chat (const std::string &prompt)    [override, virtual]

Generate completion from a prompt.

Parameters
    prompt  The prompt

Implements agents::LLMInterface.
chat() [2/2]

LLMResponse agents::llms::OpenAILLM::chat (const std::vector< Message > &messages)    [override, virtual]

Generate completion from a list of messages.

Parameters
    messages  The messages

Implements agents::LLMInterface.
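The message-list overload can be sketched as below, given an existing `OpenAILLM` instance `llm`. The `(role, content)` brace initialization of `Message` is an assumption for illustration; the real `Message` layout is documented on its own page.

```cpp
// Hypothetical (role, content) initialization of Message.
std::vector<agents::Message> messages = {
    {"system", "You are a terse assistant."},
    {"user",   "What is the capital of France?"},
};
auto response = llm.chat(messages);  // returns an LLMResponse
```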
chatAsync()

virtual Task< LLMResponse > agents::LLMInterface::chatAsync (const std::vector< Message > &messages)    [virtual, inherited]

Async chat from a list of messages.

Parameters
    messages  The messages to generate completion from
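The async variant is naturally consumed from a coroutine. The sketch below assumes `Task<T>` is the SDK's awaitable coroutine type and that `Task` and `LLMResponse` live in the `agents` namespace (both assumptions; this page does not show their namespaces).

```cpp
agents::Task<agents::LLMResponse> ask(agents::LLMInterface &llm,
                                      std::vector<agents::Message> messages) {
    // Suspends until the completion arrives instead of blocking a thread.
    auto response = co_await llm.chatAsync(messages);
    co_return response;
}
```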
chatWithTools()

LLMResponse agents::llms::OpenAILLM::chatWithTools (const std::vector< Message > &messages, const std::vector< std::shared_ptr< Tool > > &tools)    [override, virtual]

Generate completion with available tools.

Parameters
    messages  The messages
    tools     The tools

Implements agents::LLMInterface.
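A hedged sketch of a tool-augmented call, given an `OpenAILLM` instance `llm` and a message list. `makeWeatherTool()` is a hypothetical factory — constructing a concrete `Tool` is not covered on this page.

```cpp
std::vector<std::shared_ptr<agents::Tool>> tools = {
    makeWeatherTool(),  // hypothetical helper returning shared_ptr<Tool>
};
auto response = llm.chatWithTools(messages, tools);
// If the model chose to call a tool, the tool-call details are expected
// on the LLMResponse (field names are documented on the LLMResponse page).
```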
chatWithToolsAsync()

virtual Task< LLMResponse > agents::LLMInterface::chatWithToolsAsync (const std::vector< Message > &messages, const std::vector< std::shared_ptr< Tool > > &tools)    [virtual, inherited]

Async chat with tools.

Parameters
    messages  The messages to generate completion from
    tools     The tools to use
getAvailableModels()

std::vector< std::string > agents::llms::OpenAILLM::getAvailableModels ()    [override, virtual]

Get available models from OpenAI.

Implements agents::LLMInterface.

getModel()

std::string agents::llms::OpenAILLM::getModel () const    [override, virtual]

Get current model.

Implements agents::LLMInterface.

getOptions()

LLMOptions agents::llms::OpenAILLM::getOptions () const    [override, virtual]

Get current options.

Implements agents::LLMInterface.
setApiBase()

void agents::llms::OpenAILLM::setApiBase (const std::string &api_base)    [override, virtual]

Set API base URL (for self-hosted or proxied endpoints).

Parameters
    api_base  The API base URL

Implements agents::LLMInterface.
setApiKey()

void agents::llms::OpenAILLM::setApiKey (const std::string &api_key)    [override, virtual]

Set API key.

Implements agents::LLMInterface.

setModel()

void agents::llms::OpenAILLM::setModel (const std::string &model)    [override, virtual]

Set the model to use.

Implements agents::LLMInterface.

setOptions()

void agents::llms::OpenAILLM::setOptions (const LLMOptions &options)    [override, virtual]

Set options for API calls.

Implements agents::LLMInterface.
streamChat()

void agents::llms::OpenAILLM::streamChat (const std::vector< Message > &messages, std::function< void(const std::string &, bool)> callback)    [override, virtual]

Stream results with callback.

Parameters
    messages  The messages
    callback  The callback

Implements agents::LLMInterface.
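Streaming with the callback overload can be sketched as follows, given an `OpenAILLM` instance `llm` and a message list. The meaning of the callback's `bool` argument is not spelled out on this page; it is assumed here to be an end-of-stream flag.

```cpp
llm.streamChat(messages, [](const std::string &chunk, bool done) {
    std::cout << chunk << std::flush;  // print each chunk as it arrives
    if (done) std::cout << '\n';       // assumed end-of-stream flag
});
```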
streamChatAsync()

AsyncGenerator< std::string > agents::llms::OpenAILLM::streamChatAsync (const std::vector< Message > &messages, const std::vector< std::shared_ptr< Tool > > &tools)    [override, virtual]

Stream chat with AsyncGenerator.

Parameters
    messages  The messages to generate completion from
    tools     The tools to use

Reimplemented from agents::LLMInterface.
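A consumption sketch for the `AsyncGenerator` overload. The generator's iteration protocol is entirely an assumption here (a `next()` returning an awaitable that yields an empty optional at end of stream); adjust to the SDK's actual `AsyncGenerator` API.

```cpp
agents::Task<void> streamAll(agents::llms::OpenAILLM &llm,
                             std::vector<agents::Message> messages) {
    auto gen = llm.streamChatAsync(messages, /*tools=*/{});
    // Hypothetical iteration protocol: co_await next() until exhausted.
    while (auto chunk = co_await gen.next()) {
        std::cout << *chunk << std::flush;
    }
}
```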
streamChatAsyncWithTools()

virtual AsyncGenerator< std::pair< std::string, ToolCalls > > agents::LLMInterface::streamChatAsyncWithTools (const std::vector< Message > &messages, const std::vector< std::shared_ptr< Tool > > &tools)    [virtual, inherited]

Stream chat with tools using AsyncGenerator.

Parameters
    messages  The messages to generate completion from
    tools     The tools to use

Reimplemented in agents::llms::GoogleLLM, and agents::llms::OllamaLLM.
uploadMediaFile()

virtual std::optional< JsonObject > agents::LLMInterface::uploadMediaFile (const std::string &local_path, const std::string &mime, const std::string &binary = "")    [virtual, inherited]

Provider-optional: upload a local media file to the provider's file storage and return a canonical media envelope (e.g., with fileUri). By default this is not supported and the optional is empty.

Parameters
    local_path  Local filesystem path
    mime        The MIME type of the media file
    binary      Optional binary content of the media file

Reimplemented in agents::llms::AnthropicLLM, and agents::llms::GoogleLLM.
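Because `uploadMediaFile` is provider-optional, callers should branch on the returned optional. `OpenAILLM` is not listed among the reimplementations above, so the inherited "not supported" default presumably applies to it; the path and MIME type below are illustrative.

```cpp
auto envelope = llm.uploadMediaFile("/tmp/cat.png", "image/png");
if (envelope) {
    // A provider that supports uploads (per this page: AnthropicLLM,
    // GoogleLLM) returns a canonical media envelope as a JsonObject,
    // e.g. containing a fileUri.
} else {
    // Default path: this provider does not support file uploads.
}
```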