Agents 1.4.0
Edge AI Agents SDK
agents::LLMInterface Class Reference [abstract]

Interface for language model providers (OpenAI, Anthropic, Google, Ollama). More...

#include <llm_interface.h>

Inheritance diagram for agents::LLMInterface (derived classes): agents::llms::AnthropicLLM, agents::llms::GoogleLLM, agents::llms::OllamaLLM, agents::llms::OpenAILLM

Public Member Functions

virtual ~LLMInterface ()=default
 Destructor.
virtual std::vector< std::string > getAvailableModels ()=0
 Get available models from this provider.
virtual void setModel (const std::string &model)=0
 Set the model to use.
virtual std::string getModel () const =0
 Get current model.
virtual void setApiKey (const std::string &api_key)=0
 Set API key.
virtual void setApiBase (const std::string &api_base)=0
 Set API base URL (for self-hosted or proxied endpoints).
virtual void setOptions (const LLMOptions &options)=0
 Set options for API calls.
virtual LLMOptions getOptions () const =0
 Get current options.
virtual LLMResponse chat (const std::string &prompt)=0
 Generate completion from a prompt.
virtual LLMResponse chat (const std::vector< Message > &messages)=0
 Generate completion from a list of messages.
virtual LLMResponse chatWithTools (const std::vector< Message > &messages, const std::vector< std::shared_ptr< Tool > > &tools)=0
 Generate completion with available tools.
virtual void streamChat (const std::vector< Message > &messages, std::function< void(const std::string &, bool)> callback)=0
 Stream results with callback.
virtual Task< LLMResponse > chatAsync (const std::vector< Message > &messages)
 Async chat from a list of messages.
virtual Task< LLMResponse > chatWithToolsAsync (const std::vector< Message > &messages, const std::vector< std::shared_ptr< Tool > > &tools)
 Async chat with tools.
virtual AsyncGenerator< std::string > streamChatAsync (const std::vector< Message > &messages, const std::vector< std::shared_ptr< Tool > > &tools)
 Stream chat with AsyncGenerator.
virtual AsyncGenerator< std::pair< std::string, ToolCalls > > streamChatAsyncWithTools (const std::vector< Message > &messages, const std::vector< std::shared_ptr< Tool > > &tools)
 Stream chat with tools using AsyncGenerator.
virtual std::optional< JsonObject > uploadMediaFile (const std::string &local_path, const std::string &mime, const std::string &binary="")
 Provider-optional: Upload a local media file to the provider's file storage and return a canonical media envelope (e.g., with fileUri). Default: not supported.

Detailed Description

Interface for language model providers (OpenAI, Anthropic, Google, Ollama).
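Basic usage is through a concrete provider held behind the interface. The following is a minimal sketch, assuming the SDK headers are on the include path; the OpenAILLM default constructor and the model identifier shown are assumptions, not confirmed by this page.

```cpp
#include <llm_interface.h>
#include <memory>

int main() {
    // Any concrete provider can sit behind the abstract interface.
    std::unique_ptr<agents::LLMInterface> llm =
        std::make_unique<agents::llms::OpenAILLM>();

    llm->setApiKey("sk-...");   // provider credential
    llm->setModel("gpt-4o");    // model identifier is illustrative

    agents::LLMResponse response = llm->chat("Say hello.");
    // The member holding the text depends on LLMResponse's definition.
}
```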

Member Function Documentation

◆ chat() [1/2]

virtual LLMResponse agents::LLMInterface::chat ( const std::string & prompt)
pure virtual

Generate completion from a prompt.

Parameters
prompt: The prompt
Returns
The completion

Implemented in agents::llms::AnthropicLLM, agents::llms::GoogleLLM, agents::llms::OllamaLLM, and agents::llms::OpenAILLM.

◆ chat() [2/2]

virtual LLMResponse agents::LLMInterface::chat ( const std::vector< Message > & messages)
pure virtual

Generate completion from a list of messages.

Parameters
messages: The messages to generate the completion from
Returns
The LLM response

Implemented in agents::llms::AnthropicLLM, agents::llms::GoogleLLM, agents::llms::OllamaLLM, and agents::llms::OpenAILLM.
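The message-list overload can be sketched as follows, assuming `llm` is a pointer to a concrete LLMInterface implementation. The Message aggregate initialization shown (role + content) is an assumption; the actual Message layout is not documented on this page.

```cpp
std::vector<agents::Message> messages;
messages.push_back(agents::Message{"system", "You are a concise assistant."});
messages.push_back(agents::Message{"user", "Summarize RAII in one sentence."});

agents::LLMResponse response = llm->chat(messages);
```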

◆ chatAsync()

virtual Task< LLMResponse > agents::LLMInterface::chatAsync ( const std::vector< Message > & messages)
virtual

Async chat from a list of messages.

Parameters
messages: The messages to generate the completion from
Returns
A Task that resolves to the LLM response
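Since chatAsync returns a Task, it is presumably awaited inside a coroutine. A sketch, assuming Task is an awaitable coroutine type as the signature suggests:

```cpp
agents::Task<agents::LLMResponse> run(
    agents::LLMInterface& llm,
    const std::vector<agents::Message>& messages) {
    // Suspends until the provider responds, without blocking a thread.
    agents::LLMResponse response = co_await llm.chatAsync(messages);
    co_return response;
}
```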

◆ chatWithTools()

virtual LLMResponse agents::LLMInterface::chatWithTools ( const std::vector< Message > & messages,
const std::vector< std::shared_ptr< Tool > > & tools )
pure virtual

Generate completion with available tools.

Parameters
messages: The messages to generate the completion from
tools: The tools to use
Returns
The LLM response

Implemented in agents::llms::AnthropicLLM, agents::llms::GoogleLLM, agents::llms::OllamaLLM, and agents::llms::OpenAILLM.
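Passing tools can be sketched as below, assuming `llm` is a pointer to a concrete implementation. How a concrete Tool is constructed is not documented on this page; `makeWeatherTool()` is a hypothetical factory.

```cpp
std::vector<std::shared_ptr<agents::Tool>> tools;
tools.push_back(makeWeatherTool());  // hypothetical factory

agents::LLMResponse response = llm->chatWithTools(messages, tools);
// If the model requested a tool call, the response is expected to carry
// that information; the exact field depends on LLMResponse's definition.
```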

◆ chatWithToolsAsync()

virtual Task< LLMResponse > agents::LLMInterface::chatWithToolsAsync ( const std::vector< Message > & messages,
const std::vector< std::shared_ptr< Tool > > & tools )
virtual

Async chat with tools.

Parameters
messages: The messages to generate the completion from
tools: The tools to use
Returns
A Task that resolves to the LLM response

◆ getAvailableModels()

virtual std::vector< std::string > agents::LLMInterface::getAvailableModels ( )
pure virtual

Get available models from this provider.

Returns
The available models

Implemented in agents::llms::AnthropicLLM, agents::llms::GoogleLLM, agents::llms::OllamaLLM, and agents::llms::OpenAILLM.

◆ getModel()

virtual std::string agents::LLMInterface::getModel ( ) const
pure virtual

Get current model.

Returns
The current model

Implemented in agents::llms::AnthropicLLM, agents::llms::GoogleLLM, agents::llms::OllamaLLM, and agents::llms::OpenAILLM.

◆ getOptions()

virtual LLMOptions agents::LLMInterface::getOptions ( ) const
pure virtual

Get current options.

Returns
The current options

Implemented in agents::llms::AnthropicLLM, agents::llms::GoogleLLM, agents::llms::OllamaLLM, and agents::llms::OpenAILLM.

◆ setApiBase()

virtual void agents::LLMInterface::setApiBase ( const std::string & api_base)
pure virtual

Set API base URL (for self-hosted or proxied endpoints).

Parameters
api_base: The API base URL to use

Implemented in agents::llms::AnthropicLLM, agents::llms::GoogleLLM, agents::llms::OllamaLLM, and agents::llms::OpenAILLM.

◆ setApiKey()

virtual void agents::LLMInterface::setApiKey ( const std::string & api_key)
pure virtual

Set API key.

Parameters
api_key: The API key to use

Implemented in agents::llms::AnthropicLLM, agents::llms::GoogleLLM, agents::llms::OllamaLLM, and agents::llms::OpenAILLM.

◆ setModel()

virtual void agents::LLMInterface::setModel ( const std::string & model)
pure virtual

Set the model to use.

Parameters
model: The model to use

Implemented in agents::llms::AnthropicLLM, agents::llms::GoogleLLM, agents::llms::OllamaLLM, and agents::llms::OpenAILLM.

◆ setOptions()

virtual void agents::LLMInterface::setOptions ( const LLMOptions & options)
pure virtual

Set options for API calls.

Parameters
options: The options to use

Implemented in agents::llms::AnthropicLLM, agents::llms::GoogleLLM, agents::llms::OllamaLLM, and agents::llms::OpenAILLM.
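A typical configuration sequence combining the setters might look like the following sketch, assuming `llm` is a pointer to a concrete implementation. The concrete LLMOptions fields are not listed on this page; consult llm_interface.h for the actual members.

```cpp
// Point at a self-hosted endpoint, e.g. a local Ollama server.
llm->setApiBase("http://localhost:11434");

// Read, adjust, and write back options rather than constructing blindly.
agents::LLMOptions options = llm->getOptions();
// ... adjust fields such as temperature or max tokens, if present ...
llm->setOptions(options);
```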

◆ streamChat()

virtual void agents::LLMInterface::streamChat ( const std::vector< Message > & messages,
std::function< void(const std::string &, bool)> callback )
pure virtual

Stream results with callback.

Parameters
messages: The messages to generate the completion from
callback: Callback invoked with each response chunk and a flag indicating whether the stream has finished

Implemented in agents::llms::AnthropicLLM, agents::llms::GoogleLLM, agents::llms::OllamaLLM, and agents::llms::OpenAILLM.
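Consuming the stream can be sketched as below, assuming `llm` is a pointer to a concrete implementation. Based on the callback signature, the bool argument is presumed to signal the end of the stream.

```cpp
llm->streamChat(messages,
    [](const std::string& chunk, bool done) {
        std::cout << chunk << std::flush;  // print tokens as they arrive
        if (done) {
            std::cout << std::endl;        // stream finished
        }
    });
```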

◆ streamChatAsync()

virtual AsyncGenerator< std::string > agents::LLMInterface::streamChatAsync ( const std::vector< Message > & messages,
const std::vector< std::shared_ptr< Tool > > & tools )
virtual

Stream chat with AsyncGenerator.

Parameters
messages: The messages to generate the completion from
tools: The tools to use
Returns
The AsyncGenerator of response chunks

Reimplemented in agents::llms::AnthropicLLM, agents::llms::GoogleLLM, agents::llms::OllamaLLM, and agents::llms::OpenAILLM.
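Consuming the AsyncGenerator presumably happens inside a coroutine. A sketch, where the `next()` iteration protocol is a hypothetical API (the actual AsyncGenerator interface is not documented on this page):

```cpp
agents::Task<void> stream(
    agents::LLMInterface& llm,
    const std::vector<agents::Message>& messages,
    const std::vector<std::shared_ptr<agents::Tool>>& tools) {
    auto gen = llm.streamChatAsync(messages, tools);
    // Hypothetical: await each chunk until the generator is exhausted.
    while (auto chunk = co_await gen.next()) {
        std::cout << *chunk << std::flush;
    }
}
```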

◆ streamChatAsyncWithTools()

virtual AsyncGenerator< std::pair< std::string, ToolCalls > > agents::LLMInterface::streamChatAsyncWithTools ( const std::vector< Message > & messages,
const std::vector< std::shared_ptr< Tool > > & tools )
virtual

Stream chat with tools using AsyncGenerator.

Parameters
messages: The messages to generate the completion from
tools: The tools to use
Returns
The AsyncGenerator of response chunks and tool calls

Reimplemented in agents::llms::GoogleLLM, and agents::llms::OllamaLLM.

◆ uploadMediaFile()

virtual std::optional< JsonObject > agents::LLMInterface::uploadMediaFile ( const std::string & local_path,
const std::string & mime,
const std::string & binary = "" )
virtual

Provider-optional: Upload a local media file to the provider's file storage and return a canonical media envelope (e.g., with fileUri). Default: not supported.

Parameters
local_path: Local filesystem path to the media file
mime: The MIME type of the media file
binary: Optional binary content of the media file
Returns
Optional envelope; std::nullopt if unsupported

Reimplemented in agents::llms::AnthropicLLM, and agents::llms::GoogleLLM.
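Since this member is provider-optional, callers should handle the unsupported case. A sketch, assuming `llm` is a pointer to a concrete implementation; the file path is illustrative.

```cpp
auto envelope = llm->uploadMediaFile("/tmp/photo.jpg", "image/jpeg");
if (envelope) {
    // *envelope is a JsonObject; per the description it may carry
    // a fileUri field referencing the uploaded media.
} else {
    // Provider does not support media upload (the default behavior).
}
```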