LLM Client object configuration

The LLMClient object from superagentx.llm is a configuration-driven interface for interacting with large language models (LLMs) from various providers, such as OpenAI, AWS Bedrock, and more. It lets you set parameters specific to the model and provider you intend to use, making it easy to configure different types of LLMs and interact with them.

Supported LLMs

- OpenAI
- Azure OpenAI
- AWS Bedrock
- Google Gemini
- Meta Llama
- Ollama
- Claude AI
- Mistral AI
- IBM WatsonX

Parameters

| Attribute | Parameter | Description |
| --- | --- | --- |
| Model (optional) | model | The language model to use, typically identified by a name or version the provider supports, such as 'gpt-4o' or 'gpt-3.5-turbo'. |
| LLM Type | llm_type | Identifies the service provider or platform the model is sourced from. Common values include 'openai' for OpenAI's GPT models; see LLM Type and its Values below for the full list. |
| API Key (optional) | api_key | The model API key. It can be passed as a parameter or read from the environment. |
| Base URL (optional) | base_url | The base URL for the model API. |
| API Version (optional) | api_version | The API version, required when llm_type is 'azure-openai'. |
| Async Mode (optional) | async_mode | Whether to use the asynchronous OpenAI or Azure OpenAI client. |
| Embedding Model (optional) | embed_model | Name of the embedding model. Supported providers: OpenAI, Azure OpenAI, Mistral, Llama 3.1. |

OpenAI
from superagentx.llm import LLMClient

llm_config = {
    "model":'gpt-4o',
    "llm_type":'openai'
}
llm_client = LLMClient(llm_config=llm_config)

AWS Bedrock
from superagentx.llm import LLMClient

llm_config = {
    "model":'anthropic.claude-3-5-sonnet-20241022-v2:0',
    "llm_type":'bedrock'
}
llm_client = LLMClient(llm_config=llm_config)
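
The api_key can also be supplied directly in the config instead of being read from the environment. Below is a minimal sketch of that, assuming the OpenAI key otherwise comes from an environment variable such as OPENAI_API_KEY (the exact variable name the client looks up is an assumption, not documented behaviour):

import os

from superagentx.llm import LLMClient

# The key can be passed explicitly; if omitted, the client reads it from the environment.
llm_config = {
    "model": 'gpt-4o',
    "llm_type": 'openai',
    "api_key": os.getenv("OPENAI_API_KEY")  # assumed variable name
}
llm_client = LLMClient(llm_config=llm_config)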

LLM Type and its Values

| Type | Value | Example |
| --- | --- | --- |
| OpenAI | openai | llm_config = { 'llm_type':'openai'} |
| Azure OpenAI | azure-openai | llm_config = { 'llm_type':'azure-openai'} |
| AWS Bedrock | bedrock | llm_config = { 'llm_type':'bedrock'} |
| Llama | llama | llm_config = { 'llm_type':'llama'} |
| Gemini | gemini | llm_config = { 'llm_type':'gemini'} |
| Anthropic | anthropic | llm_config = { 'llm_type':'anthropic'} |
| Mistral | mistral | llm_config = { 'llm_type':'mistral'} |
| Groq | groq | llm_config = { 'llm_type':'groq'} |
| Together | together | llm_config = { 'llm_type':'together'} |
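
For 'azure-openai', the base_url and api_version parameters from the Parameters table come into play. A sketch under stated assumptions: the deployment name, endpoint, API version, and environment variable name below are placeholders, not library defaults.

import os

from superagentx.llm import LLMClient

llm_config = {
    "model": 'gpt-4o',                                    # Azure deployment name (placeholder)
    "llm_type": 'azure-openai',
    "base_url": 'https://my-resource.openai.azure.com/',  # placeholder endpoint
    "api_version": '2024-02-01',                          # placeholder API version
    "api_key": os.getenv("AZURE_OPENAI_API_KEY")          # assumed variable name
}
llm_client = LLMClient(llm_config=llm_config)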

OpenAI Supported Models

| Model | Example |
| --- | --- |
| gpt-4o | llm_config = { "model":'gpt-4o', "llm_type":'openai'} |
| gpt-4 | llm_config = { "model":'gpt-4', "llm_type":'openai'} |
| gpt-4o-2024-05-13 | llm_config = { "model":'gpt-4o-2024-05-13', "llm_type":'openai'} |
| gpt-4o-2024-08-06 | llm_config = { "model":'gpt-4o-2024-08-06', "llm_type":'openai'} |
| gpt-4-turbo-2024-04-09 | llm_config = { "model":'gpt-4-turbo-2024-04-09', "llm_type":'openai'} |
| gpt-4o-mini-2024-07-18 | llm_config = { "model":'gpt-4o-mini-2024-07-18', "llm_type":'openai'} |
| gpt-4o-mini | llm_config = { "model":'gpt-4o-mini', "llm_type":'openai'} |
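
Any of these identifiers can be dropped into the model field. A small sketch that picks the model at runtime from an environment variable, falling back to gpt-4o; the variable name is purely illustrative:

import os

from superagentx.llm import LLMClient

# Choose the model at runtime; OPENAI_MODEL and the gpt-4o fallback are illustrative.
model_name = os.getenv("OPENAI_MODEL", "gpt-4o")
llm_client = LLMClient(llm_config={"model": model_name, "llm_type": 'openai'})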

Bedrock Supported Models

| Model | Example |
| --- | --- |
| anthropic.claude-3-5-sonnet-20241022-v2:0 | llm_config = { "model":'anthropic.claude-3-5-sonnet-20241022-v2:0', "llm_type":'bedrock'} |
| anthropic.claude-3-5-haiku-20241022-v1:0 | llm_config = { "model":'anthropic.claude-3-5-haiku-20241022-v1:0', "llm_type":'bedrock'} |
| anthropic.claude-instant-v1:2:100k | llm_config = { "model":'anthropic.claude-instant-v1:2:100k', "llm_type":'bedrock'} |
| anthropic.claude-3-sonnet-20240229-v1:0 | llm_config = { "model":'anthropic.claude-3-sonnet-20240229-v1:0', "llm_type":'bedrock'} |
| anthropic.claude-3-haiku-20240307-v1:0 | llm_config = { "model":'anthropic.claude-3-haiku-20240307-v1:0', "llm_type":'bedrock'} |
| anthropic.claude-3-opus-20240229-v1:0 | llm_config = { "model":'anthropic.claude-3-opus-20240229-v1:0', "llm_type":'bedrock'} |
| anthropic.claude-3-5-sonnet-20240620-v1:0 | llm_config = { "model":'anthropic.claude-3-5-sonnet-20240620-v1:0', "llm_type":'bedrock'} |
| cohere.command-r-v1:0 | llm_config = { "model":'cohere.command-r-v1:0', "llm_type":'bedrock'} |
| cohere.command-r-plus-v1:0 | llm_config = { "model":'cohere.command-r-plus-v1:0', "llm_type":'bedrock'} |
| meta.llama3-1-8b-instruct-v1:0 | llm_config = { "model":'meta.llama3-1-8b-instruct-v1:0', "llm_type":'bedrock'} |
| meta.llama3-1-70b-instruct-v1:0 | llm_config = { "model":'meta.llama3-1-70b-instruct-v1:0', "llm_type":'bedrock'} |
| meta.llama3-1-405b-instruct-v1:0 | llm_config = { "model":'meta.llama3-1-405b-instruct-v1:0', "llm_type":'bedrock'} |
| mistral.mistral-large-2402-v1:0 | llm_config = { "model":'mistral.mistral-large-2402-v1:0', "llm_type":'bedrock'} |
| mistral.mistral-large-2407-v1:0 | llm_config = { "model":'mistral.mistral-large-2407-v1:0', "llm_type":'bedrock'} |
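
Bedrock models are served through AWS, so the client also needs AWS credentials and a region in addition to the config above. A sketch, assuming the Bedrock integration honours the standard AWS environment variables (credential handling is an assumption here, not documented behaviour):

import os

from superagentx.llm import LLMClient

# Standard AWS environment variables; assumed to be picked up by the Bedrock client.
os.environ.setdefault("AWS_ACCESS_KEY_ID", "<your-access-key-id>")
os.environ.setdefault("AWS_SECRET_ACCESS_KEY", "<your-secret-access-key>")
os.environ.setdefault("AWS_DEFAULT_REGION", "us-east-1")

llm_config = {
    "model": 'anthropic.claude-3-5-haiku-20241022-v1:0',
    "llm_type": 'bedrock'
}
llm_client = LLMClient(llm_config=llm_config)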