Key Components
LLMClient
LLM Client object configuration
The LLMClient object from superagentx.llm is a configuration-driven interface for interacting with large language models (LLMs) from various providers, such as OpenAI, AWS Bedrock, and more. It lets users set the parameters specific to the model and provider they intend to use, making it easier to configure different types of LLMs and interact with them.
Supported LLMs
- OpenAI
- Azure OpenAI
- AWS Bedrock
- Google Gemini
- Meta Llama
- Ollama
- Claude AI
- Mistral AI
- IBM WatsonX
Parameters
Attribute | Parameter | Description |
---|---|---|
Model (optional) | model | The language model to use, identified by a name or version such as 'gpt-4o', 'gpt-3.5-turbo', or another model the provider supports. |
LLM Type | llm_type | Identifies the service provider or platform the model is sourced from, e.g. 'openai' for OpenAI's GPT models. See the full list of values under "LLM Type and its Values" below. |
API Key (optional) | api_key | The model API key. You can provide it as a parameter; if omitted, it is read from the environment. |
Base Url (optional) | base_url | The base URL for the model API. |
API Version (optional) | api_version | The API version, required when llm_type is 'azure-openai'. |
Async Mode (optional) | async_mode | Whether to use the asynchronous OpenAI or Azure OpenAI client. |
Embedding Model (optional) | embed_model | Name of the embedding model. Supported for openai, azure-openai, mistral, and llama 3.1 models. |
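As a sketch of how the optional parameters above fit together, the following Azure OpenAI configuration sets api_version (required for the azure-openai llm_type) alongside the other keys. The endpoint URL and api_version value are placeholders, not values from this documentation:

```python
import os

# Sketch: an Azure OpenAI llm_config using the parameters from the table above.
# The base_url and api_version values below are placeholders.
azure_config = {
    "model": "gpt-4o",                                   # deployed model name
    "llm_type": "azure-openai",                          # provider identifier
    "api_key": os.environ.get("AZURE_OPENAI_API_KEY"),   # or pass explicitly
    "base_url": "https://example.openai.azure.com/",     # placeholder endpoint
    "api_version": "2024-02-01",                         # required for azure-openai
    "async_mode": True,                                  # use the async client
}
```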
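A minimal configuration needs only model and llm_type. The dicts below use model identifiers from the tables that follow; the LLMClient constructor call shown in the comment is an assumption based on the description above, so verify it against the superagentx API reference:

```python
# Minimal configs for OpenAI and AWS Bedrock (model IDs from the tables below).
openai_config = {"model": "gpt-4o", "llm_type": "openai"}
bedrock_config = {
    "model": "anthropic.claude-3-5-sonnet-20240620-v1:0",
    "llm_type": "bedrock",
}

# Assumed usage (hypothetical call shape; check the superagentx docs):
# from superagentx.llm import LLMClient
# client = LLMClient(llm_config=openai_config)
```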
LLM Type and its Values
Type | Value | Example |
---|---|---|
OpenAI | openai | llm_config = {'llm_type': 'openai'} |
Azure OpenAI | azure-openai | llm_config = {'llm_type': 'azure-openai'} |
AWS Bedrock | bedrock | llm_config = {'llm_type': 'bedrock'} |
Llama | llama | llm_config = {'llm_type': 'llama'} |
Gemini | gemini | llm_config = {'llm_type': 'gemini'} |
Anthropic | anthropic | llm_config = {'llm_type': 'anthropic'} |
Mistral | mistral | llm_config = {'llm_type': 'mistral'} |
Groq | groq | llm_config = {'llm_type': 'groq'} |
Together | together | llm_config = {'llm_type': 'together'} |
OpenAI Supported Models
Model | Example |
---|---|
gpt-4o | llm_config = {"model": 'gpt-4o', "llm_type": 'openai'} |
gpt-4 | llm_config = {"model": 'gpt-4', "llm_type": 'openai'} |
gpt-4o-2024-05-13 | llm_config = {"model": 'gpt-4o-2024-05-13', "llm_type": 'openai'} |
gpt-4o-2024-08-06 | llm_config = {"model": 'gpt-4o-2024-08-06', "llm_type": 'openai'} |
gpt-4-turbo-2024-04-09 | llm_config = {"model": 'gpt-4-turbo-2024-04-09', "llm_type": 'openai'} |
gpt-4o-mini-2024-07-18 | llm_config = {"model": 'gpt-4o-mini-2024-07-18', "llm_type": 'openai'} |
gpt-4o-mini | llm_config = {"model": 'gpt-4o-mini', "llm_type": 'openai'} |
Bedrock Supported Models
Model | Example |
---|---|
anthropic.claude-3-5-sonnet-20241022-v2:0 | llm_config = {"model": 'anthropic.claude-3-5-sonnet-20241022-v2:0', "llm_type": 'bedrock'} |
anthropic.claude-3-5-haiku-20241022-v1:0 | llm_config = {"model": 'anthropic.claude-3-5-haiku-20241022-v1:0', "llm_type": 'bedrock'} |
anthropic.claude-instant-v1:2:100k | llm_config = {"model": 'anthropic.claude-instant-v1:2:100k', "llm_type": 'bedrock'} |
anthropic.claude-3-sonnet-20240229-v1:0 | llm_config = {"model": 'anthropic.claude-3-sonnet-20240229-v1:0', "llm_type": 'bedrock'} |
anthropic.claude-3-haiku-20240307-v1:0 | llm_config = {"model": 'anthropic.claude-3-haiku-20240307-v1:0', "llm_type": 'bedrock'} |
anthropic.claude-3-opus-20240229-v1:0 | llm_config = {"model": 'anthropic.claude-3-opus-20240229-v1:0', "llm_type": 'bedrock'} |
anthropic.claude-3-5-sonnet-20240620-v1:0 | llm_config = {"model": 'anthropic.claude-3-5-sonnet-20240620-v1:0', "llm_type": 'bedrock'} |
cohere.command-r-v1:0 | llm_config = {"model": 'cohere.command-r-v1:0', "llm_type": 'bedrock'} |
cohere.command-r-plus-v1:0 | llm_config = {"model": 'cohere.command-r-plus-v1:0', "llm_type": 'bedrock'} |
meta.llama3-1-8b-instruct-v1:0 | llm_config = {"model": 'meta.llama3-1-8b-instruct-v1:0', "llm_type": 'bedrock'} |
meta.llama3-1-70b-instruct-v1:0 | llm_config = {"model": 'meta.llama3-1-70b-instruct-v1:0', "llm_type": 'bedrock'} |
meta.llama3-1-405b-instruct-v1:0 | llm_config = {"model": 'meta.llama3-1-405b-instruct-v1:0', "llm_type": 'bedrock'} |
mistral.mistral-large-2402-v1:0 | llm_config = {"model": 'mistral.mistral-large-2402-v1:0', "llm_type": 'bedrock'} |
mistral.mistral-large-2407-v1:0 | llm_config = {"model": 'mistral.mistral-large-2407-v1:0', "llm_type": 'bedrock'} |