Changing Models

AgentKit is model-agnostic; how you change models depends on which framework you are using.

The LangChain guides for Python and Node.js will help you change the model for your LangChain agent; the process is similar in both languages.

For any model that can be accessed through an OpenAI-compatible API, you can change the base URL as specified by your model provider’s documentation. You can also set additional configuration options such as maxTokens, temperature, and topP (max_tokens, temperature, and top_p in Python). Here are some examples:

# requires AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY as env variables
from langchain_aws import ChatBedrockConverse

llm = ChatBedrockConverse(
    model_id="us.amazon.nova-pro-v1:0",  # change to any Bedrock model ID
    max_tokens=1024,
    # other params...
)
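As another sketch, a provider exposing an OpenAI-compatible endpoint can usually be reached through LangChain's OpenAI client by overriding the base URL. The endpoint, model name, and API key below are placeholders, not real values; consult your provider's documentation for the actual ones:

```python
# Sketch: point LangChain's OpenAI client at an OpenAI-compatible provider.
# The base_url, model, and api_key shown here are placeholders.
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="my-model-name",                   # model name from your provider's docs
    base_url="https://api.example.com/v1",   # provider's OpenAI-compatible endpoint
    api_key="YOUR_PROVIDER_API_KEY",         # or read from an env variable
    max_tokens=1024,
    temperature=0.7,
    top_p=0.9,
)
```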

Prompting Guide

The prompts you use for your agent can significantly impact its personality and performance. Here are some tips to help you get started:

  • Agent State Modifier: The agent state modifier is the first thing the agent sees. It sets the agent’s initial state and can be used to define the agent’s personality. If there are specific instructions for the agent, they should be included here. However, do not treat it as a place for guardrails: a well-crafted user prompt can get around text-based instructions.
  • Action Prompt: Every action has an associated prompt that gives the agent context on how to use the action. If there are specific things an agent should know about the parameters, how to ask the user for clarification, or assumptions to make in default cases when the user does not mention a specific parameter, this prompt is a great place to include them.
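As a sketch of the first tip, a state modifier can be passed when building a LangChain agent. This assumes LangGraph's prebuilt create_react_agent; note the parameter is named state_modifier in some releases and prompt in newer ones, so check the version you have installed:

```python
# Sketch: setting an agent's personality and instructions via the state modifier.
# Assumes langgraph's prebuilt ReAct agent; llm and tools are defined elsewhere.
from langgraph.prebuilt import create_react_agent

agent = create_react_agent(
    llm,    # any LangChain chat model
    tools,  # the agent's actions/tools
    state_modifier=(
        "You are a cheerful onchain assistant. "
        "Always confirm amounts with the user before taking any action."
    ),
)
```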