AutoGen is a framework from Microsoft that enables the development of LLM applications using multiple agents that can converse with each other to solve tasks.
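Since this walkthrough sends traces to Arize, register a tracer before running any AutoGen code. The snippet below is a minimal sketch of one possible setup, assuming the arize-otel and openinference-instrumentation-openai packages; the YOUR_SPACE_ID / YOUR_API_KEY placeholders and the project name are illustrative, not values from this guide. It instruments the OpenAI client that AutoGen calls under the hood.

```python
# Minimal tracing setup sketch (assumes:
#   pip install arize-otel openinference-instrumentation-openai).
# YOUR_SPACE_ID and YOUR_API_KEY are placeholders for your Arize credentials.
from arize.otel import register
from openinference.instrumentation.openai import OpenAIInstrumentor

# Point an OpenTelemetry tracer provider at Arize
tracer_provider = register(
    space_id="YOUR_SPACE_ID",
    api_key="YOUR_API_KEY",
    project_name="autogen-demo",  # hypothetical project name
)

# Instrument the OpenAI client that AutoGen uses for its LLM calls
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)
```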
Here's a basic example of setting up and running a simple AutoGen task. Make sure your OAI_CONFIG_LIST points to a valid LLM configuration (for example, one whose api_key is read from the OPENAI_API_KEY environment variable).
```python
import autogen

# Configuration for the LLM (e.g., OpenAI).
# Ensure OPENAI_API_KEY is set in your environment if your config reads the key from there.
config_list = autogen.config_list_from_json(
    env_or_file="OAI_CONFIG_LIST",  # Env var or JSON file holding your config list
    # Sample OAI_CONFIG_LIST content (save as a JSON file or set as an env var):
    # [
    #   {
    #     "model": "gpt-3.5-turbo",
    #     "api_key": "YOUR_OPENAI_API_KEY"  # Can also be picked up from the environment
    #   }
    # ]
    filter_dict={
        "model": ["gpt-3.5-turbo", "gpt-4", "gpt-4o"]  # Specify models you want to use
    },
)

# Create agents
assistant = autogen.AssistantAgent(
    name="assistant",
    llm_config={
        "config_list": config_list,
        "temperature": 0,
    },
)

user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",
    max_consecutive_auto_reply=5,
    # Content can be None (e.g., for tool calls), so guard before calling endswith
    is_termination_msg=lambda x: (x.get("content") or "").rstrip().endswith("TERMINATE"),
    code_execution_config=False,  # Set to a dict with "work_dir" if you need code execution
)

# Start a chat
user_proxy.initiate_chat(
    assistant,
    message="What is the capital of France? Reply TERMINATE when done.",
)

print("AutoGen chat initiated. Check Arize for traces.")
```
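The two-agent chat above generalizes to larger agent teams. The sketch below is a hypothetical extension of the same example using AutoGen's GroupChat and GroupChatManager; the writer/critic agent names and roles are illustrative additions, not part of the original example.

```python
# Hypothetical multi-agent extension (agent names and roles are illustrative).
# Reuses `config_list` and `user_proxy` from the previous snippet.
writer = autogen.AssistantAgent(
    name="writer",
    system_message="You draft answers.",
    llm_config={"config_list": config_list},
)
critic = autogen.AssistantAgent(
    name="critic",
    system_message="You review drafts and reply TERMINATE when satisfied.",
    llm_config={"config_list": config_list},
)

# A GroupChat routes messages among agents; the manager selects the next speaker
groupchat = autogen.GroupChat(
    agents=[user_proxy, writer, critic],
    messages=[],
    max_round=6,
)
manager = autogen.GroupChatManager(
    groupchat=groupchat,
    llm_config={"config_list": config_list},
)

user_proxy.initiate_chat(
    manager,
    message="Summarize why Paris is the capital of France in two sentences.",
)
```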
Traces of both the AutoGen agent interactions and the underlying LLM calls will be sent to your Arize account, letting you see the conversation flow between agents and the details of each LLM call.
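For a quick local cross-check against what appears in Arize, initiate_chat also returns a ChatResult you can print directly. This is a minimal sketch assuming pyautogen 0.2+, where ChatResult exposes chat_history and cost; the variables come from the example above.

```python
# Minimal sketch: inspect the chat locally (assumes pyautogen 0.2+, where
# initiate_chat returns a ChatResult with chat_history and cost attributes).
chat_result = user_proxy.initiate_chat(
    assistant,
    message="What is the capital of France? Reply TERMINATE when done.",
)

# Each entry is a message dict with "content" and (usually) "role"/"name"
for msg in chat_result.chat_history:
    print(msg.get("name", msg.get("role", "?")), ":", msg.get("content"))

# Token/cost accounting for the run, as tracked by AutoGen
print(chat_result.cost)
```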