In this tutorial, we demonstrate how Microsoft's AutoGen framework empowers developers to orchestrate complex, multi-agent workflows with minimal code. By leveraging AutoGen's RoundRobinGroupChat and TeamTool abstractions, you can seamlessly assemble specialist assistants, such as Researchers, FactCheckers, Critics, Summarizers, and Editors, into a cohesive "DeepDive" tool. AutoGen handles the intricacies of turn-taking, termination conditions, and streaming output, allowing you to focus on defining each agent's expertise and system prompts rather than wiring together callbacks or manual prompt chains. Whether you are conducting in-depth research, validating facts, refining prose, or integrating third-party tools, AutoGen provides a unified API that scales from simple two-agent pipelines to elaborate five-agent collaborations.
!pip install -q autogen-agentchat[gemini] autogen-ext[openai] nest_asyncio
We install the AutoGen AgentChat package with Gemini support, the OpenAI extension for API compatibility, and the nest_asyncio library to patch the notebook's event loop, ensuring you have all the components needed to run asynchronous, multi-agent workflows in Colab.
import os, nest_asyncio
from getpass import getpass
nest_asyncio.apply()
os.environ["GEMINI_API_KEY"] = getpass("Enter your Gemini API key: ")
We import and apply nest_asyncio to enable nested event loops in notebook environments, then securely prompt for your Gemini API key using getpass and store it in os.environ for authenticated model client access.
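A common refinement is to prompt only when the key is not already set, so re-running the cell does not ask again. A minimal sketch (the ensure_api_key helper name is ours, not part of AutoGen):

```python
import os
from getpass import getpass

def ensure_api_key(var_name: str) -> str:
    """Return the named API key, prompting via getpass only if it is not already set."""
    if not os.environ.get(var_name):
        os.environ[var_name] = getpass(f"Enter your {var_name}: ")
    return os.environ[var_name]
```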
from autogen_ext.models.openai import OpenAIChatCompletionClient

model_client = OpenAIChatCompletionClient(
    model="gemini-1.5-flash-8b",
    api_key=os.environ["GEMINI_API_KEY"],
    api_type="google",
)
We initialize an OpenAI-compatible chat client pointed at Google's Gemini by specifying the gemini-1.5-flash-8b model, injecting your stored Gemini API key, and setting api_type="google", giving you a ready-to-use model_client for downstream AutoGen agents.
from autogen_agentchat.agents import AssistantAgent

researcher = AssistantAgent(name="Researcher", system_message="Gather and summarize factual info.", model_client=model_client)
factchecker = AssistantAgent(name="FactChecker", system_message="Verify facts and cite sources.", model_client=model_client)
critic = AssistantAgent(name="Critic", system_message="Critique clarity and logic.", model_client=model_client)
summarizer = AssistantAgent(name="Summarizer", system_message="Condense into a brief executive summary.", model_client=model_client)
editor = AssistantAgent(name="Editor", system_message="Polish language and signal APPROVED when done.", model_client=model_client)
We define five specialized assistant agents, Researcher, FactChecker, Critic, Summarizer, and Editor, each initialized with a role-specific system message and the shared Gemini-powered model client, enabling them respectively to gather facts, verify accuracy, critique content, condense summaries, and polish language within the AutoGen workflow.
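Since the five agents differ only in name and system message, the same team could also be driven from a small role table. A hypothetical refactor (the ROLE_PROMPTS table and spec dicts are ours; each spec would then be unpacked as AssistantAgent(**spec, model_client=model_client)):

```python
# Role name -> system prompt for each specialist in the DeepDive team.
ROLE_PROMPTS = {
    "Researcher": "Gather and summarize factual info.",
    "FactChecker": "Verify facts and cite sources.",
    "Critic": "Critique clarity and logic.",
    "Summarizer": "Condense into a brief executive summary.",
    "Editor": "Polish language and signal APPROVED when done.",
}

# Build keyword-argument specs; each would be unpacked into an AssistantAgent(...).
agent_specs = [
    {"name": role, "system_message": prompt}
    for role, prompt in ROLE_PROMPTS.items()
]
```

This keeps role definitions in one place, so adding a sixth specialist is a one-line change.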
from autogen_agentchat.teams import RoundRobinGroupChat
from autogen_agentchat.conditions import MaxMessageTermination, TextMentionTermination

max_msgs = MaxMessageTermination(max_messages=20)
text_term = TextMentionTermination(text="APPROVED", sources=["Editor"])
termination = max_msgs | text_term
team = RoundRobinGroupChat(
    participants=[researcher, factchecker, critic, summarizer, editor],
    termination_condition=termination
)
We import the RoundRobinGroupChat class along with two termination conditions, then compose a stop rule that fires after 20 total messages or when the Editor agent mentions "APPROVED." Finally, we instantiate a round-robin team of the five specialized agents with that combined termination logic, enabling them to cycle through research, fact-checking, critique, summarization, and editing until one of the stop conditions is met.
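To see the OR-composed stop rule in isolation, here is a plain-Python sketch of the idea (illustrative only; AutoGen's real termination classes track the message stream internally and the class names below are ours):

```python
class MaxMessages:
    """Fires once the conversation reaches a message limit."""
    def __init__(self, limit):
        self.limit = limit
    def should_stop(self, messages):
        return len(messages) >= self.limit

class TextMention:
    """Fires once a watched agent's message contains the keyword."""
    def __init__(self, text, sources):
        self.text, self.sources = text, sources
    def should_stop(self, messages):
        return any(m["source"] in self.sources and self.text in m["content"]
                   for m in messages)

class EitherOf:
    """OR-composition (the `|` in the tutorial): stops when any child fires."""
    def __init__(self, *conds):
        self.conds = conds
    def should_stop(self, messages):
        return any(c.should_stop(messages) for c in self.conds)

stop = EitherOf(MaxMessages(20), TextMention("APPROVED", ["Editor"]))
```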
from autogen_agentchat.tools import TeamTool

deepdive_tool = TeamTool(team=team, name="DeepDive", description="Collaborative multi-agent deep dive")
We wrap our RoundRobinGroupChat team in a TeamTool named "DeepDive" with a human-readable description, effectively packaging the entire multi-agent workflow into a single callable tool that other agents can invoke seamlessly.
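The underlying pattern, wrapping an entire workflow behind a name and description so a caller only sees one tool, can be sketched without AutoGen at all (the SimpleTool class and fake_team stand-in are ours, purely illustrative):

```python
class SimpleTool:
    """Toy stand-in for the tool-wrapping pattern: a named, described callable."""
    def __init__(self, fn, name, description):
        self.fn = fn
        self.name = name
        self.description = description
    def run(self, task):
        return self.fn(task)

def fake_team(task):
    # In the real workflow this would run the round-robin agent team on the task.
    return f"DeepDive report on: {task}"

deepdive = SimpleTool(fake_team, "DeepDive", "Collaborative multi-agent deep dive")
```

The caller never sees the five agents or the termination logic, only a single `run` entry point, which is what lets the Host agent treat the whole team as one capability.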
host = AssistantAgent(
    name="Host",
    model_client=model_client,
    tools=[deepdive_tool],
    system_message="You have access to a DeepDive tool for in-depth research."
)
We create a "Host" assistant agent configured with the shared Gemini-powered model_client, grant it the DeepDive team tool for orchestrating in-depth research, and prime it with a system message that informs it of its ability to invoke the multi-agent DeepDive workflow.
import asyncio

async def run_deepdive(topic: str):
    result = await host.run(task=f"Deep dive on: {topic}")
    print("🔍 DeepDive result:\n", result)
    await model_client.close()

topic = "Impacts of Model Context Protocol on Agentic AI"
loop = asyncio.get_event_loop()
loop.run_until_complete(run_deepdive(topic))
Finally, we define an asynchronous run_deepdive function that tells the Host agent to execute the DeepDive team tool on a given topic, prints the comprehensive result, and then closes the model client; we then grab Colab's current asyncio loop and run the coroutine to completion for a seamless, synchronous execution.
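The get_event_loop/run_until_complete pattern above exists because Colab already runs an event loop (hence nest_asyncio); in a plain script, where no loop is running, asyncio.run is the simpler choice. A stand-alone sketch with a dummy coroutine in place of host.run:

```python
import asyncio

async def run_deepdive_demo(topic: str) -> str:
    # Stand-in for `await host.run(...)`; a real call would drive the agent team.
    await asyncio.sleep(0)
    return f"Deep dive on: {topic}"

# In a plain script (no loop already running), asyncio.run drives the coroutine
# to completion and tears the loop down afterwards.
result = asyncio.run(run_deepdive_demo("Model Context Protocol"))
print(result)
```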
In conclusion, integrating Google Gemini via AutoGen's OpenAI-compatible client and wrapping our multi-agent team as a callable TeamTool gives us a powerful template for building highly modular and reusable workflows. AutoGen abstracts away event-loop management (with nest_asyncio), streaming responses, and termination logic, enabling us to iterate quickly on agent roles and overall orchestration. This advanced pattern streamlines the development of collaborative AI systems and lays the foundation for extending into retrieval pipelines, dynamic selectors, or conditional execution strategies.
Check out the Notebook here. All credit for this research goes to the researchers of this project.
Asif Razzaq is the CEO of Marktechpost Media Inc. As a visionary entrepreneur and engineer, Asif is committed to harnessing the potential of Artificial Intelligence for social good. His most recent endeavor is the launch of an Artificial Intelligence Media Platform, Marktechpost, which stands out for its in-depth coverage of machine learning and deep learning news that is both technically sound and easily understandable by a wide audience. The platform boasts over 2 million monthly views, illustrating its popularity among audiences.