In this tutorial, we'll build an end-to-end ticketing assistant powered by agentic AI using the PydanticAI library. We'll define our data rules with Pydantic v2 models, store tickets in an in-memory SQLite database, and generate unique identifiers with Python's uuid module. Behind the scenes, two agents, one for creating tickets and one for checking status, leverage Google Gemini (via PydanticAI's google-gla provider) to interpret your natural-language prompts and call our custom database functions. The result is a clean, type-safe workflow you can run immediately in Colab.
!pip install --upgrade pip
!pip install pydantic-ai
First, these two commands upgrade your pip installer to the latest version, bringing in new features and security patches, and then install PydanticAI. This library enables the definition of type-safe AI agents and the integration of Pydantic models with LLMs.
import os
from getpass import getpass
if "GEMINI_API_KEY" not in os.environ:
    os.environ["GEMINI_API_KEY"] = getpass("Enter your Google Gemini API key: ")
We check whether the GEMINI_API_KEY environment variable is already set. If not, we securely prompt you (without echoing) to enter your Google Gemini API key at runtime, then store it in os.environ so that your agent calls can authenticate automatically.
!pip install nest_asyncio
We install the nest_asyncio package, which lets you patch the existing asyncio event loop so that you can call async functions (or use .run_sync()) inside environments like Colab without running into "event loop already running" errors. After installing, the patch still needs to be applied, as shown below.
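A minimal sketch of that apply step (implied by the tutorial, though not shown explicitly in the original code):

import nest_asyncio

# Patch the notebook's already-running event loop so that top-level
# `await` and Agent.run_sync() both work inside Colab.
nest_asyncio.apply()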
import sqlite3
import uuid
from dataclasses import dataclass
from typing import Literal
from pydantic import BaseModel, Field
from pydantic_ai import Agent, RunContext
We bring in Python's sqlite3 for our in-memory database and uuid to generate unique ticket IDs, use dataclass and Literal for clean dependency and type definitions, and load Pydantic's BaseModel/Field for enforcing data schemas alongside Agent and RunContext from PydanticAI to wire up and run our conversational agents.
conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE tickets (
    ticket_id TEXT PRIMARY KEY,
    summary TEXT NOT NULL,
    severity TEXT NOT NULL,
    department TEXT NOT NULL,
    status TEXT NOT NULL
)
""")
conn.commit()
We set up an in-memory SQLite database and define a tickets table with columns for ticket_id, summary, severity, department, and status, then commit the schema so you have a lightweight, transient store for managing your ticket data.
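If you want to confirm the schema landed as expected, here is a quick optional sanity check (our addition, not part of the original walkthrough):

# Optional sanity check: PRAGMA table_info returns one row per column
# as (cid, name, type, notnull, dflt_value, pk).
for cid, name, col_type, *_ in conn.execute("PRAGMA table_info(tickets)"):
    print(cid, name, col_type)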
@dataclass
class TicketingDependencies:
    """Carries our DB connection into system prompts and tools."""
    db: sqlite3.Connection

class CreateTicketOutput(BaseModel):
    ticket_id: str = Field(..., description="Unique ticket identifier")
    summary: str = Field(..., description="Text summary of the issue")
    severity: Literal["low", "medium", "high"] = Field(..., description="Urgency level")
    department: str = Field(..., description="Responsible department")
    status: Literal["open"] = Field("open", description="Initial ticket status")

class TicketStatusOutput(BaseModel):
    ticket_id: str = Field(..., description="Unique ticket identifier")
    status: Literal["open", "in_progress", "resolved"] = Field(..., description="Current ticket status")
Here, we define a simple TicketingDependencies dataclass to pass our SQLite connection into each agent call, and then declare two Pydantic models: CreateTicketOutput (with fields for ticket ID, summary, severity, department, and default status "open") and TicketStatusOutput (with ticket ID and its current status). These models enforce a clear, validated structure on everything our agents return, ensuring you always receive well-formed data.
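To see that enforcement in action, here is a small optional check (our addition): constructing a CreateTicketOutput with an out-of-range severity fails loudly rather than letting bad data through.

from pydantic import ValidationError

# "urgent" is not one of the allowed Literal values ("low", "medium",
# "high"), so Pydantic raises a ValidationError instead of accepting it.
try:
    CreateTicketOutput(
        ticket_id="demo-123",
        summary="Keyboard not responding",
        severity="urgent",
        department="IT",
    )
except ValidationError as exc:
    print(exc)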
create_agent = Agent(
    "google-gla:gemini-2.0-flash",
    deps_type=TicketingDependencies,
    output_type=CreateTicketOutput,
    system_prompt="You are a ticketing assistant. Use the `create_ticket` tool to log new issues."
)

@create_agent.tool
async def create_ticket(
    ctx: RunContext[TicketingDependencies],
    summary: str,
    severity: Literal["low", "medium", "high"],
    department: str
) -> CreateTicketOutput:
    """
    Logs a new ticket in the database.
    """
    tid = str(uuid.uuid4())
    ctx.deps.db.execute(
        "INSERT INTO tickets VALUES (?, ?, ?, ?, ?)",
        (tid, summary, severity, department, "open")
    )
    ctx.deps.db.commit()
    return CreateTicketOutput(
        ticket_id=tid,
        summary=summary,
        severity=severity,
        department=department,
        status="open"
    )
We create a PydanticAI Agent named create_agent that is wired to Google Gemini and knows about our SQLite connection (deps_type=TicketingDependencies) and output schema (output_type=CreateTicketOutput). The @create_agent.tool decorator then registers an async create_ticket function, which generates a UUID, inserts a new row into the tickets table, and returns a validated CreateTicketOutput object.
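If you would rather avoid top-level await (for example, in a plain script), PydanticAI agents also expose run_sync. A minimal sketch, assuming nest_asyncio has been applied when running inside a notebook; the prompt text here is purely illustrative:

# Synchronous alternative to `await create_agent.run(...)`.
sync_result = create_agent.run_sync(
    "The HR portal rejects my login with a 403 error.",
    deps=TicketingDependencies(db=conn),
)
print(sync_result.output.ticket_id, sync_result.output.severity)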
status_agent = Agent(
    "google-gla:gemini-2.0-flash",
    deps_type=TicketingDependencies,
    output_type=TicketStatusOutput,
    system_prompt="You are a ticketing assistant. Use the `get_ticket_status` tool to retrieve the current status."
)

@status_agent.tool
async def get_ticket_status(
    ctx: RunContext[TicketingDependencies],
    ticket_id: str
) -> TicketStatusOutput:
    """
    Fetches the ticket status from the database.
    """
    cur = ctx.deps.db.execute(
        "SELECT status FROM tickets WHERE ticket_id = ?", (ticket_id,)
    )
    row = cur.fetchone()
    if not row:
        raise ValueError(f"No ticket found for ID {ticket_id!r}")
    return TicketStatusOutput(ticket_id=ticket_id, status=row[0])
We set up a second PydanticAI Agent, status_agent, also using the Google Gemini provider and our shared TicketingDependencies. It registers an async get_ticket_status tool that looks up a given ticket_id in the SQLite database and returns a validated TicketStatusOutput, or raises an error if the ticket isn't found.
deps = TicketingDependencies(db=conn)

create_result = await create_agent.run(
    "My printer on the 3rd floor shows a paper jam error.", deps=deps
)
print("Created Ticket →")
print(create_result.output.model_dump_json(indent=2))

tid = create_result.output.ticket_id
status_result = await status_agent.run(
    f"What's the status of ticket {tid}?", deps=deps
)
print("Ticket Status →")
print(status_result.output.model_dump_json(indent=2))
Finally, we bundle our SQLite connection into deps, then ask create_agent to log a new ticket via a natural-language prompt, printing the validated ticket data as JSON. We then take the returned ticket_id, query status_agent for that ticket's current state, and print the status in JSON form.
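Because the backend is plain SQLite, you can also advance a ticket's lifecycle directly and watch the status agent pick up the change. A small optional sketch (the UPDATE step is our addition, not part of the original flow):

# Move the ticket from "open" to "in_progress" directly in the database,
# then ask the status agent again; it should now report the new state.
conn.execute(
    "UPDATE tickets SET status = ? WHERE ticket_id = ?",
    ("in_progress", tid),
)
conn.commit()

updated = await status_agent.run(f"What's the status of ticket {tid}?", deps=deps)
print(updated.output.model_dump_json(indent=2))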
In conclusion, you have seen how agentic AI and PydanticAI work together to automate a complete service process, from logging a new issue to retrieving its live status, all managed through conversational prompts. Our use of Pydantic v2 ensures every ticket matches the schema you define, while SQLite provides a lightweight backend that is easy to replace with any database. With these tools in place, you can expand the assistant, adding new agent capabilities, integrating other AI models like openai:gpt-4o, or connecting real-world APIs, confident that your data remains structured and reliable throughout.
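As a concrete illustration of that model swap, pointing an agent at OpenAI instead of Gemini is a one-line change to the model string; a hedged sketch, assuming an OPENAI_API_KEY is set in the environment:

# Same dependencies and output schema; only the model identifier changes.
# Note: tools are registered per agent, so create_ticket would need to be
# re-registered on this agent (e.g., via @openai_create_agent.tool).
openai_create_agent = Agent(
    "openai:gpt-4o",
    deps_type=TicketingDependencies,
    output_type=CreateTicketOutput,
    system_prompt="You are a ticketing assistant. Use the `create_ticket` tool to log new issues."
)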