A Coding Guide to Building Asynchronous Ticketing Agents Using PydanticAI Agents, Pydantic v2, and SQLite

In this tutorial, we use the PydanticAI library to build an end-to-end ticketing assistant powered by agentic AI. We define our data rules with Pydantic v2 models, store tickets in an in-memory SQLite database, and generate unique identifiers with Python's uuid module. Behind the scenes, two agents, one for creating tickets and one for checking status, leverage Google Gemini (via PydanticAI's Google GLA provider) to interpret your natural-language prompts and call our custom database functions. The result is a clean, type-safe workflow that you can run immediately in Colab.
!pip install --upgrade pip
!pip install pydantic-ai
First, these two commands upgrade pip to its latest version, bringing in new features and security patches, and then install PydanticAI. This library lets you define type-safe AI agents and integrates Pydantic models with LLMs.
import os
from getpass import getpass
if "GEMINI_API_KEY" not in os.environ:
os.environ["GEMINI_API_KEY"] = getpass("Enter your Google Gemini API key: ")
We check whether the GEMINI_API_KEY environment variable is already set. If not, we securely prompt (without echoing) for the Google Gemini API key at runtime and store it in os.environ so that our agent calls can authenticate automatically.
!pip install nest_asyncio
We install the nest_asyncio package, which patches the existing asyncio event loop so that you can await async functions (or call .run_sync()) in an environment like Colab without hitting an "event loop is already running" error.
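Installing the package alone isn't enough; the patch has to be applied once per session. A minimal sketch of that step (not shown explicitly above):
import nest_asyncio

# Patch the already-running Colab/Jupyter event loop so that top-level
# await and .run_sync() work without "event loop is already running" errors.
nest_asyncio.apply()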
import sqlite3
import uuid
from dataclasses import dataclass
from typing import Literal
from pydantic import BaseModel, Field
from pydantic_ai import Agent, RunContext
We use Python's sqlite3 for our in-memory database and uuid to generate unique ticket IDs, use dataclass and Literal for clear dependency and type definitions, and load Pydantic's BaseModel/Field to enforce data schemas, while Agent and RunContext from PydanticAI wire up and power our conversational agents.
conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE tickets (
ticket_id TEXT PRIMARY KEY,
summary TEXT NOT NULL,
severity TEXT NOT NULL,
department TEXT NOT NULL,
status TEXT NOT NULL
)
""")
conn.commit()
We set up an in-memory SQLite database and define a tickets table with ticket_id, summary, severity, department, and status columns, then commit the schema so that you have a lightweight, transient store for ticket records.
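As a quick optional sanity check (not part of the original flow), you can confirm the table exists before wiring up the agents:
# Optional sanity check: list the tables in the in-memory database.
cur = conn.execute("SELECT name FROM sqlite_master WHERE type='table'")
print(cur.fetchall())  # expected: [('tickets',)]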
@dataclass
class TicketingDependencies:
    """Carries our DB connection into system prompts and tools."""
    db: sqlite3.Connection

class CreateTicketOutput(BaseModel):
    ticket_id: str = Field(..., description="Unique ticket identifier")
    summary: str = Field(..., description="Text summary of the issue")
    severity: Literal["low", "medium", "high"] = Field(..., description="Urgency level")
    department: str = Field(..., description="Responsible department")
    status: Literal["open"] = Field("open", description="Initial ticket status")

class TicketStatusOutput(BaseModel):
    ticket_id: str = Field(..., description="Unique ticket identifier")
    status: Literal["open", "in_progress", "resolved"] = Field(..., description="Current ticket status")
Here we define a simple TicketingDependencies dataclass to pass our SQLite connection into each agent call, and then declare two Pydantic models: CreateTicketOutput (with ticket ID, summary, severity, department, and a status field defaulting to "open") and TicketStatusOutput (with ticket ID and its current status). These models enforce a clear, validated structure on everything our agents return, ensuring you always receive well-formed data.
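To see what that validation buys us, here is a small hypothetical demonstration (not part of the tutorial's flow): Pydantic v2 rejects values outside the declared Literal at construction time.
from pydantic import ValidationError

try:
    CreateTicketOutput(
        ticket_id="demo-123",
        summary="Test ticket",
        severity="urgent",  # invalid: not one of "low" / "medium" / "high"
        department="IT",
        status="open",
    )
except ValidationError as e:
    print(e)  # reports that severity must be 'low', 'medium' or 'high'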
create_agent = Agent(
    "google-gla:gemini-2.0-flash",
    deps_type=TicketingDependencies,
    output_type=CreateTicketOutput,
    system_prompt="You are a ticketing assistant. Use the `create_ticket` tool to log new issues."
)
@create_agent.tool
async def create_ticket(
    ctx: RunContext[TicketingDependencies],
    summary: str,
    severity: Literal["low", "medium", "high"],
    department: str
) -> CreateTicketOutput:
    """
    Logs a new ticket in the database.
    """
    tid = str(uuid.uuid4())
    ctx.deps.db.execute(
        "INSERT INTO tickets VALUES (?,?,?,?,?)",
        (tid, summary, severity, department, "open")
    )
    ctx.deps.db.commit()
    return CreateTicketOutput(
        ticket_id=tid,
        summary=summary,
        severity=severity,
        department=department,
        status="open"
    )
We create a PydanticAI agent named create_agent that is wired to Google Gemini and made aware of our SQLite connection (deps_type=TicketingDependencies) and output schema (output_type=CreateTicketOutput). The @create_agent.tool decorator then registers an async create_ticket function that generates a UUID, inserts a new row into the tickets table, and returns a validated CreateTicketOutput object.
status_agent = Agent(
    "google-gla:gemini-2.0-flash",
    deps_type=TicketingDependencies,
    output_type=TicketStatusOutput,
    system_prompt="You are a ticketing assistant. Use the `get_ticket_status` tool to retrieve current status."
)
@status_agent.tool
async def get_ticket_status(
    ctx: RunContext[TicketingDependencies],
    ticket_id: str
) -> TicketStatusOutput:
    """
    Fetches the ticket status from the database.
    """
    cur = ctx.deps.db.execute(
        "SELECT status FROM tickets WHERE ticket_id = ?", (ticket_id,)
    )
    row = cur.fetchone()
    if not row:
        raise ValueError(f"No ticket found for ID {ticket_id!r}")
    return TicketStatusOutput(ticket_id=ticket_id, status=row[0])
We also set up a second PydanticAI agent, status_agent, using the Google Gemini provider and our shared TicketingDependencies. It registers an async get_ticket_status tool that looks up the given ticket_id in the SQLite database and returns a validated TicketStatusOutput, or raises an error if the ticket is not found.
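If the model supplies an ID that isn't in the table, that ValueError propagates out of the agent run. You can verify the lookup logic directly against the database without spending an API call; a small sketch using a made-up ID:
# Direct probe with a hypothetical, nonexistent ID: fetchone() returns None,
# which get_ticket_status turns into a ValueError inside an agent run.
cur = conn.execute(
    "SELECT status FROM tickets WHERE ticket_id = ?",
    ("00000000-0000-0000-0000-000000000000",),
)
print(cur.fetchone())  # -> None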
deps = TicketingDependencies(db=conn)
create_result = await create_agent.run(
    "My printer on 3rd floor shows a paper jam error.", deps=deps
)
print("Created Ticket →")
print(create_result.output.model_dump_json(indent=2))

tid = create_result.output.ticket_id
status_result = await status_agent.run(
    f"What's the status of ticket {tid}?", deps=deps
)
print("Ticket Status →")
print(status_result.output.model_dump_json(indent=2))
Finally, we wrap our SQLite connection in deps, ask create_agent to log a new ticket via a natural-language prompt, and print the validated ticket data as JSON. We then take the returned ticket_id, query status_agent for the ticket's current status, and print that status as JSON as well.
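Top-level await works in Colab; in a plain Python script you would wrap these calls in asyncio.run(), or use the synchronous wrapper instead. A minimal sketch using run_sync with the same agents and deps (the prompt here is just an illustration):
# Synchronous alternative: run_sync() drives the event loop for you,
# which also works in Colab once nest_asyncio.apply() has been called.
sync_result = create_agent.run_sync(
    "The VPN on the 2nd floor keeps disconnecting.", deps=deps
)
print(sync_result.output.model_dump_json(indent=2))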
In short, you've seen how agentic AI and PydanticAI work together to automate a complete service workflow, from logging new issues to retrieving real-time status, all driven by conversational prompts. Our use of Pydantic v2 ensures every ticket matches the schema you define, while SQLite provides a lightweight backend that is easy to swap for any production database. With these tools, you can extend the assistant, add new agent capabilities, or integrate other AI models such as OpenAI's GPT-4o, confident that your data remains structured and validated throughout.
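For instance, swapping providers is essentially a one-line change to the model string; a hedged sketch (this assumes an OPENAI_API_KEY is set and is not part of the original tutorial):
# Same dependencies and output schema, different model provider.
alt_create_agent = Agent(
    "openai:gpt-4o",
    deps_type=TicketingDependencies,
    output_type=CreateTicketOutput,
    system_prompt="You are a ticketing assistant. Use the `create_ticket` tool to log new issues."
)
# Note: tools are registered per agent, so create_ticket would need to be
# re-registered on alt_create_agent with @alt_create_agent.tool.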
The full code for this tutorial is available as a Colab notebook.
Asif Razzaq is the CEO of Marktechpost Media Inc. As a visionary entrepreneur and engineer, Asif is committed to harnessing the potential of artificial intelligence for social good. His most recent endeavor is the launch of Marktechpost, an artificial intelligence media platform that stands out for its in-depth coverage of machine learning and deep learning news that is both technically sound and easily understandable by a wide audience. The platform boasts over 2 million monthly views, illustrating its popularity among readers.
