A step-by-step coding guide to defining custom Model Context Protocol (MCP) servers and client tools with FastMCP, and integrating them into Google Gemini 2.0’s function-calling workflow

In this Colab-ready tutorial, we demonstrate how to use FastMCP to integrate Google’s Gemini 2.0 generative AI with an in-process Model Context Protocol (MCP) server. Starting with an interactive getpass prompt to capture your GEMINI_API_KEY securely, we install and configure all necessary dependencies: the google-genai Python client for calling the Gemini API, fastmcp for defining and hosting our MCP tools in-process, httpx for making HTTP requests to the Open-Meteo weather API, and nest_asyncio to patch Colab’s already-running asyncio event loop. The workflow registers two tools on a minimal FastMCP “weather” server: get_weather(latitude, longitude), which returns a three-day forecast, and get_alerts(state), which returns state-level weather alerts. It then creates a FastMCP transport to connect an MCP client to that server. Finally, using Gemini’s function-calling feature, we send a natural-language prompt to Gemini, have it emit a function call conforming to our explicit JSON schema, and execute that call through the MCP client to return structured weather data to our notebook.
from getpass import getpass
import os
api_key = getpass("Enter your GEMINI_API_KEY: ")
os.environ["GEMINI_API_KEY"] = api_key
We securely prompt you for your Gemini API key (without echoing it to the screen) and store it in the GEMINI_API_KEY environment variable, allowing the rest of the notebook to authenticate against Google’s API.
!pip install -q google-genai mcp fastmcp httpx nest_asyncio
We install all the core dependencies needed for the Colab notebook in one go: google-genai to interact with the Gemini API, mcp and fastmcp to build and host our Model Context Protocol server and client, httpx to make HTTP requests to external APIs, and nest_asyncio to patch Colab’s already-running event loop so our async code executes cleanly.
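The code cell for this step is a minimal two-liner (this is a sketch assuming nest_asyncio was installed by the pip cell above):

```python
import nest_asyncio

# Patch the notebook's already-running asyncio event loop so that
# run_until_complete() can be invoked from within it later on.
nest_asyncio.apply()
```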
We apply the nest_asyncio patch to the notebook’s existing event loop, allowing us to run asyncio coroutines (for example, the MCP client interaction) without hitting an “event loop is already running” error.
from fastmcp import FastMCP
import httpx

mcp_server = FastMCP("weather")

@mcp_server.tool()
def get_weather(latitude: float, longitude: float) -> str:
    """3-day min/max temperature forecast via Open-Meteo."""
    url = (
        f"https://api.open-meteo.com/v1/forecast"
        f"?latitude={latitude}&longitude={longitude}"
        "&daily=temperature_2m_min,temperature_2m_max&timezone=UTC"
    )
    resp = httpx.get(url, timeout=10)
    daily = resp.json()["daily"]
    return "\n".join(
        f"{date}: low {mn}°C, high {mx}°C"
        for date, mn, mx in zip(
            daily["time"],
            daily["temperature_2m_min"],
            daily["temperature_2m_max"],
        )
    )

@mcp_server.tool()
def get_alerts(state: str) -> str:
    """Dummy US-state alerts."""
    return f"No active weather alerts for {state.upper()}."
We create an in-process FastMCP server named “weather” and register two tools on it: get_weather(latitude, longitude), which uses httpx to fetch and format a 3-day temperature forecast from the Open-Meteo API, and get_alerts(state), which returns a placeholder alert message for a given U.S. state.
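To see what get_weather’s formatting step produces without hitting the network or the Gemini API, the same join logic can be exercised on a hard-coded sample of Open-Meteo’s “daily” payload (the dates and temperatures below are invented purely for illustration):

```python
# Hypothetical sample of the "daily" object Open-Meteo returns;
# values are made up to exercise the formatting logic only.
daily = {
    "time": ["2025-05-01", "2025-05-02", "2025-05-03"],
    "temperature_2m_min": [9.1, 8.4, 10.0],
    "temperature_2m_max": [18.2, 17.5, 19.3],
}

# Same join/zip formatting as inside get_weather.
forecast = "\n".join(
    f"{date}: low {mn}°C, high {mx}°C"
    for date, mn, mx in zip(
        daily["time"],
        daily["temperature_2m_min"],
        daily["temperature_2m_max"],
    )
)
print(forecast)
# 2025-05-01: low 9.1°C, high 18.2°C
# 2025-05-02: low 8.4°C, high 17.5°C
# 2025-05-03: low 10.0°C, high 19.3°C
```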
import asyncio
from google import genai
from google.genai import types
from fastmcp import Client as MCPClient
from fastmcp.client.transports import FastMCPTransport
We import the core pieces of the MCP–Gemini integration: asyncio to run asynchronous code, google.genai and its types module to call Gemini and define function declarations, and FastMCP’s Client (aliased as MCPClient) together with FastMCPTransport to connect to our in-process weather server.
client = genai.Client(api_key=os.getenv("GEMINI_API_KEY"))
MODEL = "gemini-2.0-flash"
transport = FastMCPTransport(mcp_server)
We initialize the Google Gemini client using the GEMINI_API_KEY from your environment, select the gemini-2.0-flash model for function calling, and set up a FastMCPTransport that connects the in-process mcp_server to the MCP client.
function_declarations = [
    {
        "name": "get_weather",
        "description": "Return a 3-day min/max temperature forecast for given coordinates.",
        "parameters": {
            "type": "object",
            "properties": {
                "latitude": {
                    "type": "number",
                    "description": "Latitude of target location."
                },
                "longitude": {
                    "type": "number",
                    "description": "Longitude of target location."
                }
            },
            "required": ["latitude", "longitude"]
        }
    },
    {
        "name": "get_alerts",
        "description": "Return any active weather alerts for a given U.S. state.",
        "parameters": {
            "type": "object",
            "properties": {
                "state": {
                    "type": "string",
                    "description": "Two-letter U.S. state code, e.g. 'CA'."
                }
            },
            "required": ["state"]
        }
    }
]

tool_defs = types.Tool(function_declarations=function_declarations)
We manually define the JSON schema for our two MCP tools: get_weather, which accepts latitude and longitude as numbers, and get_alerts, which accepts a two-letter U.S. state code as a string; each declaration includes a name, description, property types, and required fields. We then wrap these declarations in a types.Tool object (tool_defs), which tells Gemini how to generate and validate the corresponding function calls.
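To make the schema concrete, here is a small illustrative checker that applies the “required” and type rules from the get_weather declaration to candidate argument dicts. Note that validate_args is our own hypothetical helper for demonstration, not part of google-genai or MCP, which perform their own validation:

```python
def validate_args(params_schema: dict, args: dict) -> list:
    """Toy validator for the 'object' schemas above: checks required
    keys and the 'number'/'string' types used in our declarations."""
    type_map = {"number": (int, float), "string": str}
    errors = []
    for name in params_schema.get("required", []):
        if name not in args:
            errors.append(f"missing required parameter: {name}")
    for name, value in args.items():
        spec = params_schema["properties"].get(name)
        if spec and not isinstance(value, type_map[spec["type"]]):
            errors.append(f"{name}: expected {spec['type']}")
    return errors

# The "parameters" object from the get_weather declaration above.
weather_params = {
    "type": "object",
    "properties": {
        "latitude": {"type": "number"},
        "longitude": {"type": "number"},
    },
    "required": ["latitude", "longitude"],
}

print(validate_args(weather_params, {"latitude": 37.7749, "longitude": -122.4194}))  # []
print(validate_args(weather_params, {"latitude": 37.7749}))
# ['missing required parameter: longitude']
```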
async def run_gemini(lat: float, lon: float):
    async with MCPClient(transport) as mcp_client:
        prompt = f"Give me a 3-day weather forecast for latitude={lat}, longitude={lon}."
        response = client.models.generate_content(
            model=MODEL,
            contents=[prompt],
            config=types.GenerateContentConfig(
                temperature=0,
                tools=[tool_defs]
            )
        )
        call = response.candidates[0].content.parts[0].function_call
        if not call:
            print("No function call; Gemini said:", response.text)
            return
        print("🔧 Gemini wants:", call.name, call.args)
        result = await mcp_client.call_tool(call.name, call.args)
        print("\n📋 Tool result:\n", result)

asyncio.get_event_loop().run_until_complete(run_gemini(37.7749, -122.4194))
Finally, the asynchronous run_gemini function opens an MCP client session over our in-process transport, sends a natural-language prompt to Gemini asking for a 3-day forecast at the given coordinates, captures the resulting function call (if any), executes it through the MCP client, and prints the structured weather data; run_until_complete then drives the coroutine on the notebook’s patched event loop.
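The dispatch step at the heart of run_gemini, mapping Gemini’s function_call back onto a local tool, can be sketched without an API key by stubbing the call object. Here dispatch, tools, and fake_call are hypothetical stand-ins for illustration, not part of the google-genai or fastmcp APIs:

```python
from types import SimpleNamespace

def dispatch(call, tools: dict):
    """Route a Gemini-style function_call (an object with .name and
    .args) to the matching local tool implementation."""
    return tools[call.name](**call.args)

# Plain-function stand-in for the MCP tool defined earlier.
tools = {
    "get_alerts": lambda state: f"No active weather alerts for {state.upper()}.",
}

# Stub of the function_call object Gemini would return.
fake_call = SimpleNamespace(name="get_alerts", args={"state": "ca"})
print(dispatch(fake_call, tools))  # No active weather alerts for CA.
```

In the real notebook this lookup is performed by mcp_client.call_tool, which forwards the name and arguments over the FastMCP transport instead of calling a local dict of functions.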
In short, we now have a fully self-contained pipeline that shows how to define custom MCP tools in Python, expose them via FastMCP, and integrate them seamlessly with Google’s Gemini 2.0 model using the google-genai client. The key pieces, FastMCP for MCP hosting, FastMCPTransport and MCPClient for transport and invocation, httpx for external API access, and nest_asyncio for Colab compatibility, together enable real-time function calling without external processes or stdio pipes. This pattern simplifies local development and testing of MCP integrations in Colab and provides a template for building more advanced agentic applications that combine LLM inference with specialized domain tools.
Asif Razzaq is the CEO of Marktechpost Media Inc. As a visionary entrepreneur and engineer, Asif is committed to harnessing the potential of artificial intelligence for social good. His most recent endeavor is the launch of Marktechpost, an artificial intelligence media platform offering in-depth coverage of machine learning and deep learning news that is both technically sound and easily understandable by a wide audience. The platform has over 2 million monthly views, illustrating its popularity among readers.
