
Commit bf2c11d

Gemini API for Gemma 4
1 parent 58cf8ca commit bf2c11d

File tree

1 file changed: 32 additions, 32 deletions

docs/agents/models/google-gemma.md

Lines changed: 32 additions & 32 deletions
@@ -9,11 +9,35 @@ wide range of capabilities. ADK supports many Gemma features,
 including [Tool Calling](/tools-custom/)
 and [Structured Output](/agents/llm-agents/#structuring-data-input_schema-output_schema-output_key).
 
-You can use Gemma 4 through one of many self-hosting options on Google Cloud:
+You can use Gemma 4 through the [Gemini API](https://ai.google.dev/gemini-api/docs),
+or with one of many self-hosting options on Google Cloud:
 [Vertex AI](https://console.cloud.google.com/vertex-ai/publishers/google/model-garden/gemma4),
 [Google Kubernetes Engine](https://docs.cloud.google.com/kubernetes-engine/docs/tutorials/serve-gemma-gpu-vllm),
 [Cloud Run](https://docs.cloud.google.com/run/docs/run-gemma-on-cloud-run).
 
+## Gemini API Example
+
+Create an API key in [Google AI Studio](https://aistudio.google.com/app/apikey).
+
+```python
+# Set GEMINI_API_KEY environment variable to your API key
+# export GEMINI_API_KEY="YOUR_API_KEY"
+
+from google.adk.agents import LlmAgent
+from google.adk.models import Gemini
+
+# Simple tool to try
+def get_weather(location: str) -> str:
+    return f"Location: {location}. Weather: sunny, 76 degrees Fahrenheit, 8 mph wind."
+
+root_agent = LlmAgent(
+    model=Gemini(model="gemma-4-31b-it"),
+    name="weather_agent",
+    instruction="You are a helpful assistant that can provide current weather.",
+    tools=[get_weather]
+)
+```
+
 ## vLLM Example
 
 To access Gemma 4 endpoints in these services,
@@ -90,12 +114,13 @@ root_agent = LlmAgent(
 This sample shows how to build a personalized food tour agent using Gemma 4, ADK, and the Google Maps MCP server. The agent takes a user’s dish photo or text description, a location, and an optional budget, then recommends places to eat and organizes them into a walking route.
 
 ### Prerequisites
-- Deploy Gemma 4 using one of the options listed in the [vLLM Example](#vllm-example) section.
-  Set `VLLM_API_BASE_URL` environment variable to the base URL of your deployed model (must end with `/v1`).
+
+- Get an API key in [Google AI Studio](https://aistudio.google.com/app/apikey).
+  Set `GEMINI_API_KEY` environment variable to your Gemini API key.
 - Enable [Google Maps API](https://console.cloud.google.com/maps-api/) on Google Cloud Console.
 - Create a [Google Maps Platform API key](https://console.cloud.google.com/maps-api/credentials).
   Set `MAPS_API_KEY` environment variable to your API key.
-- ADK installed and configured in your Python environment
+- Install ADK and configure it in your Python environment.
 
 ### Project structure
 ```bash
@@ -108,10 +133,9 @@ food_tour_app/
 `agent.py`
 ```python
 import os
-import subprocess
 import dotenv
 from google.adk.agents import LlmAgent
-from google.adk.models.lite_llm import LiteLlm
+from google.adk.models import Gemini
 from google.adk.tools.mcp_tool.mcp_toolset import MCPToolset
 from google.adk.tools.mcp_tool.mcp_session_manager import StreamableHTTPConnectionParams
 
@@ -154,32 +178,8 @@ def get_maps_mcp_toolset():
 
 maps_toolset = get_maps_mcp_toolset()
 
-# Authentication (Example: using gcloud identity token for a Cloud Run deployment)
-# Adapt this based on your endpoint's security
-try:
-    gcloud_token = subprocess.check_output(
-        ["gcloud", "auth", "print-identity-token", "-q"]
-    ).decode().strip()
-    auth_headers = {"Authorization": f"Bearer {gcloud_token}"}
-except Exception as e:
-    print(f"Warning: Could not get gcloud token - {e}.")
-    auth_headers = None  # Or handle error appropriately
-
 root_agent = LlmAgent(
-    model=LiteLlm(
-        model="openai/google/gemma-4-31b-it",
-        api_base=os.getenv("VLLM_API_BASE_URL"),
-        # Pass authentication headers if needed
-        extra_headers=auth_headers,
-        # Alternatively, if endpoint uses an API key:
-        # api_key="YOUR_ENDPOINT_API_KEY",
-        extra_body={
-            "chat_template_kwargs": {
-                "enable_thinking": True  # Enable thinking
-            },
-            "skip_special_tokens": False  # Should be set to False
-        },
-    ),
+    model=Gemini(model="gemma-4-31b-it"),
     name="food_tour_agent",
     instruction=system_instruction,
     tools=[maps_toolset],
@@ -190,7 +190,7 @@ root_agent = LlmAgent(
 
 Set the required environment variables before running the agent.
 ```
 export MAPS_API_KEY="YOUR_GOOGLE_MAPS_API_KEY"
-export VLLM_API_BASE_URL="YOUR_VLLM_API_BASE_URL"
+export GEMINI_API_KEY="YOUR_GEMINI_API_KEY"
 ```
 
 ### Example usage
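The diff view truncates here. As a quick standalone sanity check of the `get_weather` tool this commit introduces, the function can be run as plain Python, with no ADK install or API key required, since it returns canned placeholder data exactly as written in the diff:

```python
# get_weather as added by this commit: a plain function whose
# type hints ADK uses when exposing it as a tool.
def get_weather(location: str) -> str:
    return f"Location: {location}. Weather: sunny, 76 degrees Fahrenheit, 8 mph wind."

# The stub echoes the requested location with fixed weather data.
print(get_weather("Tokyo"))
# → Location: Tokyo. Weather: sunny, 76 degrees Fahrenheit, 8 mph wind.
```

Any real deployment would replace the canned string with a call to a weather service; the fixed response here only exercises the tool-calling path.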
