Commit 58c8318
Java examples to call Gemma via the LangChain4j bridge (#1553)
* doc: A2A support in ADK Java
* doc: fix spelling for the Agent2Agent protocol name
* doc: addressing Kris' comments on the A2A Java documentation
* docs: add examples for Gemma 4 with ADK Java
1 parent 66c1bb1 commit 58c8318

File tree

1 file changed: +176 -68 lines changed

docs/agents/models/google-gemma.md

Lines changed: 176 additions & 68 deletions
@@ -19,30 +19,60 @@ or with one of many self-hosting options on Google Cloud:
 
 Create an API key in [Google AI Studio](https://aistudio.google.com/app/apikey).
 
-```python
-# Set GEMINI_API_KEY environment variable to your API key
-# export GEMINI_API_KEY="YOUR_API_KEY"
-
-from google.adk.agents import LlmAgent
-from google.adk.models import Gemini
-
-# Simple tool to try
-def get_weather(location: str) -> str:
-    return f"Location: {location}. Weather: sunny, 76 degrees Fahrenheit, 8 mph wind."
-
-root_agent = LlmAgent(
-    model=Gemini(model="gemma-4-31b-it"),
-    name="weather_agent",
-    instruction="You are a helpful assistant that can provide current weather.",
-    tools=[get_weather]
-)
-```
+=== "Python"
+    ```python
+    # Set GEMINI_API_KEY environment variable to your API key
+    # export GEMINI_API_KEY="YOUR_API_KEY"
+
+    from google.adk.agents import LlmAgent
+    from google.adk.models import Gemini
+
+    # Simple tool to try
+    def get_weather(location: str) -> str:
+        return f"Location: {location}. Weather: sunny, 76 degrees Fahrenheit, 8 mph wind."
+
+    root_agent = LlmAgent(
+        model=Gemini(model="gemma-4-31b-it"),
+        name="weather_agent",
+        instruction="You are a helpful assistant that can provide current weather.",
+        tools=[get_weather]
+    )
+    ```
+
+=== "Java"
+    ```java
+    // Set GEMINI_API_KEY environment variable to your API key
+    // export GEMINI_API_KEY="YOUR_API_KEY"
+
+    import com.google.adk.agents.LlmAgent;
+    import com.google.adk.tools.Annotations.Schema;
+    import com.google.adk.tools.FunctionTool;
+
+    LlmAgent weatherAgent = LlmAgent.builder()
+        .model("gemma-4-31b-it")
+        .name("weather_agent")
+        .instruction("""
+            You are a helpful assistant that can provide current weather.
+            """)
+        .tools(FunctionTool.create(this, "getWeather"))
+        .build();
+
+    @Schema(name = "getWeather",
+        description = "Retrieve the weather forecast for a given location")
+    public Map<String, String> getWeather(
+        @Schema(name = "location",
+            description = "The location for the weather forecast")
+        String location) {
+      return Map.of("forecast", "Location: " + location
+          + ". Weather: sunny, 76 degrees Fahrenheit, 8 mph wind.");
+    }
+    ```
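Both snippets above assume `GEMINI_API_KEY` is already exported before the agent starts. As a minimal sketch, a small helper can fail fast with a clear message when it is not; `require_api_key` is illustrative and not part of ADK:

```python
import os

def require_api_key(var: str = "GEMINI_API_KEY") -> str:
    """Return the named API key from the environment, or fail with a clear hint."""
    key = os.environ.get(var, "").strip()
    if not key:
        raise RuntimeError(f'{var} is not set; run: export {var}="YOUR_API_KEY"')
    return key
```

Calling this once at startup surfaces a misconfigured environment immediately instead of as an opaque authentication error on the first model call.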
 
 ## vLLM Example
 
 To access Gemma 4 endpoints in these services,
-you can use vLLM models through the [LiteLLM](/agents/models/litellm/) library
-for Python.
+you can use vLLM models through the [LiteLLM](/agents/models/litellm/) library for Python,
+and through [LangChain4j](https://docs.langchain4j.dev/) for Java.
 
 The following example shows how to use a Gemma 4 vLLM endpoint with ADK agents.
 
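In the Python example below, the `openai/` prefix on the model string is LiteLLM's provider-routing convention: the text before the first `/` selects the provider adapter, and the remainder is sent to the endpoint as the model name. A tiny sketch of that split (a hypothetical helper, not LiteLLM API):

```python
def split_litellm_model(model: str) -> tuple[str, str]:
    """Split a LiteLLM model string into (provider, endpoint_model_name)."""
    provider, _, name = model.partition("/")
    return provider, name

# "openai/google/gemma-4-31B-it" -> provider "openai",
# endpoint model name "google/gemma-4-31B-it"
```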
@@ -61,54 +91,131 @@ The following example shows how to use a Gemma 4 vLLM endpoint with ADK agents.
 
 ### Code
 
-```python
-import subprocess
-from google.adk.agents import LlmAgent
-from google.adk.models.lite_llm import LiteLlm
-
-# --- Example Agent using a model hosted on a vLLM endpoint ---
-
-# Endpoint URL provided by your model deployment
-api_base_url = "https://your-vllm-endpoint.run.app/v1"
-
-# Model name as recognized by *your* vLLM endpoint configuration
-model_name_at_endpoint = "openai/google/gemma-4-31B-it"
-
-# Simple tool to try
-def get_weather(location: str) -> str:
-    return f"Location: {location}. Weather: sunny, 76 degrees Fahrenheit, 8 mph wind."
-
-# Authentication (Example: using gcloud identity token for a Cloud Run deployment)
-# Adapt this based on your endpoint's security
-try:
-    gcloud_token = subprocess.check_output(
-        ["gcloud", "auth", "print-identity-token", "-q"]
-    ).decode().strip()
-    auth_headers = {"Authorization": f"Bearer {gcloud_token}"}
-except Exception as e:
-    print(f"Warning: Could not get gcloud token - {e}.")
-    auth_headers = None  # Or handle error appropriately
-
-root_agent = LlmAgent(
-    model=LiteLlm(
-        model=model_name_at_endpoint,
-        api_base=api_base_url,
-        # Pass authentication headers if needed
-        extra_headers=auth_headers,
-        # Alternatively, if endpoint uses an API key:
-        # api_key="YOUR_ENDPOINT_API_KEY",
-        extra_body={
-            "chat_template_kwargs": {
-                "enable_thinking": True  # Enable thinking
-            },
-            "skip_special_tokens": False  # Should be set to False
-        },
-    ),
-    name="weather_agent",
-    instruction="You are a helpful assistant that can provide current weather.",
-    tools=[get_weather]  # Tools!
-)
-```
+=== "Python"
+    ```python
+    import subprocess
+    from google.adk.agents import LlmAgent
+    from google.adk.models.lite_llm import LiteLlm
+
+    # --- Example Agent using a model hosted on a vLLM endpoint ---
+
+    # Endpoint URL provided by your model deployment
+    api_base_url = "https://your-vllm-endpoint.run.app/v1"
+
+    # Model name as recognized by *your* vLLM endpoint configuration
+    model_name_at_endpoint = "openai/google/gemma-4-31B-it"
+
+    # Simple tool to try
+    def get_weather(location: str) -> str:
+        return f"Location: {location}. Weather: sunny, 76 degrees Fahrenheit, 8 mph wind."
+
+    # Authentication (Example: using gcloud identity token for a Cloud Run deployment)
+    # Adapt this based on your endpoint's security
+    try:
+        gcloud_token = subprocess.check_output(
+            ["gcloud", "auth", "print-identity-token", "-q"]
+        ).decode().strip()
+        auth_headers = {"Authorization": f"Bearer {gcloud_token}"}
+    except Exception as e:
+        print(f"Warning: Could not get gcloud token - {e}.")
+        auth_headers = None  # Or handle error appropriately
+
+    root_agent = LlmAgent(
+        model=LiteLlm(
+            model=model_name_at_endpoint,
+            api_base=api_base_url,
+            # Pass authentication headers if needed
+            extra_headers=auth_headers,
+            # Alternatively, if endpoint uses an API key:
+            # api_key="YOUR_ENDPOINT_API_KEY",
+            extra_body={
+                "chat_template_kwargs": {
+                    "enable_thinking": True  # Enable thinking
+                },
+                "skip_special_tokens": False  # Should be set to False
+            },
+        ),
+        name="weather_agent",
+        instruction="You are a helpful assistant that can provide current weather.",
+        tools=[get_weather]  # Tools!
+    )
+    ```
+
+=== "Java"
+    To use Gemma hosted on vLLM, you must use an OpenAI-compatible library.
+    LangChain4j offers an OpenAI dependency that you can add to your `pom.xml`:
+    ```xml
+    <!-- LangChain4j to ADK bridge -->
+    <dependency>
+        <groupId>com.google.adk</groupId>
+        <artifactId>google-adk-langchain4j</artifactId>
+        <version>${adk.version}</version>
+    </dependency>
+    <!-- Core LangChain4j library -->
+    <dependency>
+        <groupId>dev.langchain4j</groupId>
+        <artifactId>langchain4j-core</artifactId>
+        <version>${langchain4j.version}</version>
+    </dependency>
+    <!-- OpenAI compatible model -->
+    <dependency>
+        <groupId>dev.langchain4j</groupId>
+        <artifactId>langchain4j-open-ai</artifactId>
+        <version>${langchain4j.version}</version>
+    </dependency>
+    ```
+
+    Create an OpenAI-compatible chat model (streaming or non-streaming),
+    wrap it with the `LangChain4j` wrapper,
+    then pass it to the `LlmAgent`:
+    ```java
+    import com.google.adk.agents.LlmAgent;
+    import com.google.adk.tools.Annotations.Schema;
+    import com.google.adk.tools.FunctionTool;
+    import dev.langchain4j.model.chat.StreamingChatModel;
+    import dev.langchain4j.model.openai.OpenAiStreamingChatModel;
+
+    // Endpoint URL provided by your model deployment
+    String apiBaseUrl = "https://your-vllm-endpoint.run.app/v1";
+
+    // Model name as recognized by *your* vLLM endpoint configuration
+    String gemmaModelName = "gg-hf-gg/gemma-4-31b-it";
+
+    // First, define an OpenAI compatible chat model with LangChain4j
+    StreamingChatModel model =
+        OpenAiStreamingChatModel.builder()
+            .modelName(gemmaModelName)
+            // If your endpoint requires an API key
+            // .apiKey("YOUR_ENDPOINT_API_KEY")
+            .baseUrl(apiBaseUrl)
+            .customParameters(
+                Map.of(
+                    "skip_special_tokens", false,
+                    "chat_template_kwargs", Map.of("enable_thinking", true)
+                )
+            )
+            .build();
+
+    // Configure the agent with the LangChain4j wrapper model
+    LlmAgent weatherAgent = LlmAgent.builder()
+        .model(new LangChain4j(model))
+        .name("weather_agent")
+        .instruction("""
+            You are a helpful assistant that can provide the current weather.
+            """)
+        .tools(FunctionTool.create(this, "getWeather"))
+        .build();
+
+    @Schema(name = "getWeather",
+        description = "Retrieve the weather forecast for a given location")
+    public Map<String, String> getWeather(
+        @Schema(name = "location",
+            description = "The location for the weather forecast")
+        String location) {
+      return Map.of("forecast", "Location: " + location
+          + ". Weather: sunny, 76 degrees Fahrenheit, 8 mph wind.");
+    }
+    ```
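The `extra_body` entries in the Python example and the `customParameters` map in the Java example are forwarded verbatim to the vLLM server alongside the standard OpenAI-style chat fields. As a rough sketch of the resulting request body, the payload can be assembled by hand; `build_chat_request` is illustrative and not part of ADK, LiteLLM, or LangChain4j:

```python
def build_chat_request(model: str, prompt: str, enable_thinking: bool = True) -> dict:
    """Assemble an OpenAI-style chat payload with the vLLM extras used above."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        # vLLM-specific extras, mirroring extra_body / customParameters
        "chat_template_kwargs": {"enable_thinking": enable_thinking},
        "skip_special_tokens": False,
    }
```

Seeing the flattened payload makes it clear why `skip_special_tokens` must be `False`: the thinking markers Gemma emits are special tokens, and stripping them would break the parsing of the model's reasoning blocks.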
 
 ## Build a food tour agent with Gemma 4, ADK, and Google Maps MCP
 This sample shows how to build a personalized food tour agent using Gemma 4, ADK, and the Google Maps MCP server. The agent takes a user’s dish photo or text description, a location, and an optional budget, then recommends places to eat and organizes them into a walking route.
@@ -120,7 +227,8 @@ This sample shows how to build a personalized food tour agent using Gemma 4, ADK
 - Enable [Google Maps API](https://console.cloud.google.com/maps-api/) on Google Cloud Console.
 - Create a [Google Maps Platform API key](https://console.cloud.google.com/maps-api/credentials).
   Set `MAPS_API_KEY` environment variable to your API key.
-- Install ADK and configure it in your Python environment.
+- Install ADK and configure it in your Python environment
+  or configure the Java dependencies in your Java project.
 
 ### Project structure
 ```bash
