# Dapr Conversation API (Java HTTP)

In this quickstart, you'll send an input to a mock Large Language Model (LLM) using Dapr's Conversation API. This API provides a single, consistent entry point for talking to underlying LLM providers.

See the [Conversation API overview](https://docs.dapr.io/developing-applications/building-blocks/conversation/conversation-overview/) for more information about Dapr and the Conversation API.

> **Note:** This example uses native HTTP client requests only. If you are looking for the example that uses the Dapr Client SDK (recommended), [click here](../sdk/).

This quickstart includes one app:

- `ConversationApplication.java`, responsible for sending an input to the underlying LLM and retrieving an output.

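The request the app makes can be sketched with the JDK's built-in HTTP client. The endpoint path and body shape below follow the alpha2 Conversation API as documented by Dapr, and `echo` is the default component this quickstart uses; treat the exact field names as an assumption rather than a transcript of `ConversationApplication.java`.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ConversationSketch {

    // Body shape for the alpha2 Conversation API (field names assumed from
    // the Dapr docs): one input holding a single user message.
    static String buildBody(String userText) {
        return "{\"inputs\":[{\"messages\":[{\"ofUser\":"
                + "{\"content\":[{\"text\":\"" + userText + "\"}]}}]}]}";
    }

    public static void main(String[] args) throws Exception {
        String body = buildBody("What is dapr?");
        System.out.println("Request body: " + body);

        // With the sidecar running (default HTTP port 3500), the request
        // would go to the alpha2 converse endpoint for the echo component.
        // Uncomment to send it for real:
        // HttpResponse<String> response = HttpClient.newHttpClient().send(
        //         HttpRequest.newBuilder(URI.create(
        //                 "http://localhost:3500/v1.0-alpha2/conversation/echo/converse"))
        //             .header("Content-Type", "application/json")
        //             .POST(HttpRequest.BodyPublishers.ofString(body))
        //             .build(),
        //         HttpResponse.BodyHandlers.ofString());
        // System.out.println("Output response: " + response.body());
    }
}
```

With the echo component, the sidecar simply returns the same text, which is why the sample output below shows the input repeated back.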
## Features Demonstrated

This quickstart demonstrates:

1. **Basic Conversation** - Send a simple message to an LLM and receive a response using the alpha2 API
2. **Tool Calling** - Define tools/functions that the LLM can invoke, following OpenAI's function calling format

### Tool Calling

The Conversation API supports tool calling, which lets LLMs interact with external functions and APIs. This enables you to build AI applications that can:

- Execute custom functions based on user requests
- Integrate with external services and databases
- Provide dynamic, context-aware responses

Tool calling follows [OpenAI's function calling format](https://platform.openai.com/docs/guides/function-calling), making it easy to integrate with existing AI development workflows.

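In that format, each tool is a function with a name, a description, and JSON Schema parameters. The sketch below builds such a definition for the `get_weather` tool that this quickstart's sample output refers to; the exact wire format is an assumption based on OpenAI's documented schema, not code copied from this app.

```java
public class ToolDefinitionSketch {

    // A function tool in OpenAI's function-calling format: the LLM sees the
    // name, description, and JSON Schema parameters, and decides when to
    // request a call to it. Field names are assumed from OpenAI's docs.
    static String getWeatherTool() {
        return """
            {
              "type": "function",
              "function": {
                "name": "get_weather",
                "description": "Get the current weather for a location",
                "parameters": {
                  "type": "object",
                  "properties": {
                    "location": { "type": "string", "description": "City name, e.g. San Francisco" },
                    "unit": { "type": "string", "enum": ["celsius", "fahrenheit"] }
                  },
                  "required": ["location"]
                }
              }
            }""";
    }

    public static void main(String[] args) {
        System.out.println(getWeatherTool());
    }
}
```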
## Pre-requisites

* [Dapr and the Dapr CLI](https://docs.dapr.io/getting-started/install-dapr-cli/).
* Java JDK 17 (or greater):
  * [Microsoft JDK 17](https://learn.microsoft.com/en-us/java/openjdk/download#openjdk-17)
  * [Oracle JDK 17](https://www.oracle.com/java/technologies/downloads/?er=221886#java17)
  * [OpenJDK 17](https://jdk.java.net/17/)
* [Apache Maven](https://maven.apache.org/install.html) version 3.x.

## Run the app with the template file

Run the application using the [multi-app run template files](https://docs.dapr.io/developing-applications/local-development/multi-app-dapr-run/multi-app-overview/) and the Dapr CLI with `dapr run -f .`.

This example uses the default LLM component provided by Dapr, which simply echoes the provided input, for testing purposes. See the other [supported conversation components](https://docs.dapr.io/reference/components-reference/supported-conversation/).

### Build and run the Java application

1. Navigate to the `conversation` directory and build the Java application:

<!-- STEP
name: Build Java file
-->

```bash
cd ./conversation
mvn clean install
cd ..
```

<!-- END_STEP -->

2. Run the Java service app with Dapr using the multi app run template:

<!-- STEP
name: Run multi app run template
expected_stdout_lines:
  - '== APP - conversation == === Basic Conversation Example ==='
  - '== APP - conversation == Input sent: What is dapr?'
  - '== APP - conversation == Output response: What is dapr?'
  - '== APP - conversation =='
  - '== APP - conversation == === Tool Calling Example ==='
  - '== APP - conversation == Input sent: What is the weather like in San Francisco?'
  - '== APP - conversation == Tools defined: get_weather (location, unit)'
  - '== APP - conversation == LLM requested tool calls:'
  - '== APP - conversation == Tool ID: 0'
  - '== APP - conversation == Function: get_weather'
  - '== APP - conversation == Arguments: location,unit'
  - '== APP - conversation == Tool Result: {"temperature": 65, "unit": "fahrenheit", "description": "Sunny"}'
  - '== APP - conversation == '
  - '== APP - conversation == Note: The echo component echoes input for testing purposes.'
  - '== APP - conversation == For actual tool calling, configure a real LLM component like OpenAI.'
expected_stderr_lines:
output_match_mode: substring
match_order: none
background: true
sleep: 15
timeout_seconds: 30
-->

```bash
dapr run -f .
```

The terminal console output should look similar to this:

```text
== APP - conversation == === Basic Conversation Example ===
== APP - conversation == Input sent: What is dapr?
== APP - conversation == Output response: What is dapr?
== APP - conversation ==
== APP - conversation == === Tool Calling Example ===
== APP - conversation == Input sent: What is the weather like in San Francisco?
== APP - conversation == Tools defined: get_weather (location, unit)
== APP - conversation == LLM requested tool calls:
== APP - conversation == Tool ID: 0
== APP - conversation == Function: get_weather
== APP - conversation == Arguments: location,unit
== APP - conversation == Tool Result: {"temperature": 65, "unit": "fahrenheit", "description": "Sunny"}
== APP - conversation ==
== APP - conversation == Note: The echo component echoes input for testing purposes.
== APP - conversation == For actual tool calling, configure a real LLM component like OpenAI.
```

<!-- END_STEP -->

3. Stop and clean up application processes.

<!-- STEP
name: Stop multi-app run
sleep: 5
-->

```bash
dapr stop -f .
```

<!-- END_STEP -->

## Run the app with the Dapr CLI

1. Navigate to the `conversation` directory and build the Java application:

```bash
cd ./conversation
mvn clean install
```

2. Run the application with Dapr:

```bash
dapr run --app-id conversation --resources-path ../../../components -- java -jar target/ConversationService-0.0.1-SNAPSHOT.jar
```

You should see output similar to the following:

```text
== APP - conversation == === Basic Conversation Example ===
== APP - conversation == Input sent: What is dapr?
== APP - conversation == Output response: What is dapr?
== APP - conversation ==
== APP - conversation == === Tool Calling Example ===
== APP - conversation == Input sent: What is the weather like in San Francisco?
== APP - conversation == Tools defined: get_weather (location, unit)
== APP - conversation == LLM requested tool calls:
== APP - conversation == Tool ID: 0
== APP - conversation == Function: get_weather
== APP - conversation == Arguments: location,unit
== APP - conversation == Tool Result: {"temperature": 65, "unit": "fahrenheit", "description": "Sunny"}
== APP - conversation ==
== APP - conversation == Note: The echo component echoes input for testing purposes.
== APP - conversation == For actual tool calling, configure a real LLM component like OpenAI.
```

3. Stop the application:

```bash
dapr stop --app-id conversation
```

## Tool Calling with a Real LLM

To use actual tool calling functionality, configure a real LLM component (e.g., OpenAI, Anthropic). Here's an example OpenAI component configuration:

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: openai
spec:
  type: conversation.openai
  version: v1
  metadata:
  - name: key
    value: "<your-openai-api-key>"
  - name: model
    value: "gpt-4"
```

Then update `CONVERSATION_COMPONENT_NAME` in the application to use your configured component.

When using a real LLM with tool calling:

1. The LLM analyzes the user request and available tools
2. If a tool is needed, the response includes `finishReason: "tool_calls"` with tool call details
3. Your application executes the tool and gets results
4. Send the results back to the LLM using an `ofTool` message to continue the conversation

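Step 4 can be sketched as building an `ofTool` message that carries the tool's result back in the next request. The `ofTool` field comes from the alpha2 message types; the `toolId`/`name`/`content` field names and the sample values are assumptions for illustration, not copied from this app.

```java
public class ToolResultSketch {

    // After executing the tool locally, send its result back as an ofTool
    // message in the next converse request so the LLM can finish answering.
    // Field names follow the alpha2 API as documented; treat them as assumed.
    static String toolResultMessage(String toolCallId, String name, String result) {
        return "{\"ofTool\":{\"toolId\":\"" + toolCallId + "\","
                + "\"name\":\"" + name + "\","
                + "\"content\":[{\"text\":\"" + result.replace("\"", "\\\"") + "\"}]}}";
    }

    public static void main(String[] args) {
        // Illustrative values matching this quickstart's sample output.
        System.out.println(toolResultMessage("0", "get_weather",
                "{\"temperature\": 65, \"unit\": \"fahrenheit\"}"));
    }
}
```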
For more details, see the [Conversation API reference](https://docs.dapr.io/reference/api/conversation_api/).