[AI] Add configurable model generation for AI On-Device #8043
Conversation
Introduced `GenerationConfig`, `ModelConfig`, `ModelReleaseStage`, and `ModelPreference` to `firebase-ai-ondevice-interop` to allow for configurable model selection. Updated the internal `genaiPrompt` dependency to `1.0.0-beta2` to support the new configuration options. Deprecated the parameter-less `FirebaseAIOnDeviceGenerativeModelFactory.newGenerativeModel()` method in favor of a new overload that accepts a `GenerationConfig`.
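To make the described surface concrete, here is a hypothetical, self-contained sketch. Only the type names (`GenerationConfig`, `ModelConfig`, `ModelReleaseStage`, `ModelPreference`) and the factory method name come from this PR; every enum value, property, and implementation body below is an assumption, not the actual `firebase-ai-ondevice-interop` API.

```kotlin
// Hypothetical sketch of the new interop surface. Enum values, properties,
// and bodies are stand-ins; only the type names come from the PR.
enum class ModelReleaseStage { STABLE, BETA }
enum class ModelPreference { PREFER_LATEST, PREFER_STABLE }

data class ModelConfig(
    val releaseStage: ModelReleaseStage,
    val preference: ModelPreference,
)

data class GenerationConfig(val modelConfig: ModelConfig)

// Stand-in for the real factory: the parameter-less method is deprecated
// in favor of the overload accepting a GenerationConfig, as described above.
class FakeModelFactory {
    @Deprecated("Use the overload that accepts a GenerationConfig.")
    fun newGenerativeModel(): String =
        newGenerativeModel(
            GenerationConfig(
                ModelConfig(ModelReleaseStage.STABLE, ModelPreference.PREFER_STABLE)
            )
        )

    fun newGenerativeModel(generationConfig: GenerationConfig): String =
        "model(${generationConfig.modelConfig.releaseStage})"
}

fun main() {
    val factory = FakeModelFactory()
    val config = GenerationConfig(
        ModelConfig(ModelReleaseStage.BETA, ModelPreference.PREFER_LATEST)
    )
    println(factory.newGenerativeModel(config)) // model(BETA)
}
```

Callers that previously used the zero-argument method keep working (with a deprecation warning) while new callers can pass an explicit configuration.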
📝 PRs merging into main branchOur main branch should always be in a releasable state. If you are working on a larger change, or if you don't want this change to see the light of day just yet, consider using a feature branch first, and only merge into the main branch when the code is complete and ready to be released.
/gemini review
Code Review
This pull request introduces configuration options for the on-device generative model by adding GenerationConfig and ModelConfig classes to the interop layer. It updates the FirebaseAIOnDeviceGenerativeModelFactory to support these configurations, deprecating the parameterless model creation method. Additionally, it includes converter functions to bridge interop and ML Kit types and updates the genaiPrompt dependency version. Feedback was provided regarding incorrect ReplaceWith expressions in deprecation annotations and a minor inconsistency in KDoc documentation.
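On the ReplaceWith feedback: Kotlin's `ReplaceWith` template must be an expression that resolves at the caller's site, which is easy to get wrong when the deprecated method takes no parameters and the replacement needs an argument. A minimal self-contained illustration, with all names being stand-ins rather than the actual interop API:

```kotlin
// Illustration of a ReplaceWith template that is valid at call sites.
// All names here are hypothetical, not the firebase-ai-ondevice-interop API.
class GenerationConfig {
    companion object {
        // Hypothetical default config used by the replacement expression.
        fun default(): GenerationConfig = GenerationConfig()
    }
}

class ModelFactory {
    @Deprecated(
        message = "Use the overload that accepts a GenerationConfig.",
        // The template must reference symbols resolvable where the deprecated
        // method is called; since the old overload had no parameters, the
        // replacement has to supply a config itself.
        replaceWith = ReplaceWith("newGenerativeModel(GenerationConfig.default())"),
    )
    fun newGenerativeModel(): String = newGenerativeModel(GenerationConfig.default())

    fun newGenerativeModel(config: GenerationConfig): String = "generative-model"
}

fun main() {
    println(ModelFactory().newGenerativeModel(GenerationConfig.default())) // generative-model
}
```

With a resolvable template, the IDE's quick-fix for the deprecation warning rewrites call sites mechanically instead of producing code that fails to compile.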
…le/firebase/ai/ondevice/interop/FirebaseAIOnDeviceGenerativeModelFactory.kt Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
…ase/ai/ondevice/FirebaseAIOnDeviceComponent.kt Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
…le/firebase/ai/ondevice/interop/FirebaseAIOnDeviceGenerativeModelFactory.kt Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>