
Fix #22162: Add input validation for PReLU and LeakyReLU layers #22344

Open

thakoreh wants to merge 1 commit into keras-team:master from thakoreh:fix/prelu-leakyrelu-validation

Conversation


@thakoreh thakoreh commented Mar 4, 2026

Summary

Fixes #22162

Adds input validation for PReLU and LeakyReLU layers to catch invalid configurations at instantiation time rather than during training.

Changes

PReLU

  • Validates alpha_initializer is not None
  • Raises ValueError with clear message if None is passed

LeakyReLU

  • Validates negative_slope is not NaN
  • Adds check for math.isnan(negative_slope) for float values
  • Raises ValueError with clear message if NaN is passed
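The two checks described above can be sketched as follows (simplified stand-in classes, not the actual Keras source; the error-message wording is an approximation of the PR's messages):

```python
import math

# Simplified stand-ins showing only the constructor-time validation this
# PR adds; the real layers carry many more arguments and behaviors.

class PReLU:
    def __init__(self, alpha_initializer="zeros", **kwargs):
        if alpha_initializer is None:
            raise ValueError(
                "The `alpha_initializer` argument for `PReLU` cannot be "
                "None. Expected an initializer identifier or instance."
            )
        self.alpha_initializer = alpha_initializer

class LeakyReLU:
    def __init__(self, negative_slope=0.3, **kwargs):
        # Only float values can be NaN, so guard the isnan call with a
        # type check to avoid TypeError on non-numeric inputs.
        if isinstance(negative_slope, float) and math.isnan(negative_slope):
            raise ValueError(
                "The `negative_slope` argument for `LeakyReLU` cannot be "
                f"NaN. Received: negative_slope={negative_slope}"
            )
        self.negative_slope = negative_slope
```

Both errors surface at instantiation, so the stack trace points directly at the bad argument.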

Rationale

Both cases violate the fail-fast principle. Without this validation:

  • PReLU with alpha_initializer=None would fail later during build() or training
  • LeakyReLU with negative_slope=NaN would produce non-numeric gradients
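A tiny pure-Python demonstration of the second point (the `leaky_relu` helper below is illustrative, not Keras code): NaN propagates through every arithmetic operation, so each negative activation, and any gradient computed from it, silently becomes non-numeric.

```python
import math

def leaky_relu(x, negative_slope):
    # Reference formula: f(x) = x for x >= 0, negative_slope * x otherwise.
    return x if x >= 0 else negative_slope * x

assert leaky_relu(-2.0, 0.3) == -0.6               # normal behavior
assert math.isnan(leaky_relu(-2.0, float("nan")))  # poisoned output
```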

Tests Added

  • test_prelu_invalid_alpha_initializer_none - verifies PReLU rejects None
  • test_invalid_nan_negative_slope - verifies LeakyReLU rejects NaN
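The new tests might look roughly like this (a `unittest` sketch with stand-in classes; the real tests import the layers from `keras.layers` and use Keras's own test utilities):

```python
import math
import unittest

# Hypothetical stand-ins mirroring the validation under test; the actual
# test files exercise the real PReLU and LeakyReLU layers instead.
class PReLU:
    def __init__(self, alpha_initializer="zeros"):
        if alpha_initializer is None:
            raise ValueError("`alpha_initializer` cannot be None.")

class LeakyReLU:
    def __init__(self, negative_slope=0.3):
        if isinstance(negative_slope, float) and math.isnan(negative_slope):
            raise ValueError("`negative_slope` cannot be NaN.")

class TestActivationValidation(unittest.TestCase):
    def test_prelu_invalid_alpha_initializer_none(self):
        with self.assertRaises(ValueError):
            PReLU(alpha_initializer=None)

    def test_invalid_nan_negative_slope(self):
        with self.assertRaises(ValueError):
            LeakyReLU(negative_slope=float("nan"))
```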

Example

Before:

layer = keras.layers.PReLU(alpha_initializer=None)  # No error raised here
layer.build((None, 10))  # Crashes later, far from the bad argument

After:

layer = keras.layers.PReLU(alpha_initializer=None)
# ValueError: The `alpha_initializer` argument for PReLU cannot be None...

…yers

- PReLU: Validate alpha_initializer is not None
- LeakyReLU: Validate negative_slope is not NaN

Both cases now raise clear ValueError at instantiation time instead of
failing silently during training, following fail-fast principle.

Tests added:
- test_prelu_invalid_alpha_initializer_none
- test_invalid_nan_negative_slope

google-cla bot commented Mar 4, 2026

Thanks for your pull request! It looks like this may be your first contribution to a Google open source project. Before we can look at your pull request, you'll need to sign a Contributor License Agreement (CLA).

View this failed invocation of the CLA check for more information.

For the most up to date status, view the checks section at the bottom of the pull request.


Summary of Changes

Hello, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request enhances the robustness of Keras's PReLU and LeakyReLU activation layers by introducing crucial input validation. It ensures that common misconfigurations, such as a None initializer for PReLU or a NaN negative slope for LeakyReLU, are detected immediately upon layer instantiation. This proactive error checking prevents runtime issues and improves the overall stability and user experience of these layers.

Highlights

  • PReLU Layer Validation: Added input validation for the PReLU layer to ensure that alpha_initializer is not None, preventing silent failures that would occur later during the build() method or training.
  • LeakyReLU Layer Validation: Implemented input validation for the LeakyReLU layer to check if negative_slope is NaN, which previously could lead to non-numeric gradients during training.
  • Fail-Fast Principle: The changes adhere to the fail-fast principle by catching invalid layer configurations at instantiation time, providing clearer error messages and improving debugging.
  • New Test Cases: Added dedicated test cases (test_prelu_invalid_alpha_initializer_none and test_invalid_nan_negative_slope) to verify the new validation logic for both PReLU and LeakyReLU layers.


Changelog
  • keras/src/layers/activations/leaky_relu.py
    • Imported the math module to facilitate NaN checks.
    • Added a conditional check to raise a ValueError if negative_slope is a float and math.isnan returns true.
  • keras/src/layers/activations/leaky_relu_test.py
    • Introduced test_invalid_nan_negative_slope to assert that LeakyReLU raises a ValueError when initialized with negative_slope=float('nan').
  • keras/src/layers/activations/prelu.py
    • Added a conditional check to raise a ValueError if alpha_initializer is None.
  • keras/src/layers/activations/prelu_test.py
    • Introduced test_prelu_invalid_alpha_initializer_none to assert that PReLU raises a ValueError when initialized with alpha_initializer=None.
Activity
  • No specific activity (comments, reviews, or progress updates) has been recorded for this pull request yet.


@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request introduces valuable input validation for the PReLU and LeakyReLU layers, ensuring that invalid configurations for alpha_initializer and negative_slope are caught at instantiation time with clear error messages. This aligns well with the fail-fast principle and improves the developer experience. The changes are correct and include corresponding unit tests. I have one minor suggestion to improve code consistency.

Inline review comment on keras/src/layers/activations/leaky_relu.py, anchored at this diff context inside `__init__`:

    "Argument `alpha` is deprecated. Use `negative_slope` instead."
)
super().__init__(**kwargs)
import math


Severity: medium

For consistency with other files in the Keras codebase (e.g., keras/src/layers/layer.py, keras/src/regularizers/regularizers.py) and to adhere to general Python style guidelines (PEP 8), standard library imports like math should be placed at the top of the file. Please move this import to the top-level imports section.
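The suggested fix amounts to hoisting the import to module level (sketch with a simplified stand-in class, not the actual `leaky_relu.py` source):

```python
# Per the review suggestion and PEP 8: standard-library imports belong at
# the top of the module, not inside __init__.
import math  # module-level, alongside the other top-of-file imports

class LeakyReLU:
    def __init__(self, negative_slope=0.3, **kwargs):
        # The validation uses math directly; no local import needed.
        if isinstance(negative_slope, float) and math.isnan(negative_slope):
            raise ValueError("`negative_slope` cannot be NaN.")
        self.negative_slope = negative_slope
```

A module-level import also avoids re-executing the import statement on every layer construction, though for stdlib modules that cost is negligible; consistency is the main motivation.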


codecov-commenter commented Mar 4, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 82.90%. Comparing base (f02d82b) to head (2b03b11).

Additional details and impacted files
@@           Coverage Diff           @@
##           master   #22344   +/-   ##
=======================================
  Coverage   82.90%   82.90%           
=======================================
  Files         594      594           
  Lines       65844    65849    +5     
  Branches    10293    10295    +2     
=======================================
+ Hits        54590    54595    +5     
  Misses       8638     8638           
  Partials     2616     2616           
Flag Coverage Δ
keras 82.73% <100.00%> (+<0.01%) ⬆️
keras-jax 60.95% <100.00%> (+<0.01%) ⬆️
keras-numpy 55.12% <100.00%> (+<0.01%) ⬆️
keras-openvino 49.10% <100.00%> (+<0.01%) ⬆️
keras-tensorflow 62.16% <100.00%> (+<0.01%) ⬆️
keras-torch 61.01% <100.00%> (+<0.01%) ⬆️

Flags with carried forward coverage won't be shown.

☔ View full report in Codecov by Sentry.


Development

Successfully merging this pull request may close these issues.

Missing input validation for PReLU alpha_initializer and LeakyReLU alpha

3 participants