Bug Report
- Summary: The Unsloth-generated `Linear_peft_forward.py` references `VARIANT_KWARG_KEYS` when constructing `variant_kwargs`, but the constant is never imported or defined, so every LoRA Linear forward immediately crashes with `NameError`.
- Environment: Python 3.10, torch ≥ 2.8.0, transformers 4.56.2, latest `unsloth`/`unsloth_zoo` from Git (ROCm workstation, but this is a pure-Python failure).
- GPU: AMD Ryzen AI Max+ 395 w/ Radeon 8060S, ROCm 7.0+
- Steps to Reproduce:
  1. Generate any LoRA-wrapped model with Unsloth (e.g., run `examples/gpt_oss_(20B)_Reinforcement_Learning_2048_Game_BF16.py`).
  2. When the decoder calls a LoRA projection, `unsloth_forward` in `Linear_peft_forward.py` executes.
  3. The function evaluates `variant_kwargs = {k: kwargs.pop(k, None) for k in VARIANT_KWARG_KEYS}` and Python raises `NameError`, halting the run.
- Minimal Reproduction (standalone):

```python
import textwrap, types, torch

LINEAR_SOURCE = textwrap.dedent(
    """
    import torch

    def unsloth_forward(self, x, *args, **kwargs):
        variant_kwargs = {k: kwargs.pop(k, None) for k in VARIANT_KWARG_KEYS}
        return self.base_layer(x, *args, **kwargs)
    """
)

mod = types.ModuleType("broken_linear")
exec(LINEAR_SOURCE, mod.__dict__)

class Dummy:
    disable_adapters = False
    merged = False
    active_adapters = []
    lora_A = lora_B = lora_dropout = {}
    scaling = {}
    lora_variant = {}

    def base_layer(self, x, *args, **kwargs):
        return x

mod.unsloth_forward(Dummy(), torch.zeros(1, 1))
```

Running this script reproduces the `NameError`.
- Expected Result: `variant_kwargs` should be built successfully so the base layer forward completes.
- Actual Result: `NameError` prevents any LoRA projection from running.
- Proposed Fix: Have the generator import `VARIANT_KWARG_KEYS` from `peft.tuners.lora.layer`, fall back to `peft.tuners.lora.bnb`, or default to `["alora_offsets"]` if neither import is available.
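The proposed fix could be sketched as an import-with-fallback prologue emitted into the generated module. This is a minimal illustration, not a patch: the exact module that exports `VARIANT_KWARG_KEYS` (and whether it exists at all) depends on the installed `peft` version, which is exactly why the chain ends in a hard-coded default:

```python
# Sketch of the fallback chain for resolving VARIANT_KWARG_KEYS.
# Which peft submodule (if any) defines the constant varies by peft
# version, so each import is attempted in turn before defaulting.
try:
    from peft.tuners.lora.layer import VARIANT_KWARG_KEYS
except ImportError:
    try:
        from peft.tuners.lora.bnb import VARIANT_KWARG_KEYS
    except ImportError:
        # Last-resort default taken from the proposed fix above.
        VARIANT_KWARG_KEYS = ["alora_offsets"]

# VARIANT_KWARG_KEYS is now always defined, so the generated
# forward's dict comprehension can no longer raise NameError.
variant_kwargs = {k: None for k in VARIANT_KWARG_KEYS}
```

Because every branch defines the name, the generated `unsloth_forward` no longer depends on which `peft` release is installed.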