Self Checks
1. Is this request related to a challenge you're experiencing? Tell us your story.
Currently setup_lora applies LoRA adapters to all trainable parts of the model: attention layers, MLP layers, embeddings, and the output projection. There is no way to restrict LoRA to a subset of these.
This is limiting for a few reasons:
- Adapting the embeddings and the output projection is often unnecessary. Skipping them reduces memory usage and speeds up training without necessarily hurting quality. For instance, speaker adaptation does not require changing what tokens mean, only how the model processes them.
- When tie_word_embeddings=True (the default), the model has no separate output layer, and setup_lora crashes with an AttributeError (see setup_lora() crashes with AttributeError when tie_word_embeddings=True (default) #1195, fix(lora): skip output layer when tie_word_embeddings=True #1210, fix: check for output attribute before adding LoRA layer #1213, fix: AttributeError during LoRA training #1220). A target_modules field that guards access behind membership checks fixes this as a side effect.
2. What is your suggested solution?
Add a target_modules field to LoraConfig with a default that covers all modules, preserving full backwards compatibility:
```python
@dataclass
class LoraConfig:
    r: int
    lora_alpha: float
    lora_dropout: float = 0.0
    # Valid values: "attention", "mlp", "embeddings", "output"
    target_modules: list = field(
        default_factory=lambda: ["attention", "mlp", "embeddings", "output"]
    )
```
setup_lora then checks membership before applying each group. Example config for attention+MLP only:
```yaml
_target_: fish_speech.models.text2semantic.lora.LoraConfig
r: 8
lora_alpha: 16
lora_dropout: 0.01
target_modules: [attention, mlp]
```
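The gating logic could look like the following toy sketch. It is an illustration only: the real setup_lora would wrap layers with LoRA modules rather than return group names, and the model attribute names here (attention, mlp, embeddings, output) are assumptions based on this description, not the actual fish-speech internals.

```python
from dataclasses import dataclass, field


@dataclass
class LoraConfig:
    r: int
    lora_alpha: float
    lora_dropout: float = 0.0
    # Valid values: "attention", "mlp", "embeddings", "output"
    target_modules: list = field(
        default_factory=lambda: ["attention", "mlp", "embeddings", "output"]
    )


def setup_lora(model, config):
    """Toy setup_lora: records which groups receive adapters instead of
    actually wrapping layers, to illustrate the membership check."""
    applied = []
    for group in ("attention", "mlp", "embeddings", "output"):
        # Only touch groups the user opted into.
        if group not in config.target_modules:
            continue
        # Guarding on hasattr also avoids the AttributeError when the
        # output projection is tied to the embeddings and thus absent.
        if not hasattr(model, group):
            continue
        applied.append(group)
    return applied


class TiedModel:
    # No separate `output` attribute: weights tied to the embeddings.
    attention = object()
    mlp = object()
    embeddings = object()


# Default config targets everything; a tied model simply skips `output`.
print(setup_lora(TiedModel(), LoraConfig(r=8, lora_alpha=16)))
# -> ['attention', 'mlp', 'embeddings']

# Attention+MLP only, matching the YAML example above.
print(setup_lora(TiedModel(), LoraConfig(r=8, lora_alpha=16,
                                         target_modules=["attention", "mlp"])))
# -> ['attention', 'mlp']
```

Because the default covers all four groups, existing configs that omit target_modules keep their current behavior.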
3. Additional context or comments
No response
4. Can you help us with this feature?