add target_modules to LoraConfig for selective LoRA fine-tuning #1230

@Enucatl

Description

Self Checks

  • I have thoroughly reviewed the project documentation (installation, training, inference) but couldn't find any relevant information that meets my needs.
  • I have searched for existing issues, including closed ones.
  • I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • [FOR CHINESE USERS] Please submit issues in English; otherwise they will be closed. Thank you! :)
  • Please do not modify this template :) and fill in all the required fields.

1. Is this request related to a challenge you're experiencing? Tell us your story.

Currently setup_lora applies LoRA adapters to all trainable parts of the model: attention layers, MLP layers, embeddings, and the output projection. There is no way to restrict LoRA to a subset of these.

This is limiting: users cannot, for example, restrict LoRA to the attention layers to shrink the adapter, or leave the embeddings and output projection untouched when only the transformer blocks need adapting.

2. What is your suggested solution?

Add a target_modules field to LoraConfig with a default that covers all modules, preserving full backwards compatibility:

  from dataclasses import dataclass, field

  @dataclass
  class LoraConfig:
      r: int
      lora_alpha: float
      lora_dropout: float = 0.0
      # Valid values: "attention", "mlp", "embeddings", "output"
      target_modules: list = field(default_factory=lambda: ["attention", "mlp", "embeddings", "output"])
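Because the default factory supplies every group, omitting target_modules reproduces today's behavior. A quick self-contained check of that claim (restating the proposed dataclass so it runs on its own):

```python
from dataclasses import dataclass, field

@dataclass
class LoraConfig:
    r: int
    lora_alpha: float
    lora_dropout: float = 0.0
    # Valid values: "attention", "mlp", "embeddings", "output"
    target_modules: list = field(default_factory=lambda: ["attention", "mlp", "embeddings", "output"])

# Omitting target_modules keeps current behavior: every group is adapted.
default_cfg = LoraConfig(r=8, lora_alpha=16)
print(default_cfg.target_modules)  # → ['attention', 'mlp', 'embeddings', 'output']

# Passing an explicit subset narrows the adapter to those groups.
attn_only = LoraConfig(r=8, lora_alpha=16, target_modules=["attention"])
print(attn_only.target_modules)  # → ['attention']
```

Using default_factory (rather than a mutable list default) also means each config instance gets its own list, so mutating one config cannot leak into another.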

setup_lora then checks membership before applying each group. Example config for attention+MLP only:

  _target_: fish_speech.models.text2semantic.lora.LoraConfig
  r: 8
  lora_alpha: 16
  lora_dropout: 0.01
  target_modules: [attention, mlp]
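The membership check could be sketched as below (select_lora_groups is a hypothetical helper, not the actual fish-speech API; the real per-group wrapping calls are elided as comments):

```python
# Sketch only: group names mirror the LoraConfig comment above.
VALID_MODULES = ("attention", "mlp", "embeddings", "output")

def select_lora_groups(target_modules):
    """Validate target_modules and return the groups to adapt, in order."""
    unknown = set(target_modules) - set(VALID_MODULES)
    if unknown:
        raise ValueError(
            f"Unknown target_modules: {sorted(unknown)}; "
            f"valid values are {list(VALID_MODULES)}"
        )
    return list(target_modules)

# setup_lora would then gate each existing application step, e.g.:
#   groups = select_lora_groups(config.target_modules)
#   if "attention" in groups: ...wrap attention projections with LoRA...
#   if "mlp" in groups:       ...wrap MLP projections with LoRA...
```

Failing fast on an unknown name (e.g. a typo like attn instead of attention) avoids silently training a run with fewer adapters than intended.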

3. Additional context or comments

No response

4. Can you help us with this feature?

  • I am interested in contributing to this feature.

Metadata


    Labels

    enhancement (New feature or request)
