Batch size in fine-tuning is irrelevant #21

@phquanta

Description

I've noticed that even though a batch size option exists for fine-tuning, it effectively has to be 1 for the code to run. Setting it to anything larger triggers an error because self.max_len = 0, so no padding takes place. I'm also not sure how training would be affected by using max_len (padded sequences) versus not using max_len at all with batch_size = 1. A rough sketch of the kind of padding I mean is below.
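For illustration only, a minimal sketch of what would let batch_size > 1 work: pad each batch to its longest sequence in a collate function instead of relying on a global max_len. The names here (pad_collate, "input_ids", "labels") are hypothetical and not the repo's actual API.

```python
import torch
from torch.nn.utils.rnn import pad_sequence

def pad_collate(batch, pad_token_id=0):
    # batch: list of dicts with "input_ids" and "labels" as 1-D LongTensors
    input_ids = [item["input_ids"] for item in batch]
    labels = [item["labels"] for item in batch]
    return {
        # pad every sequence in the batch to the length of the longest one
        "input_ids": pad_sequence(input_ids, batch_first=True,
                                   padding_value=pad_token_id),
        # -100 is the conventional ignore index for the loss
        "labels": pad_sequence(labels, batch_first=True, padding_value=-100),
    }

# Usage sketch:
# loader = torch.utils.data.DataLoader(dataset, batch_size=8,
#                                      collate_fn=pad_collate)
```

With something like this, max_len would no longer need to be set globally for batch sizes larger than 1.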
