
fix: align attention_mask padding with appended eos token in clvp #45757

Open
CharlieKerfoot wants to merge 1 commit into huggingface:main from CharlieKerfoot:fix/when-add-eos-token-true

Conversation

@CharlieKerfoot

Summary

In _pad_extra_bos_eos_tokens at src/transformers/models/clvp/modeling_clvp.py:144, when add_eos_token=True the eos token is appended to the right of input_ids, but attention_mask is padded on the left. The mask no longer lines up with the tokens: the appended eos position gets the wrong mask value, and a real token at position 0 is masked out.

Fix

Pad attention_mask on the right (0, 1) instead of the left (1, 0) so the new mask entry corresponds to the appended eos token.
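To make the misalignment concrete, here is a minimal sketch using plain Python lists as stand-ins for 1-D tensors (the real code uses torch.nn.functional.pad; the helper name and token values below are illustrative, not the exact ones in modeling_clvp.py):

```python
# Simplified illustration of the bug and the fix. `pad_left=True` mimics
# padding attention_mask with (1, 0); `pad_left=False` mimics (0, 1).

def append_eos(input_ids, attention_mask, eos_token_id, pad_left):
    """Append eos to input_ids and extend attention_mask by one entry."""
    input_ids = input_ids + [eos_token_id]  # eos always goes on the right
    if pad_left:
        # buggy: new mask entry lands on the left, shifting the whole mask
        attention_mask = [1] + attention_mask
    else:
        # fixed: new mask entry lands on the right, over the appended eos
        attention_mask = attention_mask + [1]
    return input_ids, attention_mask

# A left-padded sequence: one pad token (id 0), then two real tokens.
ids = [0, 11, 12]
mask = [0, 1, 1]

# Buggy behaviour: the pad token is now attended and a real token is masked.
buggy_ids, buggy_mask = append_eos(ids, mask, eos_token_id=2, pad_left=True)
print(buggy_ids, buggy_mask)  # [0, 11, 12, 2] [1, 0, 1, 1]

# Fixed behaviour: the mask stays aligned with the tokens.
fixed_ids, fixed_mask = append_eos(ids, mask, eos_token_id=2, pad_left=False)
print(fixed_ids, fixed_mask)  # [0, 11, 12, 2] [0, 1, 1, 1]
```

In the buggy output, position 1 (real token 11) is masked out while the pad token at position 0 is attended; padding on the right keeps every mask entry over the token it was meant for.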

@github-actions
Contributor

github-actions Bot commented May 3, 2026

[For maintainers] Suggested jobs to run (before merge)

run-slow: clvp

