
Error in transformer Clip model? #44

@Mr-Nobody-dey

Description


Solution:
Replace the failing lines with the following:

from transformers.modeling_attn_mask_utils import AttentionMaskConverter

mask_converter = AttentionMaskConverter(is_causal=True)

# Build the 4D causal mask of shape (batch, 1, seq_len, seq_len).
causal_attention_mask = mask_converter._make_causal_mask(
    input_shape, hidden_states.dtype, device=hidden_states.device
)
# Expand the 2D padding mask to 4D so it can be added to the attention scores.
attention_mask = mask_converter._expand_mask(attention_mask, hidden_states.dtype)

Note: this fix applies only when the error concerns causal_attention_mask and attention_mask.
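For context, here is a minimal pure-Python sketch of what those two helpers compute. This is not the transformers implementation (which returns torch tensors); the function names make_causal_mask and expand_padding_mask are illustrative only. It just shows the mask patterns: 0.0 where attention is allowed, -inf where it is blocked.

```python
NEG_INF = float("-inf")

def make_causal_mask(seq_len):
    # Sketch of _make_causal_mask's pattern: position i may attend to
    # positions 0..i; all future positions are masked with -inf.
    return [[0.0 if j <= i else NEG_INF for j in range(seq_len)]
            for i in range(seq_len)]

def expand_padding_mask(mask_2d, tgt_len=None):
    # Sketch of _expand_mask's pattern: a (batch, src_len) mask with
    # 1 = real token, 0 = padding becomes (batch, 1, tgt_len, src_len),
    # with -inf wherever the key position is padding.
    src_len = len(mask_2d[0])
    tgt_len = tgt_len or src_len
    return [[[[0.0 if m == 1 else NEG_INF for m in row]
              for _ in range(tgt_len)]]
            for row in mask_2d]

for row in make_causal_mask(4):
    print(row)
```

Both masks are additive: they are summed into the raw attention scores before softmax, so 0.0 leaves a score unchanged and -inf zeroes out that position's attention weight.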
