[fix] Fix flash_attn_3 import order conflict#1375

Open
georgkaleido wants to merge 1 commit into facebookresearch:main from georgkaleido:fix/flash3-import-order-conflict

Conversation

@georgkaleido

What does this PR do?

Fixes #1348

Swap import order to prioritize global flash_attn_3 package over vendored version. This prevents torch extension namespace conflicts when flash_attn_3 is already imported by other libraries (e.g., diffusers).

The fix simply swaps the two import blocks so that:

  • Global flash_attn_3 is checked first (if)
  • Vendored version is only used if global is not available (elif)
  • Maintains the original behavior and error handling

This allows xFormers to work alongside libraries that import flash_attn_3, such as diffusers.models.autoencoders, without raising a RuntimeError about duplicate torch.ops registration.

Before submitting

  • Did you have fun?
    • Make sure you had fun coding 🙃
  • Did you read the contributor guideline?
  • Was this discussed/approved via a Github issue? (no need for typos, doc improvements)
    • N/A
  • Did you make sure to update the docs?
    • not needed
  • Did you write any new necessary tests?
    • not easily testable, as there is no released flash_attn_3 package
  • Did you update the changelog? (if needed)
    • N/A

Amp-Thread-ID: https://ampcode.com/threads/T-353ac1bf-891c-41c9-96d3-0b1b4c2b5bbe
Co-authored-by: Amp <amp@ampcode.com>
@meta-cla bot added the CLA Signed label (managed by the Facebook bot; authors need to sign the CLA before a PR can be reviewed) on Jan 30, 2026

Development

Successfully merging this pull request may close these issues.

Flash Attention 3 torch extension namespace conflict
