Description
OS: CentOS
CUDA: 12.8
Python: 3.13
torch: 2.8.0
I installed flash-attention from the prebuilt wheel https://github.com/Dao-AILab/flash-attention/releases/download/v2.8.3/flash_attn-2.8.3+cu12torch2.8cxx11abiTRUE-cp313-cp313-linux_x86_64.whl, but importing it fails with:
ImportError: /root/VibeVoice/.venv/lib/python3.13/site-packages/flash_attn_2_cuda.cpython-313-x86_64-linux-gnu.so: undefined symbol: _ZN3c104cuda29c10_cuda_check_implementationEiPKcS2_ib
How can I fix this?
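As a diagnostic sketch (not a confirmed fix): an undefined libtorch symbol at import time usually means the wheel was built against a different torch build than the one installed, often a C++11-ABI mismatch between the `cxx11abiTRUE`/`cxx11abiFALSE` wheel variant and the local torch. The symbol can be demangled to see which libtorch function the wheel expects, and the installed torch's ABI flag can be printed; this assumes `c++filt` (from binutils) is on the host.

```shell
# Demangle the unresolved symbol to see which libtorch function the wheel needs
c++filt _ZN3c104cuda29c10_cuda_check_implementationEiPKcS2_ib
# → c10::cuda::c10_cuda_check_implementation(int, char const*, char const*, int, bool)

# Check whether the installed torch build uses the C++11 ABI;
# it should match the cxx11abiTRUE/FALSE tag in the wheel filename
python -c "import torch; print(torch.__version__, torch._C._GLIBCXX_USE_CXX11_ABI)" \
  2>/dev/null || echo "torch not importable in this environment"
```

If the flag printed by torch disagrees with the `cxx11abiTRUE` tag of the downloaded wheel, trying the matching `cxx11abiFALSE` wheel for the same torch/CUDA/Python versions is a common next step.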