
[Solved] flash-attn reports error flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol

posted on 2024-11-02 14:03


Error message:

ImportError: /home/operationgpt/anaconda3/envs/lyj_py10_torch230/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN3c104cuda9SetDeviceEi

Solution: reinstall flash-attn. The undefined symbol _ZN3c104cuda9SetDeviceEi demangles to c10::cuda::SetDevice(int), which means the installed flash-attn wheel was built against a different torch/CUDA ABI than the one in this environment, so the fix is to install a wheel that matches your versions.

  1. Uninstall the existing flash-attn: run pip uninstall flash-attn, then enter y to confirm.
  2. Check your torch version, CUDA version, and Python version.

Check the torch version

pip show torch
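pip show prints a whole block of metadata, but only the Version line matters here. A sketch of pulling out just the version number — the pip show output is simulated with printf below so the parsing is visible; in a real environment pipe pip show torch instead:

```shell
# Parse the "Version:" line in the format `pip show` prints it.
# The sample metadata is hardcoded for illustration only.
printf 'Name: torch\nVersion: 2.3.1\nSummary: Tensors and Dynamic neural networks\n' \
  | grep '^Version:' | awk '{print $2}'
# prints 2.3.1
```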

The Version field in the output shows the torch version, here 2.3.1.

Check the CUDA version

nvcc -V
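nvcc -V prints a multi-line banner; the release number on the "Cuda compilation tools" line is the one that matters when picking a wheel. A sketch of extracting it — the banner is simulated with printf here; pipe nvcc -V in practice:

```shell
# Pull the release number out of nvcc's version banner.
# The banner text is hardcoded for illustration only.
printf 'nvcc: NVIDIA (R) Cuda compiler driver\nCuda compilation tools, release 12.5, V12.5.40\n' \
  | sed -n 's/.*release \([0-9.]*\),.*/\1/p'
# prints 12.5
```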

The banner printed by nvcc shows the CUDA version, here V12.5.40.

Check the Python version

python --version
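The wheel filename encodes the Python version as a CPython tag (cp310 for Python 3.10). A sketch of deriving that tag from the version string — simulated with a sample string here; in practice feed in "$(python --version)":

```shell
# Turn "Python 3.10.x" into the cp310 tag used in wheel filenames.
# The version string is hardcoded for illustration only.
printf 'Python 3.10.14\n' | sed -n 's/Python \([0-9]*\)\.\([0-9]*\).*/cp\1\2/p'
# prints cp310
```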
  3. Go to the flash-attention releases page on its official GitHub repository and download the installation package. Note that you must choose it according to your torch version, CUDA version (a wheel built for a CUDA version lower than your local one is fine), and Python version. Also select abiFALSE.

Right-click the release asset, copy the link, and use wget plus the link to download the .whl installation package on Linux:

wget https://github.com/Dao-AILab/flash-attention/releases/download/v2.6.3/flash_attn-2.6.3+cu123torch2.3cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
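That long filename is just the build matrix spelled out field by field. A sketch assembling it from its parts, using the versions from this post (these values are specific to this environment, not a general rule):

```shell
# Each field of the wheel name: flash-attn version, CUDA build
# (cu123, i.e. built for CUDA 12.3, usable with the local 12.5 toolkit),
# torch 2.3, the cxx11 ABI flag, and the CPython tag cp310.
fa=2.6.3; cuda=cu123; torch=2.3; abi=FALSE; py=cp310
whl="flash_attn-${fa}+${cuda}torch${torch}cxx11abi${abi}-${py}-${py}-linux_x86_64.whl"
echo "$whl"
# prints flash_attn-2.6.3+cu123torch2.3cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
```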

Finally, run pip install followed by the path to the .whl file to install flash-attn, and you're done!



Category of website: technical article > Blog

Author: Disheartened

Link: http://www.pythonblackhole.com/blog/article/245758/5301600fea62ee1e2d9a/

Source: python black hole net

Please indicate the source for any form of reprinting. If any infringement is discovered, it will be held legally responsible.
