When importing flash-attn, the following error is raised:
ImportError: /home/operationgpt/anaconda3/envs/lyj_py10_torch230/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN3c104cuda9SetDeviceEi
The undefined symbol (it demangles to c10::cuda::SetDevice(int)) means the installed flash-attn binary was built against a different torch version than the one in this environment, so the fix is to replace it with a wheel that matches the environment's torch, CUDA, and Python versions.
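To confirm that this is the failing import, a quick check (assuming the package is importable under the name flash_attn, which the site-packages path above suggests) is:
python -c "import flash_attn"
It should raise the same ImportError until a matching wheel is installed.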
First remove the mismatched build:
pip uninstall flash-attn
and enter y when prompted to confirm.
Next, check the torch version:
pip show torch
The output shows that the installed torch version is 2.3.1.
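pip show only reports the package version; the wheel chosen later also has to match the CUDA variant torch was built against (the cu… tag in the wheel filename). A quick way to print both values from torch itself, assuming it still imports cleanly, is:
python -c "import torch; print(torch.__version__, torch.version.cuda)"
The second value is the CUDA version of the torch build and should be compatible with the cu tag of the flash-attn wheel.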
Check the CUDA version:
nvcc -V
The CUDA toolkit version is V12.5.40.
Check the Python version:
python --version
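The wheel's cp310 tag refers to the CPython major.minor version, so that pair is what has to match; a quick way to print it in the same cpXY form is:
python -c "import sys; print(f'cp{sys.version_info.major}{sys.version_info.minor}')"
In this environment it prints cp310 (the cpython-310 part of the .so path above says the same), matching the wheel chosen below.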
On the flash-attention releases page on GitHub (https://github.com/Dao-AILab/flash-attention/releases), find the wheel whose tags match these versions, here cu123, torch2.3 and cp310 (the cxx11abi suffix is covered right after the download command below). Right-click the matching asset, copy its link, and download it on the Linux machine with wget:
wget https://github.com/Dao-AILab/flash-attention/releases/download/v2.6.3/flash_attn-2.6.3+cu123torch2.3cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
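The one tag not covered by the checks above is the C++ ABI suffix (cxx11abiFALSE vs cxx11abiTRUE), which has to match the ABI torch was built with. A minimal check, using torch's internal _GLIBCXX_USE_CXX11_ABI flag (an internal attribute, not a stable API, so treat this as a convenience sketch):
python -c "import torch; print(torch._C._GLIBCXX_USE_CXX11_ABI)"
If it prints False, pick the cxx11abiFALSE wheel, as done here; if True, pick the cxx11abiTRUE one.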
Finally, install the downloaded wheel with pip install <path to the .whl>, and the ImportError is gone.
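Concretely, assuming the wheel was saved to the current directory, the install plus a quick sanity check look like this (flash_attn exposes a __version__ attribute, so printing it confirms the import now succeeds):
pip install ./flash_attn-2.6.3+cu123torch2.3cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
python -c "import flash_attn; print(flash_attn.__version__)"
If the second command prints 2.6.3 with no undefined-symbol error, the environment is fixed.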
Author: Disheartened
Source: python black hole net, http://www.pythonblackhole.com/blog/article/245758/5301600fea62ee1e2d9a/