
DeepSeek-V2-Lite-Chat model startup dependency issue #78

Open
Malowking opened this issue Aug 2, 2024 · 1 comment

Comments

@Malowking

Does the DeepSeek-V2-Lite-Chat model have to use the flash_attn package? If it is not required, how do I tell it not to use flash_attn?
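Recent transformers releases accept an attn_implementation argument to from_pretrained that requests the standard ("eager") attention path instead of flash attention; whether the DeepSeek-V2 remote code honors it is not confirmed in this thread. A minimal sketch, assuming the public model id:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "deepseek-ai/DeepSeek-V2-Lite-Chat"  # assumed model id
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    trust_remote_code=True,
    torch_dtype=torch.bfloat16,
    attn_implementation="eager",  # request standard attention instead of flash_attn
)

Note that this alone may not be enough, because transformers also scans the remote modeling file's imports before loading it (see the traceback in the comment below).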

@itaowei

itaowei commented Sep 5, 2024

I have the same question.
May I ask which version of flash_attn is required?
I installed the latest version (2.6.3), but it produces the error below:

Traceback (most recent call last):
  File "test.py", line 335, in <module>
    model = AutoModelForCausalLM.from_pretrained(model_name, trust_remote_code=True, device_map="sequential", 
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/data/workspace/anaconda3/envs/deepseek/lib/python3.12/site-packages/transformers/models/auto/auto_factory.py", line 551, in from_pretrained
    model_class = get_class_from_dynamic_module(
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/data/workspace/anaconda3/envs/deepseek/lib/python3.12/site-packages/transformers/dynamic_module_utils.py", line 502, in get_class_from_dynamic_module
    final_module = get_cached_module_file(
                   ^^^^^^^^^^^^^^^^^^^^^^^
  File "/data/workspace/anaconda3/envs/deepseek/lib/python3.12/site-packages/transformers/dynamic_module_utils.py", line 327, in get_cached_module_file
    modules_needed = check_imports(resolved_module_file)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/data/workspace/anaconda3/envs/deepseek/lib/python3.12/site-packages/transformers/dynamic_module_utils.py", line 182, in check_imports
    raise ImportError(
ImportError: This modeling file requires the following packages that were not found in your environment: flash_attn. Run `pip install flash_attn`
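This ImportError comes from transformers' static scan of the remote modeling file's imports (check_imports), so it is raised even if flash attention would never be used at runtime. One workaround sometimes suggested, though not confirmed by the maintainers in this thread, is to filter flash_attn out of that import scan. A sketch of that approach, relying on a transformers internal (get_imports) that may change across versions:

from unittest.mock import patch
from transformers import AutoModelForCausalLM
from transformers.dynamic_module_utils import get_imports

def get_imports_without_flash_attn(filename):
    # Same result as transformers' own import scan, minus flash_attn.
    return [imp for imp in get_imports(filename) if imp != "flash_attn"]

model_name = "deepseek-ai/DeepSeek-V2-Lite-Chat"  # assumed model id
with patch("transformers.dynamic_module_utils.get_imports",
           get_imports_without_flash_attn):
    model = AutoModelForCausalLM.from_pretrained(
        model_name,
        trust_remote_code=True,
        device_map="sequential",
    )

The alternative, as the error message itself suggests, is simply to pip install flash_attn so the check passes.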
