Running download_model.py and getting ImportError: cannot import name 'Qwen2_5_VLForConditionalGeneration' from 'transformers'
You only need to change the import statement and line 22 to use Qwen2VLForConditionalGeneration instead of Qwen2_5_VLForConditionalGeneration.
Import statement
From: from transformers import Qwen2_5_VLForConditionalGeneration, AutoProcessor
To: from transformers import AutoProcessor
from transformers.models.qwen2_vl.modeling_qwen2_vl import Qwen2VLForConditionalGeneration
Line 22
From: model = Qwen2_5_VLForConditionalGeneration.from_pretrained
To: model = Qwen2VLForConditionalGeneration.from_pretrained
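If the script needs to run against both older and newer transformers releases, a fallback import can bridge the two class names. This is a sketch, not part of the original script; the helper name `resolve_model_class` is hypothetical, and the Qwen2_5_VL class only exists in releases that ship Qwen2.5-VL support:

```python
def resolve_model_class():
    """Return the Qwen2.5-VL model class if this transformers release has it,
    fall back to the Qwen2-VL class, or return None if transformers is not
    installed at all. (Hypothetical helper, not part of the original script.)"""
    try:
        # Present only in newer transformers releases
        from transformers import Qwen2_5_VLForConditionalGeneration
        return Qwen2_5_VLForConditionalGeneration
    except ImportError:
        pass
    try:
        # Older releases ship only the Qwen2-VL class
        from transformers.models.qwen2_vl.modeling_qwen2_vl import (
            Qwen2VLForConditionalGeneration,
        )
        return Qwen2VLForConditionalGeneration
    except ImportError:
        return None
```

With this in place, line 22 would become `model = resolve_model_class().from_pretrained(...)` (after checking the result is not None).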
Modified version of the code (indentation restored; the unused accelerate imports were removed and shutil moved to the top):

from transformers import AutoProcessor
from transformers.models.qwen2_vl.modeling_qwen2_vl import Qwen2VLForConditionalGeneration
import os
import shutil
import torch

MODEL_DIR = "models/Qwen2.5-VL-7B-Instruct"

def download_model():
    print(f"Downloading model to {MODEL_DIR}...")
    # Create directory if it doesn't exist
    os.makedirs(MODEL_DIR, exist_ok=True)

    # Download and save processor first
    print("Downloading and saving processor...")
    processor = AutoProcessor.from_pretrained("Qwen/Qwen2.5-VL-7B-Instruct")
    processor.save_pretrained(MODEL_DIR)

    print("Downloading and saving model...")
    # Initialize model with better memory handling
    model = Qwen2VLForConditionalGeneration.from_pretrained(
        "Qwen/Qwen2.5-VL-7B-Instruct",
        torch_dtype=torch.float16,
        device_map="auto",
        offload_folder="offload",   # Temporary directory for offloading
        offload_state_dict=True,    # Enable state dict offloading
        low_cpu_mem_usage=True,     # Enable low CPU memory usage
    )

    print("Saving model...")
    # Save with a specific shard size to handle memory better
    model.save_pretrained(
        MODEL_DIR,
        safe_serialization=True,
        max_shard_size="2GB",
    )

    # Clean up offload folder if it exists
    if os.path.exists("offload"):
        shutil.rmtree("offload")

    print("Model downloaded and saved successfully!")

if __name__ == "__main__":
    download_model()
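After the script finishes, a quick sanity check that the expected files landed in MODEL_DIR can help catch a partial download. This is a minimal sketch; the helper name is hypothetical and the file names are assumptions based on sharded safetensors saving, so they are not guaranteed for every model:

```python
import os

def saved_files_present(model_dir):
    """Return the subset of expected output files that exist in model_dir.
    The file names are assumptions based on sharded safetensors saving."""
    expected = (
        "config.json",                   # model config written by save_pretrained
        "preprocessor_config.json",      # written by processor.save_pretrained
        "model.safetensors.index.json",  # index file for sharded safetensors
    )
    return [f for f in expected if os.path.exists(os.path.join(model_dir, f))]

# An empty result means nothing was saved at that path
print(saved_files_present("models/Qwen2.5-VL-7B-Instruct"))
```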
This fix was based on huggingface/transformers#35569 (comment).