
Remove missing weights silencers in favor of HFQuantizer solution #1017

Draft
wants to merge 5 commits into base: main

Conversation


@kylesayrs (Collaborator) commented Dec 28, 2024

Purpose

  • Add skip_missing_weights_context, which is used when loading a quantized model whose state dict does not align with the model definition's weights, i.e. quantized models (run_compressed or otherwise); a sketch of such a context manager follows below
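
A minimal sketch of what such a context manager might look like, assuming it works by temporarily raising the level of the `transformers.modeling_utils` logger (the logger that emits the missing/unexpected-key warnings during `from_pretrained`); the final implementation in this PR may differ:

```python
import contextlib
import logging


@contextlib.contextmanager
def skip_missing_weights_context():
    """Sketch: temporarily silence transformers' missing-weights warnings.

    Quantized checkpoints store compressed parameters (scales, zero points,
    packed weights) whose names do not match the model definition, so
    from_pretrained would otherwise warn about missing/unexpected keys.
    """
    logger = logging.getLogger("transformers.modeling_utils")
    restore_level = logger.level  # may be NOTSET; restored verbatim below
    logger.setLevel(logging.ERROR)
    try:
        yield
    finally:
        logger.setLevel(restore_level)
```

Usage would wrap the load call, e.g. `with skip_missing_weights_context(): model = AutoModelForCausalLM.from_pretrained(stub)`.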

Changes

  • Remove init skipping from save_pretrained_wrapper. Init skipping is only useful when loading models, not when saving (see the illustration after this list)
  • Move transformers logging suppression out of the tests and into skip_missing_weights_context, which is more reusable
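
For context on the first item, "init skipping" refers to bypassing weight-initialization routines when constructing a model whose parameters are about to be overwritten by a checkpoint. A minimal illustration using transformers' internal no_init_weights context manager (an internal API, shown here only to clarify the concept):

```python
from transformers import AutoConfig, AutoModelForCausalLM
from transformers.modeling_utils import no_init_weights

config = AutoConfig.from_pretrained("gpt2")

# Skip weight initialization during model construction; this is meaningful
# only when the resulting weights will be immediately replaced by checkpoint
# tensors, which is why init skipping has no place in a save_pretrained wrapper.
with no_init_weights():
    model = AutoModelForCausalLM.from_config(config)
```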

Testing

  • TODO: show example script
  • TODO: investigate loading from custom_offload_map

Signed-off-by: Kyle Sayers <[email protected]>

👋 Hi! Thank you for contributing to llm-compressor. Please add the ready label when the PR is ready for review.

@kylesayrs marked this pull request as draft on January 1, 2025
@kylesayrs changed the title from "Add skip_missing_weights_context" to "Remove missing weights silencers in favor of HFQuantizer solution" on Jan 31, 2025
@kylesayrs (Collaborator, Author)

This PR originally implemented a skip_missing_weights_context, which silenced warnings about missing weights when loading a quantized model. Instead, @rahul-tuli will implement a more integrated solution in HFQuantizer which silences these warnings directly, so this PR becomes simply the removal of the old code we had used to silence these warnings.
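
For reference, a sketch of how such an HFQuantizer-level solution can silence the warnings directly, assuming the HfQuantizer.update_missing_keys hook in transformers (the class name and key suffixes below are illustrative, not the actual implementation):

```python
from typing import List

from transformers.quantizers.base import HfQuantizer


class CompressedCheckpointQuantizer(HfQuantizer):
    """Illustrative subclass; the other required HfQuantizer methods are omitted."""

    def update_missing_keys(self, model, missing_keys: List[str], prefix: str) -> List[str]:
        # from_pretrained reports whatever this hook returns as "missing",
        # so dropping quantization-generated parameter names here silences
        # the warnings at the source rather than via logging suppression.
        quant_suffixes = ("weight_scale", "weight_zero_point")  # assumed names
        return [key for key in missing_keys if not key.endswith(quant_suffixes)]
```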

@kylesayrs (Collaborator, Author)

@rahul-tuli's HFQuantizer changes must land before this lands

@kylesayrs kylesayrs self-assigned this Feb 7, 2025