Activation memory estimation (in resource utilization) ignores layers where activation quantization is disabled
Issue Type
Bug
Source
source
MCT Version
nightly
OS Platform and Distribution
No response
Python version
No response
Describe the issue
Currently, the sizes of unquantized activation tensors (i.e., tensors of layers whose activation quantization is disabled) are ignored in the MaxTensor and MaxCut activation memory estimations. These tensors need to be accounted for according to the quantization-preserving flag.
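The following is a minimal conceptual sketch (not MCT's actual implementation; all names and sizes are hypothetical) illustrating the reported gap: if layers with disabled activation quantization are skipped, the max-tensor estimate can be lower than the real peak activation memory.

```python
# Hypothetical per-layer records: activation size (in elements) and a flag
# marking whether activation quantization is enabled for that layer.
layers = [
    {"name": "conv1", "act_size": 32 * 32 * 64, "act_quantized": True},
    {"name": "relu1", "act_size": 64 * 64 * 64, "act_quantized": False},  # quantization disabled
    {"name": "conv2", "act_size": 16 * 16 * 128, "act_quantized": True},
]

def max_tensor_current(layers):
    # Behaviour described in the issue: layers with activation quantization
    # disabled are skipped, so their tensors never contribute to the estimate.
    return max((l["act_size"] for l in layers if l["act_quantized"]), default=0)

def max_tensor_expected(layers):
    # Expected behaviour: every activation tensor is counted, regardless of
    # whether its quantization is enabled (subject to the quantization-preserving flag).
    return max((l["act_size"] for l in layers), default=0)

print("current estimate:", max_tensor_current(layers))   # misses the largest (unquantized) tensor
print("expected estimate:", max_tensor_expected(layers))
```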
Expected behaviour
No response
Code to reproduce the issue
Log output
No response