-
I have tried to add the GELU activation setting but am still getting the error. What I tried was roughly `activation_setting = {...}` followed by `QuantizationSetting.register(torch.nn.GELU(), activation_setting)`; a fuller sketch is below.
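A minimal sketch of the attempt described above, assuming the registration call suggested by the assertion message. The contents of `activation_setting` were not shown in the original post, and both the import path and the `QuantizationSetting.register()` signature come only from the poster's snippet and the error text, so treat them as assumptions rather than a confirmed NNI API.

```python
import torch

# Import path the poster tried; it fails on their install, so the correct
# location of QuantizationSetting may differ depending on the nni version.
from nni.compression.quantization import QuantizationSetting

activation_setting = {
    # quantization options for GELU activations (elided in the original post)
}

# Register the setting for GELU, as the assertion message suggests.
# Whether register() expects the class (torch.nn.GELU) or an instance
# (torch.nn.GELU()) is unclear from the post; the instance form is what
# the poster used.
QuantizationSetting.register(torch.nn.GELU(), activation_setting)
```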
-
I am getting this error:
`AssertionError: GELU is not registered, please register setting with QuantizationSetting.register()`
I tried `from nni.compression.quantization import QuantizationSetting`, but it reports that no such module is available under `nni.compression`.
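Since the failing import points at a mismatch between the installed nni release and the API the error message refers to, a small probe like the one below can show which compression modules the installed package actually exposes. This is a diagnostic sketch only; the candidate module paths are guesses to test, not paths guaranteed to exist in any particular nni version.

```python
# Probe the installed nni package for quantization-related modules; the
# candidate paths are guesses to test, since the module layout has changed
# between nni releases.
import importlib
import pkgutil

import nni

print("nni version:", nni.__version__)

for candidate in (
    "nni.compression",
    "nni.compression.quantization",
    "nni.contrib.compression",
):
    try:
        mod = importlib.import_module(candidate)
        submodules = [m.name for m in pkgutil.iter_modules(getattr(mod, "__path__", []))]
        print(f"{candidate}: importable, submodules = {submodules}")
    except ImportError as exc:
        print(f"{candidate}: not importable ({exc})")
```

Once the module that actually contains the setting/registration API is located, the failing import and the `register()` call can be adjusted to match that path.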