The DoReFa-Net paper presents 3 quantization steps (gradient, weight, and activation/output). Seems that only …
Exactly, weight & output & gradient are all quantized in the paper. And both weight and output quantization are currently supported in NNI's DoReFa-Net quantizer. We think the priority of implementing gradient quantization is low, since DoReFa-Net is not very suitable for deployment compared to other training-aware quantization algorithms such as QAT.
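For reference, a minimal sketch of how weight and output quantization might be enabled with NNI's DoReFaQuantizer. This is not from the thread; it assumes the NNI 2.x compression API, and the import path, config schema, and bit widths shown here are illustrative and may differ between NNI releases.

```python
# Sketch: weight + output ("activation") quantization with NNI's DoReFaQuantizer.
# Assumes NNI 2.x; check your NNI version's docs for the exact import path.
import torch
from torchvision.models import resnet18
from nni.algorithms.compression.pytorch.quantization import DoReFaQuantizer

model = resnet18()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# Quantize both weights and layer outputs (8 bits, all Conv2d layers).
# Note there is no 'gradient' entry: gradient quantization from the paper
# is the part that is not implemented.
config_list = [{
    'quant_types': ['weight', 'output'],
    'quant_bits': {'weight': 8, 'output': 8},
    'op_types': ['Conv2d'],
}]

quantizer = DoReFaQuantizer(model, config_list, optimizer)
quantizer.compress()
# ...then run the usual quantization-aware training loop on `model`.
```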