This repository has been archived by the owner on Jul 6, 2023. It is now read-only.
Sandbox3aster changed the title from "Doubts on BinaryNet setting in section 5.2" to "BinaryNet setting in section 5.2" on Mar 17, 2020.

Junru:

Hi Koen,

Thanks for the great work! I noticed that to reproduce the BinaryNet-on-CIFAR-10 experiment of section 5.2, you use the following settings:

```python
class bop_sec52(default):
    epochs = 500
    batch_size = 50
    kernel_quantizer = None
    kernel_constraint = None
    threshold = 1e-8
    gamma = 1e-4
    gamma_decay = 0.1
    decay_step = int((50000 / 50) * 100)
```

Here kernel_quantizer and kernel_constraint are set to None, so only the input is binarized while the weights remain real-valued. Is that expected? I thought BinaryNet should also have binarized weights.

Best,
Junru

Koen:

Hi Junru, thanks for your interest! When Bop is used as the optimizer, there is no need to set a kernel_quantizer, since Bop itself keeps the underlying "latent" weights at -1 or 1. You can check this by inspecting the weights after optimization (via TensorBoard histograms, weight.numpy(), or otherwise).
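To make the reply concrete: Bop operates on binary weights directly, flipping a weight's sign when an exponential moving average of its gradient exceeds a threshold and agrees in sign with the weight, so no separate kernel_quantizer is needed. Below is a minimal NumPy sketch of that update rule as described in the Bop paper; the function name and shapes are illustrative, not the repository's actual implementation.

```python
import numpy as np

def bop_step(w, grad, m, gamma=1e-4, threshold=1e-8):
    """One Bop update step (illustrative sketch, not the repo's code).

    w         -- binary weights, entries in {-1, +1}
    grad      -- gradient of the loss with respect to w
    m         -- exponential moving average of past gradients
    gamma     -- adaptivity rate (cf. gamma in bop_sec52)
    threshold -- flip threshold tau (cf. threshold in bop_sec52)
    """
    # Update the gradient moving average.
    m = (1 - gamma) * m + gamma * grad
    # Flip a weight when the averaged gradient is strong enough
    # and points in the same direction as the weight itself.
    flip = (np.abs(m) > threshold) & (np.sign(m) == np.sign(w))
    w = np.where(flip, -w, w)
    return w, m

# The weights stay in {-1, +1} throughout training, which is why
# inspecting them after optimization shows only the two binary values.
w = np.array([1.0, -1.0, 1.0])
m = np.zeros(3)
w, m = bop_step(w, np.array([0.5, 0.5, -0.5]), m)
```

Note that gamma and threshold here play the role of the hyperparameters in the bop_sec52 config, and gamma_decay/decay_step would simply shrink gamma by 0.1 every 100,000 steps (100 epochs at 1,000 batches per epoch on CIFAR-10).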