About the QAT code #15

Open
JingdiZhou opened this issue Sep 20, 2023 · 0 comments
JingdiZhou commented Sep 20, 2023

Hello, I noticed that your contribution contains QAT in qat.py, but when I read the code of train.py I found that no real quantization is applied to the models. For example, when I run the command

python qat.py --algo a2c --env BreakoutNoFrameskip-v4 -q 7 --quant-delay 5000000 -n 10000000

I found that in a2c.py the w_bits and act_bits arguments are never used anywhere else, which means the training process actually runs without quantization. Therefore, I want to know whether I misunderstood something, or whether there is indeed no QAT. For reference, a sketch of the fake quantization I expected to find follows the signature below.

    # A2C.__init__ in a2c.py: the quantization arguments
    # (w_bits, act_bits, quant_train, quant_delay) are accepted here but never read again.
    def __init__(self, policy, env, gamma=0.99, n_steps=5, vf_coef=0.25, ent_coef=0.01, max_grad_norm=0.5,
                 learning_rate=7e-4, alpha=0.99, epsilon=1e-5, lr_schedule='constant', verbose=0,
                 tensorboard_log=None, _init_setup_model=True, w_bits=None, act_bits=None, quant_train=None, quant_delay=None, policy_kwargs=None,
                 full_tensorboard_log=False, seed=None, n_cpu_tf_sess=None):
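
To be concrete, here is a minimal sketch (plain NumPy, not code from this repository) of the per-tensor "fake quantization" that QAT normally inserts during the forward pass. If w_bits and act_bits were really wired in, something equivalent to this, or a graph rewrite along the lines of TensorFlow 1.x's tf.contrib.quantize (whose quant_delay parameter this repo's flag name seems to echo), would have to be applied to the weights and activations:

    # Hypothetical illustration only, not code from this repository:
    # uniform k-bit fake quantization (quantize then dequantize in float),
    # which is what QAT typically applies to weights/activations in training.
    import numpy as np

    def fake_quantize(x, num_bits):
        """Quantize x onto 2**num_bits - 1 uniform levels, then dequantize."""
        x_min, x_max = float(np.min(x)), float(np.max(x))
        levels = 2 ** num_bits - 1
        scale = max((x_max - x_min) / levels, 1e-8)  # guard against x_min == x_max
        q = np.clip(np.round((x - x_min) / scale), 0, levels)  # integer grid
        return (q * scale + x_min).astype(np.float32)  # float, but only 2**num_bits values

    w = np.random.randn(4, 4).astype(np.float32)
    w_q = fake_quantize(w, num_bits=7)  # 7 matches the "-q 7" flag above

If nothing like this runs during training, then changing -q would have no effect on the learned policy, which matches what I see in the code.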