
There are some problems that puzzle me. I hope you can explain these, please. #21

Open
angleboy8 opened this issue Dec 27, 2018 · 0 comments

Comments

@angleboy8

There are a few questions, as follows.

  1. In line 177 of "model.py", I think self.gener_loss should be divided by float(len(scale_weight.keys())), so that we get the average generator loss over all scales. Note that self.gener_acc is already averaged this way. The discriminator has the same problem at line 149: self.discr_loss should be divided by float(len(scale_weight.keys()) * 3). (See the averaging sketch after this list.)

  2. How is each random batch of size batch_size produced and consumed?
    My understanding is that initialize_batch_worker(*) in "prepare_dataset.py" continuously puts training data into a queue, and q_art.get() in "model.py" then pulls a batch from it. Once the training step for that batch finishes, the loop repeats the process. I don't know whether this understanding is correct. (See the producer/consumer sketch after this list.)
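
To make point 1 concrete, here is a minimal, self-contained sketch of the proposed averaging. The per-scale loss values, the scale keys, and the assumption that the discriminator sums three loss terms per scale are hypothetical stand-ins for how model.py actually accumulates its losses:

```python
# Hypothetical per-scale values; placeholders for what model.py accumulates.
scale_weight = {"scale_1": 1.0, "scale_2": 1.0, "scale_4": 1.0}
per_scale_gener_loss = {"scale_1": 0.9, "scale_2": 0.7, "scale_4": 0.5}
per_scale_gener_acc = {"scale_1": 0.60, "scale_2": 0.65, "scale_4": 0.70}
per_scale_discr_loss = {"scale_1": 1.2, "scale_2": 1.0, "scale_4": 0.8}  # assumed sum of 3 terms per scale

num_scales = float(len(scale_weight.keys()))

# The repo already averages the accuracy over scales like this.
gener_acc = sum(per_scale_gener_acc[k] for k in scale_weight) / num_scales

# Proposed change: average the accumulated generator loss the same way.
gener_loss = sum(scale_weight[k] * per_scale_gener_loss[k] for k in scale_weight)
gener_loss /= num_scales

# Proposed change: each scale contributes 3 discriminator loss terms,
# so divide the accumulated loss by num_scales * 3 for a per-term mean.
discr_loss = sum(per_scale_discr_loss[k] for k in scale_weight) / (num_scales * 3.0)

print(gener_loss, gener_acc, discr_loss)
```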

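And for point 2, a small sketch of the producer/consumer pattern I believe is happening. The function and queue names (batch_worker, q_art) and the dummy batch shape are illustrative; the real initialize_batch_worker in prepare_dataset.py may differ in its details:

```python
import multiprocessing as mp
import numpy as np

def batch_worker(queue, batch_size, stop_event):
    """Producer: keeps pushing random batches into the queue until told to stop."""
    rng = np.random.default_rng()
    while not stop_event.is_set():
        batch = rng.standard_normal((batch_size, 64, 64, 3)).astype(np.float32)
        queue.put(batch)  # blocks when the queue is full, throttling the producer

if __name__ == "__main__":
    q_art = mp.Queue(maxsize=8)  # bounded queue so the worker cannot run ahead forever
    stop = mp.Event()
    worker = mp.Process(target=batch_worker, args=(q_art, 4, stop), daemon=True)
    worker.start()

    # Consumer (the training loop): each iteration pulls one ready-made batch,
    # runs a training step on it, then loops back for the next batch.
    for step in range(10):
        batch = q_art.get()  # blocks until the worker has produced a batch
        print(f"step {step}: got batch of shape {batch.shape}")

    stop.set()
    worker.join(timeout=1)
```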
I appreciate your work. I'm very grateful for your help.
