I'd like to demonstrate a "back query" (feeding a desired output backwards through the network and getting pixels as a result) with this example. In @makeyourownneuralnetwork, the back query concept is explained and a sample image is provided (for the digit 0):
Perhaps I'm too impatient, but after running through 10,000 training images, my backquery for zero looks like:
I think I'm missing something. Code in progress is in the backquery branch: https://github.com/shiffman/Neural-Network-p5/blob/backquery/nn.js#L127
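For context, the back query reverses the feedforward pass: apply the inverse of the sigmoid (the logit) to the target output vector, multiply by the transpose of each weight matrix, and rescale between layers so the logit stays defined. Below is a minimal sketch of that idea in plain JavaScript; it is not the nn.js API, and the matrix layout and the 0.01–0.99 rescaling are assumptions based on how the book describes the technique.

```javascript
// A minimal sketch of the back-query idea (not the nn.js API). Weight
// matrices are assumed to be arrays of rows, with rows = nodes of the
// layer ahead and columns = nodes of the layer behind.

// Inverse of the sigmoid activation: logit(y) = ln(y / (1 - y)).
function inverseSigmoid(y) {
  return Math.log(y / (1 - y));
}

// Compute transpose(weights) * vector, pushing a signal backwards
// through one layer of weights.
function transposeTimes(weights, vector) {
  const cols = weights[0].length;
  const result = new Array(cols).fill(0);
  for (let i = 0; i < weights.length; i++) {
    for (let j = 0; j < cols; j++) {
      result[j] += weights[i][j] * vector[i];
    }
  }
  return result;
}

// Rescale values into (0.01, 0.99) so inverseSigmoid never sees 0 or 1.
function rescale(vector) {
  const min = Math.min(...vector);
  const max = Math.max(...vector);
  return vector.map(v => 0.01 + 0.98 * (v - min) / (max - min));
}

// One backwards step: from a layer's outputs to the previous layer's outputs.
function backStep(weights, outputs) {
  return rescale(transposeTimes(weights, outputs.map(inverseSigmoid)));
}

// Full back query for a 784-hidden-10 network. weightsIH and weightsHO are
// hypothetical input->hidden and hidden->output weight matrices; targets is
// something like [0.99, 0.01, ..., 0.01] for the digit 0. The returned 784
// values can be drawn as a 28x28 image.
function backquery(weightsIH, weightsHO, targets) {
  const hiddenOutputs = backStep(weightsHO, targets);
  return backStep(weightsIH, hiddenOutputs);
}
```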
I'll take a look at the code tomorrow (but I'm no javascript expert).
What seems immediately suspect is that the learning doesn't seem to have completed sufficiently: over time the shape should become "smooth" in the different colours. The top image looks like the result of a Gaussian blur; the bottom one doesn't yet.
Having said that, even 10,000 training examples should be sufficient to have formed a good shape, so something is off. Does 10,000 training examples (with whatever other parameters, e.g. number of nodes, learning rate, etc.) give you an overall performance of around 95% on the MNIST 10,000 test set? If yes, this is puzzling.
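Roughly the sort of check I mean, sketched in JavaScript with a hypothetical `predict()` method and a `testData` array of `{pixels, label}` objects (not the nn.js API): count how often the network's highest output matches the label across the 10,000 MNIST test images.

```javascript
// Rough accuracy check (hypothetical predict() and testData; not the
// nn.js API): the network's answer is the index of its largest output,
// compared against the true label.
function accuracy(network, testData) {
  let correct = 0;
  for (const { pixels, label } of testData) {
    const outputs = network.predict(pixels);
    const guess = outputs.indexOf(Math.max(...outputs));
    if (guess === label) correct++;
  }
  return correct / testData.length; // roughly 0.95 expected after training
}
```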