loss function nan when using AbsSquared activation #3

Open
twhughes opened this issue Dec 7, 2018 · 1 comment
Labels: bug (Something isn't working)

Comments

@twhughes (Member)

twhughes commented Dec 7, 2018

Running this model:

model_linear = neu.Sequential([
    neu.ClementsLayer(N),                          # N x N Clements MZI mesh (linear unitary layer)
    neu.Activation(neu.AbsSquared(N)),             # element-wise |z|^2 activation
    neu.DropMask(N, keep_ports=range(N_classes))   # keep only the first N_classes output ports
])

losses = neu.InSituAdam(
    model_linear, neu.CategoricalCrossEntropy, step_size=step_size
).fit(x_train_flattened, y_train_onehot, epochs=n_epochs, batch_size=batch_size)

gives the warning:

../neuroptica/neuroptica/losses.py:45: RuntimeWarning: invalid value encountered in true_divide
  X_softmax = np.exp(X) / np.sum(np.exp(X), axis=0)

And the loss function is NaN.

When changing AbsSquared to Abs, it works fine.
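
As a side note on the warning itself, here is a minimal numpy sketch (not the neuroptica code, and not necessarily the cause in this issue) of one way the exact warning can be reproduced: the softmax expression shown in the traceback has no max-subtraction, so if np.exp overflows to inf the division becomes inf / inf, which numpy flags as an invalid value and returns NaN.

import numpy as np

def naive_softmax(X):
    # Same form as the expression at losses.py:45: no max-subtraction, no guarding.
    return np.exp(X) / np.sum(np.exp(X), axis=0)

X = np.array([10.0, 800.0, 5.0])  # hypothetical input; exp(800) overflows to inf
print(naive_softmax(X))
# RuntimeWarning: overflow encountered in exp
# RuntimeWarning: invalid value encountered in true_divide
# -> [ 0. nan  0.]

Subtracting X.max() before exponentiating avoids this overflow path, though it would not remove NaNs that already enter X from upstream (e.g. the gradient issue noted in the next comment).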

twhughes added the bug label on Dec 7, 2018
@ianwilliamson (Contributor)

Noting from our discussion earlier: this appears to come from the polar form of the derivative whenever r = 0.
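
A minimal numpy sketch of that claim (an illustration assuming the gradient is assembled in polar coordinates; not the actual neuroptica gradient code): expressing the derivative of |z|^2 through r = |z| brings in a z / r factor, which is 0 / 0 = NaN at r = 0, even though the Cartesian form of the derivative is finite everywhere.

import numpy as np

def abs_squared_grad_polar(z):
    # d|z|^2/dr = 2r, mapped back to z via the unit phasor z / r;
    # at z = 0 this is 0 * (0 / 0) = NaN, and numpy warns about the divide.
    r = np.abs(z)
    return 2 * r * (z / r)

def abs_squared_grad_cartesian(z):
    # Direct form, 2 * z (or 2 * conj(z), depending on Wirtinger convention):
    # finite everywhere, equal to 0 at z = 0.
    return 2 * z

z = np.array([1 + 1j, 0 + 0j])
print(abs_squared_grad_polar(z))      # [2.+2.j  nan+nanj]  <- NaN exactly at r = 0
print(abs_squared_grad_cartesian(z))  # [2.+2.j  0.+0.j]

A NaN produced this way in the backward pass would then propagate into the softmax / cross-entropy computation shown in the warning above.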

twhughes reopened this on Dec 18, 2018