https://github.com/dasguptar/bcnn.pytorch/blob/master/bcnn/model.py#L32

outputs = torch.sign(outputs) * torch.sqrt(outputs + 1e-5)  # signed square root normalization

Is it correct to use torch.sqrt(outputs + 1e-5) since outputs may have negative values? Thanks!

@jimzhu We are taking the output of VGG16 from a ReLU layer, so it is non-negative. The bilinear product of non-negative activations is therefore also non-negative, and the regular sqrt can be applied safely.
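For context, here is a minimal sketch of the step being discussed (not the repository's actual model.py; the feature shapes, function name, and eps value are assumptions): bilinear pooling of ReLU activations followed by signed square root and L2 normalization. Because the pooled values are non-negative, the signed sqrt line from the repo reduces to a plain sqrt except at exact zeros.

import torch

def bilinear_pool(features, eps=1e-5):
    # features: (batch, channels, height, width), e.g. a conv feature map after ReLU
    b, c, h, w = features.shape
    x = features.view(b, c, h * w)
    # Bilinear (outer) product averaged over spatial locations -> (batch, c, c)
    outputs = torch.bmm(x, x.transpose(1, 2)) / (h * w)
    outputs = outputs.view(b, -1)
    # Signed square root normalization, as in the quoted line; safe here
    # because outputs >= 0, so no NaNs are produced by the sqrt.
    outputs = torch.sign(outputs) * torch.sqrt(outputs + eps)
    # L2 normalization of the flattened bilinear feature
    outputs = torch.nn.functional.normalize(outputs)
    return outputs

# Example usage with hypothetical 512-channel ReLU feature maps
feats = torch.relu(torch.randn(2, 512, 28, 28))
pooled = bilinear_pool(feats)
print(pooled.shape)  # torch.Size([2, 262144])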