negative_likelihood = torch.log(sigma + 1) + (z - mu) ** 2 / (2 * sigma ** 2) + 6

Newcomer question: in this formula for the negative log-likelihood of a Gaussian, where does the 6 come from? And why is 1 added to sigma? Thanks!
The 6 is just an arbitrarily chosen constant, added so the likelihood is not 0; sigma + 1 likewise keeps the value from hitting 0.
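For reference (not from this thread): the textbook per-sample negative log-likelihood of z under N(mu, sigma^2) is log(sigma) + (z - mu)^2 / (2 * sigma^2) + 0.5 * log(2π), so an additive constant such as the 6 here only shifts the loss value and does not change the gradients with respect to mu and sigma. A minimal PyTorch sketch of that standard form, assuming mu, sigma, z are tensors as in the snippet above:

```python
import math
import torch

def gaussian_nll(mu, sigma, z, eps=1e-6):
    # Textbook per-sample negative log-likelihood of z under N(mu, sigma^2):
    #   log(sigma) + (z - mu)^2 / (2 * sigma^2) + 0.5 * log(2 * pi)
    # eps keeps the log and the division away from sigma == 0.
    sigma = sigma.clamp(min=eps)
    return torch.log(sigma) + (z - mu) ** 2 / (2 * sigma ** 2) + 0.5 * math.log(2 * math.pi)
```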
@bbc7244, could you please send me a copy of the basic code you have already got working? Many thanks! My email address is [email protected]
The objective function of the deep learning model I am training is also the Gaussian negative log-likelihood, used for a regression task. The problem I am running into is that the loss does not converge, and I cannot tell why. Any suggestions?
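Not an answer from this thread, but one common cause with this kind of loss is the predicted variance collapsing toward 0, which makes log(sigma) and the 1/sigma^2 term blow up. A minimal sketch of a more numerically stable variant, assuming (hypothetically) that the network outputs a mean mu and a log-variance log_var for target y:

```python
import torch

def stable_gaussian_nll(mu, log_var, y):
    # Predict log-variance instead of sigma so the loss stays finite
    # even when the variance would otherwise shrink toward 0.
    log_var = log_var.clamp(min=-10.0, max=10.0)  # keep exp() in a safe range
    return 0.5 * (log_var + (y - mu) ** 2 * torch.exp(-log_var)).mean()

# Recent PyTorch versions also provide torch.nn.GaussianNLLLoss, which takes
# the predicted variance (sigma ** 2) directly and applies its own eps clamp:
#   loss_fn = torch.nn.GaussianNLLLoss(eps=1e-6)
#   loss = loss_fn(mu, y, var)
```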