
Remove experimental SmoothedHingeLoss training task #455

Open: ashelkovnykov wants to merge 2 commits into master

Conversation

ashelkovnykov (Contributor)

  • Experimental results for Smoothed Hinge Loss were substantially worse than for Logistic Regression, and its slower convergence also significantly increased training time (see the loss definition sketched after this list)
  • Removed GLMLossFunction and derived classes, since all training tasks in Photon ML are GLM tasks
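
For reference, a sketch of the loss in question, assuming Photon ML's Smoothed Hinge Loss followed the standard Rennie–Srebro formulation (an assumption; the exact variant is not shown in this PR). For a label $y \in \{-1, +1\}$ and margin $z = y\,\theta^\top x$, it replaces the hinge loss's kink at the margin with a quadratic segment:

$$
\ell(z) =
\begin{cases}
\frac{1}{2} - z & \text{if } z \le 0 \\
\frac{1}{2}(1 - z)^2 & \text{if } 0 < z < 1 \\
0 & \text{if } z \ge 1
\end{cases}
$$

Unlike the logistic loss, $\ell$ is exactly zero for $z \ge 1$, so well-classified points contribute no gradient; that can slow first-order optimizers and is consistent with the slower convergence reported above.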

Alex Shelkovnykov and others added 2 commits May 7, 2020 11:53
- Experimental results when comparing Smoothed Hinge Loss to Logistic Regression were substantially worse and also increased training time significantly due to slower convergence
- Removed GLMLossFunction and derived classes, since all training tasks in Photon ML are GLM tasks
@ashelkovnykov (Contributor, Author)

@cmjiang @yunboouyang @lguo
Updated the code for the latest master; requesting review.

yunboouyang (Contributor) left a comment:

LGTM

```diff
@@ -52,6 +52,8 @@ object SparkSessionConfiguration {
     classOf[GeneralizedLinearModel],
     classOf[GeneralizedLinearOptimizationProblem[_]],
     classOf[GLMOptimizationConfiguration],
+    classOf[HessianDiagonalAggregator],
```
Contributor:

Why do we need to add this class?

ashelkovnykov (Contributor, Author) replied:

I noticed that we had the ValueAndGradientAggregator and HessianVectorAggregator registered with Kryo, but not the HessianDiagonalAggregator or the HessianMatrixAggregator. Since all four function the same way, either all of them should be registered or none of them should be. Objects of all four class types will certainly be distributed, so they should all be registered.

Apparently, Kryo was able to figure this out on its own, but I think it best to make it explicit.
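
For illustration, a minimal sketch of what explicit Kryo registration looks like in Spark. The `registerKryoClasses` call and the `spark.kryo.*` settings are standard Spark API; the import path for the aggregators and the standalone `SparkConf` wiring are assumptions (in Photon ML the registration happens inside `SparkSessionConfiguration`):

```scala
import org.apache.spark.SparkConf

// Assumed package path for the four aggregator classes (hypothetical here).
import com.linkedin.photon.ml.function.{
  HessianDiagonalAggregator, HessianMatrixAggregator,
  HessianVectorAggregator, ValueAndGradientAggregator}

val conf = new SparkConf()
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  // Optional: make any missing registration fail fast instead of letting
  // Kryo silently fall back to writing fully qualified class names.
  .set("spark.kryo.registrationRequired", "true")
  // Register all four aggregators together, since instances of each are
  // shipped between the driver and executors during optimization.
  .registerKryoClasses(Array(
    classOf[ValueAndGradientAggregator],
    classOf[HessianVectorAggregator],
    classOf[HessianDiagonalAggregator],
    classOf[HessianMatrixAggregator]))
```

Without explicit registration, Spark's Kryo serializer still handles these classes by writing each object's fully qualified class name alongside the serialized data (the "figured it out on its own" behavior above); registering them swaps that name for a compact integer ID, so the explicit list is both self-documenting and slightly cheaper on the wire.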
