Need to be able to plot feature importances as an option. Unfortunately, only some models, such as Random Forests, support this. Will have to decide on the best way to do this.
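For models that do expose importances, a minimal sketch of what the plot option could look like is below. The dataset, estimator settings, and figure styling are placeholders, not anything decided in this issue.

```python
import matplotlib.pyplot as plt
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier

# Placeholder data; any tabular dataset with named features works.
data = load_breast_cancer()
X, y = data.data, data.target

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X, y)

# Sort features by importance so the bar chart reads top-down.
order = np.argsort(model.feature_importances_)
plt.barh(np.array(data.feature_names)[order],
         model.feature_importances_[order])
plt.xlabel("Importance")
plt.tight_layout()
plt.show()
```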
The best way to do this is probably recursive feature elimination. However, the RFE implemented in scikit-learn is probably no good to us, as the weights or importances learned by the model may be unrelated to the cross-validation score due to generalization problems.
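For reference, this is roughly how the built-in scikit-learn RFE is used; it ranks features by the wrapped estimator's own weights/importances rather than by cross-validation score, which is the limitation described above. Dataset and parameters here are illustrative only.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE

X, y = load_breast_cancer(return_X_y=True)

rfe = RFE(RandomForestClassifier(n_estimators=100, random_state=0),
          n_features_to_select=10)
rfe.fit(X, y)

print(rfe.support_)   # boolean mask of the selected features
print(rfe.ranking_)   # 1 = selected; higher rank = eliminated earlier
```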
What we really want is to increase the cross-validation score, so we should eliminate a feature (or features), retrain, and observe the change in the cross-validation score. We can then compare the two score distributions and decide whether or not that feature (or features) is useful.
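A rough sketch of that idea, not a final design: drop one feature at a time, re-run cross-validation, and compare the score distribution with and without it. The estimator, the CV settings, and the decision rule (a simple mean comparison here) are all assumptions.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)
model = RandomForestClassifier(n_estimators=100, random_state=0)

# Baseline distribution of CV scores with all features present.
baseline = cross_val_score(model, X, y, cv=10)

for i in range(X.shape[1]):
    # Retrain and re-score with feature i removed.
    reduced = cross_val_score(model, np.delete(X, i, axis=1), y, cv=10)
    # If the scores do not get worse without the feature, it is
    # probably not contributing to generalization.
    print(f"feature {i}: baseline mean {baseline.mean():.3f}, "
          f"without it {reduced.mean():.3f}")
```

A proper version would compare the two score distributions (e.g. with a paired test) rather than just their means, and would handle eliminating groups of features at once.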