
Evaluate Feature Importances #13

Open
gngdb opened this issue Nov 8, 2014 · 1 comment
gngdb commented Nov 8, 2014

Need to be able to plot feature importances as an option. Unfortunately, only some models expose importances, such as Random Forests, so we'll have to decide the best way to do this.
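For models that do expose importances, a minimal sketch looks like the following. The dataset, model settings, and feature indices here are illustrative assumptions, not the project's actual data; the key point is that a fitted Random Forest carries a `feature_importances_` attribute (normalised to sum to 1) that can be sorted and fed to a bar plot.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in data: 6 features, 3 of them informative.
X, y = make_classification(n_samples=200, n_features=6, n_informative=3,
                           random_state=0)

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# feature_importances_ sums to 1; sort descending for plotting.
importances = clf.feature_importances_
order = np.argsort(importances)[::-1]
for i in order:
    print("feature %d: importance %.3f" % (i, importances[i]))
```

Models without this attribute (e.g. SVMs with non-linear kernels) would need a different mechanism, which is part of what this issue has to decide.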


gngdb commented Nov 9, 2014

The best way to do this is probably recursive feature elimination. However, the RFE implemented in scikit-learn probably won't work for us: it ranks features by the weights or importances learned by the model, and those may not track the cross-validation score because of generalization problems.

What we really want is to increase the cross-validation score, so we should eliminate a feature (or features), retrain, and observe the change in the cross-validation score. We can then compare the two distributions of fold scores and decide whether or not that feature (or features) is useful.
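The scheme above can be sketched as follows. This is a hypothetical illustration, not the project's implementation: the dataset, classifier, and fold count are assumptions, and it only summarises the two score distributions by their mean difference (a paired test over folds would be a more careful comparison). It uses the current `sklearn.model_selection.cross_val_score` API.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in data: 5 features, 2 of them informative.
X, y = make_classification(n_samples=300, n_features=5, n_informative=2,
                           random_state=0)
clf = RandomForestClassifier(n_estimators=30, random_state=0)

# Baseline: one cross-validation score per fold, using all features.
baseline = cross_val_score(clf, X, y, cv=5)

for j in range(X.shape[1]):
    # Eliminate feature j, retrain via cross-validation on the rest.
    reduced = np.delete(X, j, axis=1)
    scores = cross_val_score(clf, reduced, y, cv=5)
    # Compare the two distributions of fold scores; a large drop suggests
    # feature j was useful, no change (or a gain) suggests it was not.
    print("without feature %d: mean score change %+.3f"
          % (j, scores.mean() - baseline.mean()))
```

Repeating this greedily (drop the least useful feature, re-baseline, repeat) would give an RFE-like loop driven by the cross-validation score rather than by model weights.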
