forked from dmlc/xgboost
Home
kalenhaha edited this page Mar 20, 2014 · 80 revisions
XGBoost (short for eXtreme Gradient Boosting) is an efficient general-purpose gradient boosting (tree) library. Via simple configuration, we can use different boosting models and objective functions to fit real-world data. For a quick start, refer to the demos for regression and binary classification.
The project has three layers. The listed files are commented headers that are useful for working with each layer.
- Booster core interface: booster/xgboost.h, booster/xgboost_data.h
  - Provides an interface to a single gradient booster; all implementations hide behind this interface.
  - Use this interface to add new booster implementations, or to create a booster for specific tasks.
- Booster ensemble base class: booster/xgboost_gbmbase.h
  - Provides a base class with useful code for booster ensembles, including the buffering scheme.
  - Use this class to create a customized learner with a self-defined loss function. Take class GBMBaseModel: use Predict to get predictions, calculate the gradient and second-order gradient, then pass the statistics back to DoBoost to update the model.
- Booster task wrappers (TODO): regression, rank
  - Provide direct wrappers for specific learning tasks.