Releases: ray-project/xgboost_ray

xgboost_ray-0.0.5

26 Apr 10:44
2b392d4
  • Added distributed callbacks that are invoked before/after training and data loading (#71); see the sketch after this list
  • Improved fault tolerance testing and benchmarking (#72)
  • Placement group fixes (#74)
  • Improved warnings/errors when using incompatible APIs (#76, #82, #84)
  • Enhanced compatibility with XGBoost 0.90 (legacy) and XGBoost 1.4 (#85, #90)
  • Better testing (#72, #87)
  • Minor bug/API fixes (#78, #83, #89)
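
A minimal sketch of what such a distributed callback could look like. The import path, hook names, and the `distributed_callbacks` argument to `RayParams` are assumptions based on the description of #71, not a confirmed API; consult the repository docs for the actual interface.

```python
from xgboost_ray import RayParams

# Hypothetical sketch: the import path, hook names, and registration
# below are assumptions based on the #71 description, not confirmed API.
from xgboost_ray.callback import DistributedCallback


class ActorLoggingCallback(DistributedCallback):
    """Logs progress on each remote training actor."""

    def before_data_loading(self, actor, data, *args, **kwargs):
        print(f"Actor {actor.rank}: loading data")

    def after_data_loading(self, actor, data, *args, **kwargs):
        print(f"Actor {actor.rank}: data loaded")

    def before_train(self, actor, *args, **kwargs):
        print(f"Actor {actor.rank}: starting training")

    def after_train(self, actor, result_dict, *args, **kwargs):
        print(f"Actor {actor.rank}: training finished")


# Assumed registration point for distributed callbacks.
ray_params = RayParams(num_actors=2, distributed_callbacks=[ActorLoggingCallback()])
```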

xgboost_ray-0.0.4

18 Mar 20:45
147d15c
  • Added GCS support for Petastorm data loading (#63)
  • Enforced that labels are set for training/evaluation data (#64)
  • Refactored the data loading structure, making it easier to add or change data loading backends (#66)
  • Added distributed, locality-aware data loading for Modin dataframes (#67); see the sketch after this list
  • Documentation cleanup (#68)
  • Fixed RayDeviceQuantileDMatrix usage (#69)
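
A minimal sketch of feeding a Modin dataframe to xgboost_ray as of this release; the file name and the "target" label column are placeholders.

```python
import modin.pandas as mpd
from xgboost_ray import RayDMatrix, RayParams, train

# Placeholder dataset and label column.
df = mpd.read_csv("data.csv")

# The Modin dataframe is sharded across the training actors; with
# locality-aware loading, partitions are preferably assigned to actors
# running on the nodes that already hold them.
dtrain = RayDMatrix(df, label="target")

bst = train(
    {"objective": "binary:logistic"},
    dtrain,
    num_boost_round=10,
    ray_params=RayParams(num_actors=4),
)
```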

xgboost_ray-0.0.3

22 Feb 09:02
d5bca66
  • Added Petastorm integration (#46)
  • Improved Tune integration (#54, #55); see the sketch after this list
  • Fixed and improved tests (#40, #42, #47, #50, #52, #56, #60)
  • Added compatibility with the Ray client (#57)
  • Improved fault tolerance handling (#59)
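
A minimal sketch of running xgboost_ray under Ray Tune; the dataset path, label column, search space, and actor counts are placeholders, and the exact reporting/resource mechanics of the integration in #54/#55 may differ.

```python
from ray import tune
from xgboost_ray import RayDMatrix, RayParams, train


def train_model(config):
    # Placeholder dataset and label column.
    dtrain = RayDMatrix("data.parquet", label="target")
    train(
        config,
        dtrain,
        evals=[(dtrain, "train")],
        num_boost_round=10,
        ray_params=RayParams(num_actors=2),
    )


# Sweep over the learning rate; metrics are reported to Tune from
# within xgboost_ray.train() by the Tune integration.
analysis = tune.run(
    train_model,
    config={
        "objective": "binary:logistic",
        "eta": tune.loguniform(1e-3, 1e-1),
    },
    num_samples=4,
)
```

In practice each trial also needs enough resources reserved for its distributed training actors, e.g. via Tune's per-trial resource settings.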

xgboost_ray-0.0.2

12 Jan 12:12
1f7e450

Fixed compatibility with Python 3.8

xgboost_ray-0.0.1

05 Jan 10:04
1acfc85

Initial version of XGBoost on Ray, featuring:

  • Distributed training and prediction support, tested on clusters of up to 600 nodes (see the sketch after this list)
  • Fault tolerance: restarting the whole run from the latest checkpoint if a node fails
  • Fault tolerance: automatically scaling down/up when nodes die or become available again
  • Data loading from various sources (CSV, Parquet, Modin dataframes, Ray MLDataset, pandas, numpy)
  • Seamless integration with Ray Tune
  • Initial Ray placement group support
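
A minimal sketch of distributed training and prediction, written against the current-style API (RayDMatrix/RayParams); the data paths, label column, and parameter values are placeholders, and the very first release may have exposed some of these options slightly differently.

```python
from xgboost_ray import RayDMatrix, RayParams, train, predict

# Placeholder data sources and label column.
dtrain = RayDMatrix("train.parquet", label="target")

ray_params = RayParams(
    num_actors=4,      # number of distributed training actors
    cpus_per_actor=2,  # CPUs reserved for each actor
)

evals_result = {}
bst = train(
    {"objective": "binary:logistic", "eval_metric": ["logloss", "error"]},
    dtrain,
    evals=[(dtrain, "train")],
    evals_result=evals_result,
    num_boost_round=10,
    ray_params=ray_params,
)

# Distributed prediction reuses the same actor configuration.
dtest = RayDMatrix("test.parquet")
preds = predict(bst, dtest, ray_params=ray_params)
```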