
This repo collects papers on supporting ML training/inference on heterogeneous GPU clusters, a less studied field.


9Tempest/awesome-ML-heterogeneous-gpu-papers


Background and Motivation

ML system papers targeting efficient training on heterogeneous clusters (clusters with different types of devices) are less common than those targeting homogeneous clusters (clusters with a single device type). However, there is growing interest in this area. The motivations for using heterogeneous clusters in distributed training are:

  1. For data centers, the use of heterogeneous GPUs is inevitable due to the short release cycle of new GPU architectures.
  2. For users, purchasing spot instances that combine whatever heterogeneous devices are available and cheap reduces both expense and the cost of failures: when one device type is lost to out-bidding (the bidding price falls below the spot price), training can continue on the other device types.

We have categorized the different challenges brought by heterogeneous devices, and the corresponding solutions (papers), in the following sections. If you have any papers to add, feel free to ping me ([email protected]).

Papers targeting inter-pipeline heterogeneity (each pipeline contains homogeneous devices, but device types differ across pipelines):

Main problem to solve: inter-pipeline heterogeneity leads to load imbalance.

Papers using batch distribution to balance the workload among pipelines
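
As a rough illustration of the batch-distribution idea (a minimal sketch, not taken from any specific paper; the function name and throughput numbers are made up): the global batch is split across pipelines in proportion to each pipeline's measured throughput, so all pipelines finish an iteration at roughly the same time.

```python
def split_global_batch(global_batch: int, throughputs: list[float]) -> list[int]:
    """Split a global batch across pipelines in proportion to their
    measured throughput (samples/sec), so that faster pipelines get
    more samples and all pipelines finish at roughly the same time."""
    total = sum(throughputs)
    # Proportional share, rounded down; leftover samples go to the fastest pipelines.
    shares = [int(global_batch * t / total) for t in throughputs]
    remainder = global_batch - sum(shares)
    for i in sorted(range(len(throughputs)), key=lambda k: -throughputs[k])[:remainder]:
        shares[i] += 1
    return shares

# Example: one pipeline is ~2x faster than the other.
print(split_global_batch(96, [2.0, 1.0]))  # -> [64, 32]
```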

Papers using decentralized synchronization to improve overall throughput
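
These approaches replace a global all-reduce, whose per-step speed is bounded by the slowest participant, with local exchanges among neighboring workers. A minimal gossip-averaging sketch (the ring topology and all names here are illustrative assumptions, not any particular paper's algorithm):

```python
import numpy as np

def gossip_step(params: list[np.ndarray], neighbors: dict[int, list[int]]) -> list[np.ndarray]:
    """One round of decentralized averaging: each worker averages its
    parameters with its neighbors' instead of waiting on a global
    all-reduce, so a slow worker only delays its own neighborhood."""
    new_params = []
    for i, p in enumerate(params):
        group = [p] + [params[j] for j in neighbors[i]]
        new_params.append(sum(group) / len(group))
    return new_params

# Example: 4 workers on a ring topology.
params = [np.full(3, float(i)) for i in range(4)]
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
print(gossip_step(params, ring))
```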

Papers targeting intra-pipeline heterogeneity (a pipeline contains heterogeneous devices):

Main problem to solve: within a pipeline, the optimal layer-assignment problem on heterogeneous devices is NP-hard with respect to the number of device types.
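
One natural building block is the easier subproblem where the device order is fixed: contiguous runs of layers are assigned to devices by dynamic programming so that the bottleneck stage is as fast as possible; the hardness enters when also searching over device assignments. A minimal sketch of that DP (layer costs and device speeds are made-up illustrative numbers):

```python
def partition_layers(layer_costs: list[float], device_speeds: list[float]) -> float:
    """For a FIXED device order, assign contiguous layer ranges to devices
    so that the slowest stage (the pipeline bottleneck) is minimized.
    dp[i][d] = best achievable bottleneck time when the first i layers
    are placed on the first d devices."""
    n, m = len(layer_costs), len(device_speeds)
    prefix = [0.0]
    for c in layer_costs:
        prefix.append(prefix[-1] + c)
    INF = float("inf")
    dp = [[INF] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    for d in range(1, m + 1):
        for i in range(n + 1):
            for j in range(i + 1):  # layers j..i-1 go on device d-1
                stage = (prefix[i] - prefix[j]) / device_speeds[d - 1]
                dp[i][d] = min(dp[i][d], max(dp[j][d - 1], stage))
    return dp[n][m]

# Example: 6 layers split across a fast device followed by a slow one.
print(partition_layers([4, 4, 2, 2, 1, 1], [2.0, 1.0]))  # -> 5.0
```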

Other papers targeting heterogeneous clusters:
