Batch-parallel K-means and K-medians #1288
Conversation
Thank you for the PR!
Codecov Report
Attention: Patch coverage is
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #1288      +/-   ##
==========================================
+ Coverage   91.92%   92.06%   +0.14%
==========================================
  Files          79       80       +1
  Lines       11721    11941     +220
==========================================
+ Hits        10774    10993     +219
- Misses        947      948       +1
Flags with carried forward coverage won't be shown. ☔ View full report in Codecov by Sentry.
… for storing the value of the K-Clustering functional after predict
… is now also possible in a hierarchical manner that scales better to high numbers of processes
@mrfh92 this is great.
Would it be possible or advisable to make batch-parallel clustering available from within a general ht.cluster.KMeans or ht.cluster.KMedians call? I'm thinking something like:

    kmeans = ht.cluster.KMeans(n_clusters=k, parallel="batch")

where parallel can be "batch" or "standard".
Looks great to me @mrfh92, I only have a few edits to the documentation. Otherwise we can merge.
Co-authored-by: Claudia Comito <[email protected]>
Co-authored-by: Claudia Comito <[email protected]>
Co-authored-by: Claudia Comito <[email protected]>
@ClaudiaComito I have committed all suggested changes.
👏 @mrfh92 !
Due Diligence
main for new features, latest release branch (e.g. release/1.3.x) for bug fixes

Description
This PR implements a variant of K-means that can take advantage of Heat's data distribution structure. The algorithm is described in the paper https://doi.org/10.1016/j.cie.2020.107023. In essence, the idea is to perform K-means on each chunk of the data separately, in parallel, as a first step. In the second step, the obtained cluster centers are "merged": the set of all process-local cluster centers is collected, and another application of K-means determines the final "global" cluster centers as the cluster centers of this set of "local" centers.
Of course, this idea generalizes to K-medians in a straightforward way. To keep the approach scalable even to very large numbers of processes, the "merging" can also be done in a hierarchical manner.
Caveat: This does not necessarily yield the same results as the classical K-Means.
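To make the two-step scheme concrete, here is a conceptual single-process sketch (not the code added in this PR; scikit-learn stands in for the local solver only to keep the example short):

    import numpy as np
    from sklearn.cluster import KMeans

    k = 4
    rng = np.random.default_rng(0)
    data = rng.standard_normal((10000, 3))
    chunks = np.array_split(data, 8)  # stand-in for the process-local chunks of a distributed array

    # Step 1: local K-means on every chunk (in the distributed setting this runs in parallel)
    local_centers = np.vstack(
        [KMeans(n_clusters=k, n_init=10).fit(c).cluster_centers_ for c in chunks]
    )

    # Step 2: "merge" by clustering the collected local centers to obtain the global centers
    global_centers = KMeans(n_clusters=k, n_init=10).fit(local_centers).cluster_centers_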
Issue/s resolved: #1287
Changes proposed:
heat.cluster.BatchParallelKMeans and heat.cluster.BatchParallelKMedians (see the usage sketch below)
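A hedged usage sketch of the new estimators; everything beyond the class names (the scikit-learn-like fit/predict interface, ht.random.randn for generating distributed test data) is assumed here rather than taken from the PR diff:

    import heat as ht

    # distributed test data, split along the sample axis
    x = ht.random.randn(100000, 3, split=0)

    # assumed sklearn-like interface: fit the model, then predict labels for each sample
    bpkm = ht.cluster.BatchParallelKMeans(n_clusters=4)
    bpkm.fit(x)
    labels = bpkm.predict(x)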
Type of change
new feature
Memory requirements
TBD
Performance
Example with ~130 GB of data on up to 8 GPU nodes of the Terrabyte-Cluster. The left plot shows standard K-means (4 clusters); the right plot shows BatchParallelKMeans (4 and 40 clusters).
Does this change modify the behaviour of other functions? If so, which?
no