TypeError in check_for_empty_cluster #2

Open
kopytin opened this issue Mar 5, 2018 · 1 comment
kopytin commented Mar 5, 2018

Hello,

I am getting a TypeError in the current version of this module. Whether it appears depends on the number of clusters I request: on the same dataset, with 2 clusters requested I never see this error, with 4 clusters I see it sometimes, and with 10 I always see it.

File "/usr/local/lib/python3.5/dist-packages/pyspark_kmodes/pyspark_kmodes.py", line 430, in fit
self.n_clusters,self.max_dist_iter)
File "/usr/local/lib/python3.5/dist-packages/pyspark_kmodes/pyspark_kmodes.py", line 271, in k_modes_partitioned
clusters = check_for_empty_cluster(clusters, rdd)
File "/usr/local/lib/python3.5/dist-packages/pyspark_kmodes/pyspark_kmodes.py", line 315, in check_for_empty_cluster
partition_sizes = cluster_sizes[n_clusters*(partition_index):n_clusters*(partition_index+1)]
TypeError: slice indices must be integers or None or have an index method

This is Spark 2.2.
Any ideas would be appreciated.
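For readers hitting the same thing: the exception itself is just Python 3 refusing to slice a list with float bounds, so it shows up whenever one of the factors in the slice at line 315 ends up a float (for example, if it was produced by true division `/` somewhere upstream). Below is a minimal sketch of that mechanism and the usual `int()` / `//` coercion; the variable names mirror the traceback, but the values are illustrative and not taken from the library.

```python
# Minimal reproduction of the reported error: in Python 3, list slices
# reject float bounds, and true division (/) always produces a float.
cluster_sizes = list(range(40))   # stand-in for the real per-cluster counts
n_clusters = 10
partition_index = 8 / 4           # 2.0 -- a float, even though it "looks" whole

try:
    cluster_sizes[n_clusters * partition_index:n_clusters * (partition_index + 1)]
except TypeError as err:
    print(err)  # slice indices must be integers or None or have an index method

# Coercing the index back to an integer (or using // upstream) avoids the error.
partition_index = int(partition_index)
partition_sizes = cluster_sizes[n_clusters * partition_index:n_clusters * (partition_index + 1)]
print(partition_sizes)  # [20, 21, ..., 29]
```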


cinqs commented Nov 6, 2018

Hey, did you try replacing n_partitions with n_clusters?

As I explained here:
#3

It seems this repo is no longer maintained by anyone...
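Since the repository looks abandoned, anyone affected will probably have to patch their installed copy by hand. Before doing that, it is easy to confirm what the installed function actually does; this small check uses only the standard library plus the module and function names shown in the traceback above (the exact change to apply is the one discussed in #3).

```python
import inspect
import pyspark_kmodes.pyspark_kmodes as pk

# Print the installed source of the function named in the traceback, so you can
# see how the slice bounds near line 315 are computed (e.g. whether a true
# division `/` leaves them as floats) before editing the file locally.
print(inspect.getsource(pk.check_for_empty_cluster))
print(inspect.getsourcefile(pk.check_for_empty_cluster))  # path of the file to patch
```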
