Performance not scaling with multithreading #15874
Comments
Thanks @andreasnoack; also, this is segfaulting for
Just adding versioninfo() for reference:
I think this is not a dup. @abhijithch Didn't you use an older commit on julia master? Could you point out which commit you used, and is it possible to bisect and find where things broke?
Oops - ignore. I see @ranjanan did post the commit.
Cc @tanmaykm
How many cores?
30 cores with 2 hardware threads per core.
There appear to be 3 different things here.
Given that each user will have different interactions with the item set, there shouldn't be much cache thrashing. You can randomly permute the rows and columns of the R matrix and check if it changes performance. At each iteration, a submatrix is selected from R to work on.
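A minimal sketch of that permutation check, assuming a stand-in sparse ratings matrix rather than the actual data loaded by the recommender example:

```julia
# Sketch of the permutation check suggested above. `R` is a placeholder
# sparse user x item ratings matrix, not the one used by the example;
# only the permutation idea is illustrated.
using SparseArrays, Random

R = sprand(10_000, 5_000, 0.01)    # stand-in ratings matrix

rowperm = randperm(size(R, 1))     # random permutation of users
colperm = randperm(size(R, 2))     # random permutation of items
Rperm   = R[rowperm, colperm]      # same entries, randomized layout

# Run the same threaded step on R and on Rperm and compare timings; similar
# numbers would suggest cache thrashing from data layout is not the bottleneck.
```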
On the commit
@ViralBShah: what I meant was, is it possible that each thread is bringing too much data into the cache and evicting other threads' data?
I want to try multi-threading, but when I type nthreads() in the REPL, the
@pingzou Just start Julia with e.g.
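The exact command in the comment above is cut off, but the usual approach is to set the thread count when starting Julia and then check it from the REPL, along these lines:

```julia
# Start Julia with threads enabled, e.g. `JULIA_NUM_THREADS=4 julia`
# (newer releases also accept `julia --threads 4`), then verify:
using Base.Threads
nthreads()    # returns the number of threads Julia was started with
```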
@abhijithch Can you try this again and see where we are?
Closing due to inactivity.
While running an example from the recommender system package on the MovieLens data, the performance of the multithreaded parallel version does not seem to scale well.
To run the example, download the dataset from here, unzip the files to any folder, clone this repository, and run the movielens example in Julia built with multithreading, calling the test_thread(dataset_path) method in the example script.
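A rough sketch of those reproduction steps; the script path and dataset folder below are placeholders based on the description, not verified paths from the repository:

```julia
# Hypothetical reproduction outline: paths are assumptions, not verified
# against the actual repository layout.
include("examples/movielens.jl")      # the movielens example script from the cloned repo

dataset_path = "/path/to/ml-dataset"  # folder where the downloaded files were unzipped
test_thread(dataset_path)             # multithreaded entry point mentioned above
```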