When downloading all likes or a playlist, I think the program would benefit from a multithreaded approach. Something like adding a `--num-threads n` option that would partition the list into n parts.

Of course, this would create a bit of a data race on the already-downloaded file, but that could be solved by creating n tempfiles that are later appended to the already-downloaded file (SoundCloud does not allow duplicates in playlists, so the per-thread lists never overlap).
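A minimal sketch of that idea, assuming a Python downloader: `download_track` is a hypothetical stand-in for the real per-track download logic, and the archive file name is illustrative. Each worker records its finished tracks in a private tempfile, and the tempfiles are appended to the shared archive only after all threads finish, so no two threads ever write the archive concurrently.

```python
import math
import os
import tempfile
from concurrent.futures import ThreadPoolExecutor


def download_track(track):
    """Hypothetical placeholder for the real per-track download logic."""
    pass


def partition(items, n):
    """Split items into at most n roughly equal, contiguous chunks."""
    size = math.ceil(len(items) / n)
    return [items[i:i + size] for i in range(0, len(items), size)]


def worker(chunk, archive_path):
    """Download one chunk, recording finished tracks in a private tempfile."""
    with open(archive_path, "w") as archive:
        for track in chunk:
            download_track(track)
            archive.write(track + "\n")


def download_all(tracks, num_threads, main_archive):
    chunks = partition(tracks, num_threads)
    # one private tempfile per chunk, so workers never share a file handle
    paths = []
    for _ in chunks:
        fd, path = tempfile.mkstemp(suffix=".txt")
        os.close(fd)
        paths.append(path)
    with ThreadPoolExecutor(max_workers=num_threads) as pool:
        futures = [pool.submit(worker, c, p) for c, p in zip(chunks, paths)]
        for f in futures:
            f.result()  # surface any worker exception
    # single-threaded merge into the shared already-downloaded file
    with open(main_archive, "a") as main:
        for path in paths:
            with open(path) as tf:
                main.write(tf.read())
            os.remove(path)
```

Since duplicates cannot occur within a playlist, the merged archive needs no deduplication step.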
I'm not sure if the files are large enough to really make a difference, but another idea along those same lines is HTTP range requests: if the server supports RFC 7233 (Partial Content), split the file into byte ranges, download the ranges concurrently, and merge them into the destination file.