
How to support num_workers > batch_size? #10

Open
@kaixinbear

Description


Hi, etienne87:
I noticed that this repo clamps num_workers to be no larger than batch_size (https://github.com/etienne87/pytorch-stream-dataloader/blob/master/pytorch_stream_dataloader/stream_dataloader.py#L45).
However, this makes the dataloader much slower than the vanilla PyTorch dataloader, which lengthens training time.
Do you have any suggestions on how to support num_workers the way the original PyTorch dataloader does?
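For readers unfamiliar with the issue: a sketch of the clamping behavior being described. This is a hypothetical illustration (the function name `split_batch_size` and its signature are mine, not from the repo) of why a stream loader caps workers: each worker owns a contiguous slice of the batch, so there cannot be more workers than batch entries.

```python
def split_batch_size(batch_size, num_workers):
    """Hypothetical sketch: assign each worker a slice of the batch.

    Because every worker must own at least one stream (one batch slot),
    num_workers is clamped to batch_size -- the behavior the issue
    reports as slower than the vanilla PyTorch DataLoader.
    """
    # Clamp: a worker with zero streams would produce nothing.
    num_workers = min(num_workers, batch_size)

    # Distribute batch slots as evenly as possible across workers.
    split_sizes = [batch_size // num_workers] * num_workers
    for i in range(batch_size % num_workers):
        split_sizes[i] += 1
    return num_workers, split_sizes


# e.g. requesting 8 workers with batch_size 4 silently falls back to 4:
print(split_batch_size(4, 8))   # -> (4, [1, 1, 1, 1])
print(split_batch_size(5, 2))   # -> (2, [3, 2])
```

With a map-style dataset, extra workers can prefetch future batches in parallel; here each worker must emit its slice of every batch in order, so the clamp is a consequence of the streaming design rather than an arbitrary limit.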
