
S3 downloader has a hardcoded limit of 100 files #85

Open
slik13 opened this issue Apr 16, 2024 · 0 comments

Currently the S3 downloader has a hardcoded limit of 100 files (see: https://github.com/kserve/modelmesh-runtime-adapter/blob/2d5bb69e9ed19efd74fbe6f8b76ec2e970702e3c/pullman/storageproviders/s3/downloader.go#L79C3-L79C27).

This means that any model containing more than 100 files is silently truncated at that limit: the model is only partially copied and then fails at runtime. For example, a model like argos-translate with many languages can easily exceed the 100-file limit.

1.7132796006805687e+09	DEBUG	Triton Adapter.Triton Adapter Server	found objects to download	{"type": "s3", "cacheKey": "s3|xyz", "path": "translate", "count": 100}

This limit seems arbitrary and should be made configurable.
