Is it possible to use an S3 bucket when the model won't fit directly into a Lambda layer? ResNet18 is quite small and not typical of DL models these days.
Yes, S3 is possible, but you would have to build your own torchlambda Docker image (see the torchlambda build command documentation; specifically, --aws-components "s3" should be passed during the build).
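A rough sketch of the build invocation (the --aws-components flag is from the torchlambda build docs; the source directory path here is just a placeholder):

```shell
# Build a custom deployment image that links the S3 component of the
# AWS SDK alongside the defaults; "./my_source" is a hypothetical path
# to your torchlambda source/template directory.
torchlambda build ./my_source --aws-components "s3"
```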
You would also have to modify the example C++ template source code to use Aws::S3::S3Client. The C++ template can be created with torchlambda template, which generates human-readable C++ code for further extension. An example of using Aws::S3::S3Client can be seen here; admittedly, this code can be a lot of trouble.
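For orientation, a minimal sketch of what the Aws::S3::S3Client usage inside the template might look like: downloading the model into Lambda's /tmp (the only writable path in the Lambda environment). This is not torchlambda's actual template code; it assumes the AWS SDK for C++ is linked (as in an image built with --aws-components "s3"), and the bucket/key names are placeholders.

```cpp
#include <aws/core/Aws.h>
#include <aws/s3/S3Client.h>
#include <aws/s3/model/GetObjectRequest.h>

#include <fstream>
#include <iostream>

int main() {
    Aws::SDKOptions options;
    Aws::InitAPI(options);
    {
        // Uses the default credential chain (Lambda's execution role)
        Aws::S3::S3Client client;

        Aws::S3::Model::GetObjectRequest request;
        request.SetBucket("my-model-bucket");  // placeholder bucket
        request.SetKey("model.ptc");           // placeholder key

        auto outcome = client.GetObject(request);
        if (outcome.IsSuccess()) {
            // Stream the object body to /tmp, then torch::jit::load it
            std::ofstream file("/tmp/model.ptc", std::ios::binary);
            file << outcome.GetResult().GetBody().rdbuf();
        } else {
            std::cerr << outcome.GetError().GetMessage() << std::endl;
        }
    }
    Aws::ShutdownAPI(options);
    return 0;
}
```

Note that the download happens on every cold start, so cold-start latency grows with model size.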
Currently there is no .yml option to do so, and I am open to pull requests; adding this part would be especially great. If you are up for it, I will guide you and help if needed. Also remember you might be limited by the RAM provided by Lambda instances, which is 3GB IIRC.
Another option would be to split your model into 4 parts of around 50MB each and call each on the output of the previous one. This would also require custom C++ code; I would be more than happy with such a PR as well (and this approach should be easier to code). It would let users spread a single model across multiple Lambda layers, raising the maximum model size to around 200MB.
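The split-model idea above can be sketched structurally as chaining each part's output into the next. This is a toy illustration only: in the real handler each "part" would be a separate TorchScript file shipped in its own Lambda layer and loaded with torch::jit::load, with Stage wrapping that module's forward; the Tensor/Stage/run_pipeline names here are hypothetical.

```cpp
#include <cassert>
#include <functional>
#include <vector>

// Stand-ins for torch::Tensor and a loaded model part's forward pass
using Tensor = std::vector<float>;
using Stage  = std::function<Tensor(const Tensor&)>;

// Feed each part's output into the next part, in order —
// exactly the composition the split-model approach relies on.
Tensor run_pipeline(const std::vector<Stage>& parts, Tensor input) {
    for (const auto& part : parts) {
        input = part(input);
    }
    return input;
}
```

The memory benefit comes from each part being small enough for one layer; the call order must match the order the model was split in.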