Kaggle_notebook_issue #1116
Unanswered
wissemkarous asked this question in General
My data is hosted on Kaggle, and I have moved my notebook from Kaggle to GitHub. I want to use GitHub Actions for automated deployment, but I am running into issues in my workflow YAML file. I have tried using the Kaggle API to download the data during deployment, but the dataset is quite large (over 9 GB), which has proven challenging. Are there alternative approaches or solutions to handle the data transfer efficiently during deployment through GitHub Actions?
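For context, the approach described (Kaggle API download inside the workflow) could be sketched roughly as below, with `actions/cache` added so the 9 GB download only happens when the cache misses. The dataset slug, cache key, and secret names are placeholders, and note that GitHub's cache has a 10 GB per-repository limit, so a 9 GB dataset fits only tightly:

```yaml
name: deploy
on: [push]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      # Restore the dataset from the Actions cache if a previous run stored it.
      - name: Cache Kaggle dataset
        id: dataset-cache
        uses: actions/cache@v4
        with:
          path: data/
          key: kaggle-dataset-v1  # bump the suffix to force a re-download

      # Only hit the Kaggle API when the cache missed.
      - name: Download dataset from Kaggle
        if: steps.dataset-cache.outputs.cache-hit != 'true'
        env:
          KAGGLE_USERNAME: ${{ secrets.KAGGLE_USERNAME }}
          KAGGLE_KEY: ${{ secrets.KAGGLE_KEY }}
        run: |
          pip install kaggle
          # <owner>/<dataset-slug> is a placeholder for the real dataset
          kaggle datasets download -d <owner>/<dataset-slug> -p data/ --unzip
```

Because of the cache size limit and runner disk constraints, alternatives worth weighing are downloading only the files actually needed (`kaggle datasets download -f <file>`), or keeping the data out of the workflow entirely and having the deployed service fetch it at runtime from external storage.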