User data quota cron job #177
Thanks Austin. Once we determine our policy, let's also update the DANDI Handbook to add this information.
After chatting with @yarikoptic, we probably should not be too ambitious with a cron job. All of the above should be run only prior to a migration (or kicked off manually if we want to reclaim some space). There may still be a good use for a cron job: cleaning up files older than X in a provided scratch dir.
I think we should indeed not clean up anything automatically, especially since we are not tracking "access time" but only "modification time" on files, so we can't judge whether anything is still in use or not. What a cron job should do, per user, is:
Something like that? Please prepare a design doc, @asmacdo, with the above as a PR so we can iron out the behavior and then add it to the ToS etc.
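Since the per-user list is still to be worked out in the design doc, here is only a minimal sketch of one plausible piece: a per-user disk usage report over a shared home root. The `usage_report` / `dir_usage_bytes` names and the layout of one directory per user under `home_root` are assumptions for illustration, not anything agreed in this thread.

```python
import os


def dir_usage_bytes(path):
    """Total size in bytes of all regular files under path."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            fp = os.path.join(root, name)
            try:
                total += os.lstat(fp).st_size
            except OSError:
                pass  # file vanished mid-walk; skip it
    return total


def usage_report(home_root):
    """Map each user directory under home_root to its usage in bytes."""
    report = {}
    for entry in sorted(os.listdir(home_root)):
        user_dir = os.path.join(home_root, entry)
        if os.path.isdir(user_dir):
            report[entry] = dir_usage_bytes(user_dir)
    return report
```

A cron entry could run this periodically and mail or log the report; the deletion policy itself would stay manual, per the comment above.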
Awesome, thanks @yarikoptic
This information is kept, but it's tracked by JupyterHub itself. I'll look into connecting to the REST API directly.
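For the REST API route, a minimal sketch: JupyterHub exposes `GET /hub/api/users`, authenticated with an `Authorization: token <api-token>` header, and each returned user model includes a `last_activity` field. The hub URL below and the helper names are assumptions; the endpoint and auth scheme are from the JupyterHub REST API.

```python
import json
from urllib.request import Request, urlopen

HUB_API = "http://127.0.0.1:8081/hub/api"  # assumed hub address


def fetch_users(api_token, hub_api=HUB_API):
    """GET /hub/api/users; requires a suitably scoped (admin) API token."""
    req = Request(hub_api + "/users",
                  headers={"Authorization": "token " + api_token})
    with urlopen(req) as resp:
        return json.load(resp)


def last_activity_by_user(users):
    """Map username -> last_activity timestamp (may be None)."""
    return {u["name"]: u.get("last_activity") for u in users}
```

`last_activity` reflects hub/server activity, not filesystem access, so it would complement (not replace) the mtime information discussed above.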
IMO, the choice of which files are safe to remove should come from the science side; it's hard to guess what is safe to remove.
Here's an initial list:
Let's also provide each user with a scratch directory that cleans up files more than 30 days old.
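The scratch-dir cleanup could be sketched as below: a cron-driven pass that deletes regular files whose mtime is older than the cutoff. The function name and the choice to leave directories and symlinks untouched are assumptions for illustration.

```python
import os
import time

MAX_AGE_DAYS = 30  # per the proposal above


def clean_scratch(scratch_dir, max_age_days=MAX_AGE_DAYS, now=None):
    """Delete regular files under scratch_dir whose mtime is older than
    max_age_days. Returns the list of deleted paths. Symlinks and
    (empty) directories are left in place."""
    now = time.time() if now is None else now
    cutoff = now - max_age_days * 86400
    deleted = []
    for root, _dirs, files in os.walk(scratch_dir):
        for name in files:
            fp = os.path.join(root, name)
            if os.path.islink(fp):
                continue
            try:
                if os.lstat(fp).st_mtime < cutoff:
                    os.remove(fp)
                    deleted.append(fp)
            except OSError:
                pass  # already gone or not removable; skip
    return deleted
```

Note this keys on modification time, consistent with the earlier point that access time is not tracked; a user can keep a scratch file alive simply by touching it.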