prefetch: use a separate temporary cache for prefetching #730
Draft: skshetry wants to merge 4 commits into main from prefetch-cache
Conversation
skshetry force-pushed the prefetch-cache branch from afae789 to 1266e4a on December 23, 2024 at 12:21
skshetry force-pushed the prefetch-cache branch from 1266e4a to 15c30fb on December 24, 2024 at 04:40
skshetry force-pushed the prefetch-cache branch from 15c30fb to 1b34bc0 on December 24, 2024 at 08:04
skshetry force-pushed the prefetch-cache branch from 1862bd0 to 90f1b7c on December 24, 2024 at 16:37
skshetry force-pushed the prefetch-cache branch from 90f1b7c to 0ee1da1 on December 24, 2024 at 19:16
Codecov Report. Attention: Patch coverage is …

| Coverage Diff | main | #730  | +/- |
|---------------|--------|--------|------|
| Coverage      | 87.47% | 87.48% |      |
| Files         | 114    | 115    | +1   |
| Lines         | 10941  | 11033  | +92  |
| Branches      | 1504   | 1509   | +5   |
| Hits          | 9571   | 9652   | +81  |
| Misses        | 992    | 999    | +7   |
| Partials      | 378    | 382    | +4   |

Flags with carried forward coverage won't be shown. View full report in Codecov by Sentry.
skshetry force-pushed the prefetch-cache branch from 0ee1da1 to 0ca4e5f on December 25, 2024 at 07:53
skshetry force-pushed the prefetch-cache branch from 0ca4e5f to 5770599 on December 25, 2024 at 08:03
Also did:
* minor refactor,
* removes the `nrows` _hack_, and
* disables prefetching when `nrows` is set, so that we don't download the whole dataset.
skshetry force-pushed the prefetch-cache branch from 5770599 to a58c8a3 on December 25, 2024 at 10:50
skshetry force-pushed the prefetch-cache branch from a58c8a3 to bb8cc22 on December 25, 2024 at 11:43
`dvc_objects.fs.utils.remove()` works for readonly files too.
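For context, removing read-only files generally requires clearing the read-only bit before unlinking. A minimal sketch of that general pattern (not the actual `dvc_objects` implementation), using only the standard library:

```python
import os
import shutil
import stat


def _force_remove(func, path, _exc_info):
    # Removal failed (e.g. the file is read-only): make it writable and retry.
    os.chmod(path, stat.S_IWRITE)
    func(path)


# Remove a cache directory that may contain read-only files; the path is illustrative.
shutil.rmtree(".datachain/tmp/prefetch-xxxxxx", onerror=_force_remove)
```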
skshetry force-pushed the prefetch-cache branch from bb8cc22 to acd168e on December 25, 2024 at 12:01
This PR uses a separate temporary cache for prefetching, located in a `.datachain/tmp/prefetch-<random>` directory, when `prefetch=` is set but `cache` is not. The temporary directory is deleted automatically after prefetching is done.
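Roughly, the lifecycle looks like the following minimal sketch (the helper name and the `download` callback are illustrative, not the actual DataChain internals):

```python
import shutil
import tempfile
from pathlib import Path


def prefetch_with_temp_cache(rows, download, datachain_dir=".datachain"):
    """Sketch: prefetch rows into a throwaway cache, then remove it."""
    tmp_root = Path(datachain_dir) / "tmp"
    tmp_root.mkdir(parents=True, exist_ok=True)
    # e.g. .datachain/tmp/prefetch-8f2k1c3n
    cache_dir = Path(tempfile.mkdtemp(prefix="prefetch-", dir=tmp_root))
    try:
        for row in rows:
            download(row, cache_dir)  # illustrative "download into cache" step
            yield row
    finally:
        # auto-cleanup: the temporary cache is removed once iteration finishes
        shutil.rmtree(cache_dir, ignore_errors=True)
```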
For `cache=True`, the cache will be reused and won't be deleted.

Please note that auto-cleanup does not work for PyTorch datasets because there is no way to invoke cleanup from the `Dataset` side. The `DataLoader` may still have cached data or rows even after the `Dataset` instance has finished iterating. As a result, values associated with a `catalog`/`cache` instance can outlive the `Dataset` instance. One potential solution is to implement a custom dataloader or provide a user-facing API.
In this PR, I have implemented the latter: `PytorchDataset` now includes a `close()` method, which can be used to clean up the temporary prefetch cache. E.g.:
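The original example was not captured on this page; below is a minimal sketch of how `close()` might be used, assuming a chain built with `DataChain.from_storage()` and `.settings(prefetch=...)` (both illustrative) and a standard `DataLoader` loop:

```python
from torch.utils.data import DataLoader

from datachain import DataChain

# Illustrative chain: prefetch enabled, cache not enabled, so a temporary
# prefetch cache under .datachain/tmp/prefetch-<random> would be used.
chain = DataChain.from_storage("s3://my-bucket/images/").settings(prefetch=4)

ds = chain.to_pytorch()  # PytorchDataset
try:
    for batch in DataLoader(ds, batch_size=16):
        ...  # training / inference loop
finally:
    ds.close()  # new in this PR: removes the temporary prefetch cache
```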