Suggestion for potential addition for better storage management. #9
I'm still going through the code, so I apologise if I've mentioned something the package already does!
You're right, there's nothing to implicitly delete old cached values against a given limit. As I use this package for one of my projects, I'm not too worried about the data size. I store many fairly big JSON entities over a one-year period, and the database is around 3-4 MB when fully populated. FYI, this package is designed for small/medium data, not for storing images and the like, as web browsers do. Despite this, each store calls This is a good suggestion. I'll think about it.
Ah, I didn't know it actually took only that much space!
Oh, so basically all stale values are deleted? I thought that was only done for a specific entry when it was checked and turned out to be stale. I did feel I must have missed something in the code.
Makes sense. Plus, if the package purges all stale data by itself anyway, I guess it will always stay a manageable size in most cases, as you mentioned.
Re-opening, this is still a valid request.
Idea 1: Maybe an option in the interceptor to specify the maximum number of items (or total size) the cache stores at a time. Currently it seems the package caches everything and never deletes it (even in the case mentioned below), unless a new response overwrites it, which could lead to uncomfortable cache sizes for some apps.
Idea 2: Delete the data from the cache if it comes up as expired when checked.
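A store combining both ideas could look roughly like the sketch below: a hard cap on entry count with least-recently-used eviction (Idea 1), plus deletion of an entry on read when it has expired (Idea 2). The class, method names, and entry layout are invented for illustration in Python and do not reflect the package's real interface.

```python
import time
from collections import OrderedDict

class CappedCacheStore:
    """Hypothetical store: evicts the least recently used entries once
    `max_items` is exceeded (Idea 1) and deletes an entry on read if it
    has expired (Idea 2)."""

    def __init__(self, max_items: int = 100):
        self.max_items = max_items
        # key -> (value, expires_at); most recently used entries last
        self._entries: OrderedDict[str, tuple[object, float]] = OrderedDict()

    def set(self, key: str, value: object, max_age: float) -> None:
        self._entries[key] = (value, time.time() + max_age)
        self._entries.move_to_end(key)
        while len(self._entries) > self.max_items:
            # Idea 1: drop the least recently used entry
            self._entries.popitem(last=False)

    def get(self, key: str):
        entry = self._entries.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if expires_at <= time.time():
            # Idea 2: stale on check, so delete it immediately
            del self._entries[key]
            return None
        self._entries.move_to_end(key)
        return value
```

Capping by total byte size instead of item count would follow the same shape, with the eviction loop keyed on a running size counter rather than `len()`.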
Whoops, sorry, I wrote this while half asleep and confused maxAge with maxStale.