
Is there a name for this pattern? #5

Open
jakearchibald opened this issue Sep 3, 2014 · 9 comments

Comments

@jakearchibald

1. Async request to cache
   i. If successful, then
      a. If we've already updated the page using data fetched from the network (2.i.b), abort these steps
      b. Render content with fetched data
2. Async request to network
   i. If successful, then
      a. Update cache with fetched data
      b. Render/update content with fetched data
   ii. Else,
      a. Depend on previous request to cache
      b. If unsuccessful, display error

If not, we should name it.
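The steps above can be sketched as plain promise logic. `readCache`, `writeCache`, and `fetchNetwork` are hypothetical stand-ins for whatever storage and transport the page actually uses; the guard flag implements step 1.i.a:

```javascript
// Hypothetical async sources standing in for a real cache and network.
const readCache = async () => ({ data: 'stale data' }); // may also resolve to null
const writeCache = async (data) => { /* persist for next visit */ };
const fetchNetwork = async () => ({ data: 'fresh data' }); // may reject

function cacheThenNetwork(render, showError) {
  let networkRendered = false;

  // 1. Async request to cache
  const cachePromise = readCache().then((cached) => {
    // 1.i.a: skip rendering if the network response got there first (2.i.b)
    if (cached && !networkRendered) render(cached.data);
    return cached;
  });

  // 2. Async request to network, racing the cache read
  return fetchNetwork()
    .then(async (fresh) => {
      await writeCache(fresh.data); // 2.i.a: update cache
      networkRendered = true;
      render(fresh.data);           // 2.i.b: render/update content
    })
    .catch(async () => {
      // 2.ii: fall back to the earlier cache request
      const cached = await cachePromise;
      if (!cached) showError();     // 2.ii.b: nothing cached either
    });
}
```

Note the cache render and the network render can land in either order in real conditions; the `networkRendered` flag is what prevents stale data from overwriting fresh data.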

@jakearchibald jakearchibald changed the title Is there a name for this pattern: Is there a name for this pattern? Sep 3, 2014
@michielbdejong

What does 1.i.a mean? Why consult the cache at all in this case?

I think we mostly need to name 2.ii.a; the rest of it would just be called "caching", I would say.

would you keep retrying the failed request to network?

@jakearchibald
Author

1.i.a prevents the page from being updated with cached data after it's already been updated with network data.

Yes, it's all caching, but it's a specific pattern where the page makes two requests, allowing cached data to be displayed first and then updated with fresh data from the network if needed.

> would you keep retrying the failed request to network?

Maybe.

@michielbdejong

Ah, ok. So the two requests always go in parallel? Doesn't that benefit very little from having the cache, then? Unless access to the cache is very slow, I would rather do:

  • Request latest cached data and its ETag from cache
  • Start a conditional network request using an If-None-Match header
  • Render the data from cache (if any)
  • Update cache and rerender if new data came in over the network
  • If no data was in the cache, and no data came in over the network either, display an error

That way at least you use the cache to save bandwidth. Maybe I misunderstood?
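That conditional variant can be sketched like this; the in-memory `cache` record, the `server` object, and `conditionalFetch` are made-up names simulating what a real `fetch(url, { headers: { 'If-None-Match': etag } })` round trip would do:

```javascript
// Hypothetical in-memory cache record and server state, for illustration.
const cache = { etag: '"v1"', data: 'cached data' };
const server = { etag: '"v2"', data: 'new data' };

// Simulated conditional request: 304 when the cached ETag still matches,
// otherwise a full 200 response with a fresh ETag.
async function conditionalFetch(ifNoneMatch) {
  if (ifNoneMatch === server.etag) return { status: 304 };
  return { status: 200, etag: server.etag, data: server.data };
}

async function etagCacheThenNetwork(render, showError) {
  // Start the conditional network request using the cached ETag (if any)
  const networkPromise = conditionalFetch(cache && cache.etag);

  // Render the data from cache (if any) while the request is in flight
  if (cache && cache.data) render(cache.data);

  const res = await networkPromise;
  if (res.status === 200) {
    cache.etag = res.etag; // update cache and rerender
    cache.data = res.data;
    render(res.data);
  } else if (res.status !== 304 && !(cache && cache.data)) {
    showError(); // nothing in the cache and nothing from the network
  }
}
```

A 304 costs only headers on the wire, which is where the bandwidth saving comes from.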

@jakearchibald
Author

That's another option, although we (Chrome) run into a lot of disk-access issues on Windows machines. Malware and virus scanners can make the disk slower than the network, so I'd rather race the two.

The fetch in step 2 would go via the browser's HTTP cache, so there'll be a bandwidth saving there.

@michielbdejong

ok, got it. I don't know of a name for that. Maybe you could label it something along the lines of 'hit pessimism'. You could also call it 'miss hedging' but people will keep asking you who Ms. Hedging is. ;)

@slorber

slorber commented Jun 13, 2015

I've thought about this too and would call it something like "cached preview"

@farskid

farskid commented Oct 22, 2016

Nice pattern @jakearchibald, I'd also go with "Cached Preview", as @slorber mentioned.

@PaulnOZ

PaulnOZ commented Apr 23, 2017

The Great Cache Race, or Jake's Multicast Request. I like Miss Hedging, but "cached preview" doesn't really describe the multiple requests, miss hedging, and updating that's going on.

@milahu

milahu commented Nov 7, 2022

> it's a specific pattern where the page makes two requests, allowing cached data to be displayed first and updating with updated data later if needed.

eager caching, lazy loading

obviously there should be UI feedback like "this is an old version, updating..."

edit: sorry for necrobumping. these repos should be archived ...

similar projects: https://github.com/topics/offline-first
