Implementing event back-pressure #173
Can you accomplish this by inverting control to the frontend and having it request one batch at a time, figuring out the correct parameters (batch size, time interval, whatever is appropriate for your application) each time? That seems like it would work with Reflex's primitives. From there, the main obstacle seems to be setting up the DOM primitives needed to calculate utilization / load average / responsiveness / etc.
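A minimal sketch of that inversion in core Reflex terms; the LogEntry alias and the "finished with the previous batch" event are placeholders, and the wiring of the incoming batches and the returned request event into an actual transport (e.g. a websocket) is left out:

```haskell
import Control.Monad.Fix (MonadFix)
import Reflex

type LogEntry = String   -- placeholder for whatever a log entry really is

-- The frontend drives the batching: it asks for the first batch on build
-- and asks for another one each time it signals that the previous batch
-- has been dealt with.
pullLoop
  :: (PostBuild t m, MonadHold t m, MonadFix m)
  => Event t [LogEntry]                    -- batches arriving from the server
  -> Event t ()                            -- "finished with the previous batch"
  -> m (Dynamic t [LogEntry], Event t ())  -- accumulated log, "send the next batch"
pullLoop eBatch eDoneProcessing = do
  -- Accumulate incoming batches into client-side state.
  dLog <- foldDyn (flip (++)) [] eBatch
  -- Request once at startup, then again after each completed batch.
  ePostBuild <- getPostBuild
  let eRequestNext = leftmost [ePostBuild, eDoneProcessing]
  pure (dLog, eRequestNext)
```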
Doug, this was my first step in solving this: basically the client sends a "need next batch" message on the websocket, the server sends the batch, and the client updates the dynamic state with the batch carried by the event generated by the websocket. The …
@ababkin The "waiting" that you're describing sounds like a very reasonable thing. Do you want the computation to be done during event propagation, or would it be better if it happened on a separate thread? If you're able to put together a simple example, I'll be happy to take a look and see how we can make it as elegant as possible.
@ababkin Could you not also have your processing function fire an event when it's done processing, and use that to trigger a new batch? EDIT: It sounds like your processing function is pure, so this would require an impure version that forces evaluation and fires an event when complete. You would use that event instead in your …
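A hedged sketch of such an impure, forcing variant, built on performEvent plus deepseq; the name forceProcess and the NFData constraint are illustrative, not anything provided by reflex itself:

```haskell
{-# LANGUAGE FlexibleContexts #-}
import Control.DeepSeq (NFData, force)
import Control.Exception (evaluate)
import Control.Monad.IO.Class (MonadIO, liftIO)
import Reflex

-- Wrap a pure processing function so that it runs in IO via performEvent:
-- the result is forced to normal form, and the returned Event fires only
-- after that evaluation has finished, which makes it a natural
-- "done processing, request the next batch" trigger.
forceProcess
  :: (PerformEvent t m, MonadIO (Performable m), NFData b)
  => (a -> b)      -- the pure processing function
  -> Event t a     -- e.g. an incoming batch of log entries
  -> m (Event t b) -- fires once the result is fully evaluated
forceProcess processPure eIn =
  performEvent $ fmap (liftIO . evaluate . force . processPure) eIn
```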
@3noch here is a PR for a function to do force evaluation, reflex-frp/reflex-dom#11 If this is what you were talking about and if its a useful API then we should merge it in |
@dfordivam Yes, I think this is precisely what I had in mind. Thanks for pointing it out!
I'm trying to implement a client-server app where the server essentially pushes a large number (~1M) of tailed log entries to the client, which consists of a lazy list and some filtering controls.
My issue is that, even though I've optimized things in the client, streaming that many messages reasonably quickly overwhelms it: CPU utilization grows as a function of the number of log entries already received (and, of course, of the streaming rate), gradually degrading the visible responsiveness of the client's UI.
One way I was able to remediate this was rate limiting on the server side and sending the logs in batches (1,000-10,000 every second). This is suboptimal, however, because a hardcoded rate limit / batch size may not be ideal on another (slower or faster) computer. So my next idea is to somehow make the client request the next batch once it has (completely or sufficiently) processed the previous one, i.e. have the client control its own ingest rate (while cruising at some comfortable CPU utilization, to maintain the responsiveness of the UI).
This has proved to be challenging, however, as I don't see a way in reflex to get any back-pressure signal from processing load and use it, for instance, to generate a particular event delay. How do people solve this problem currently?
(and please let me know if the above makes sense)
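Tying the hedged sketches from the comments above together, a client that controls its own ingest rate might look roughly like this. It assumes the imports and the pullLoop, forceProcess, and LogEntry placeholders from those sketches are in scope, and that the incoming batch event and the returned request event are wired to the websocket's receive and send sides respectively:

```haskell
{-# LANGUAGE FlexibleContexts #-}
-- Rough composition of the earlier sketches: each batch is forced before
-- the client asks for the next one, so the request rate tracks the
-- client's actual processing speed rather than a hardcoded server-side
-- rate limit.
selfPacedClient
  :: ( PostBuild t m, MonadHold t m, MonadFix m
     , PerformEvent t m, MonadIO (Performable m) )
  => Event t [LogEntry]                    -- decoded batches from the websocket
  -> m (Dynamic t [LogEntry], Event t ())  -- accumulated log, "send the next batch"
selfPacedClient eBatch = do
  -- Force each batch off the FRP propagation path...
  eForced <- forceProcess id eBatch
  -- ...and only accumulate it, and request the next one, once that is done.
  pullLoop eForced (() <$ eForced)
```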