create helper function to fetch data with concurrent requests #1954
Conversation
Force-pushed from 8c9001f to 326cb11
Do we need the p-all and p-map dependency? In this case, we are dealing with just a simple queue.
I am also a bit curious why you need to limit the number of concurrent requests at all.
@Hyperkid123 thank you for having a look. I think they will be useful to avoid AKAMAI blocks when there are 10k items to be fetched. As you know, if we do not limit concurrent requests, AKAMAI will block them after a short time. Let me know if this is not needed. BTW, I am writing tests.
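For context, a minimal sketch of the concurrency-limiting idea with p-map; the URLs, page count, and concurrency value below are made up for illustration and are not the PR's actual API:

```ts
import pMap from 'p-map';

// Hypothetical list of paginated endpoints to fetch (illustrative only).
const urls = Array.from({ length: 100 }, (_, page) => `/api/items?page=${page + 1}`);

// p-map keeps at most `concurrency` requests in flight at any moment,
// so the CDN/edge (e.g. Akamai) never sees a burst of 100 parallel calls.
const fetchAll = () =>
  pMap(urls, (url) => fetch(url).then((response) => response.json()), {
    concurrency: 5,
  });
```

For what it's worth, p-all bounds concurrency over an array of promise-returning functions while p-map maps over an iterable; either one would cap the number of in-flight requests here.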
Oh, one more question. Why is it that the UI requires 10k records? Rendering anything with 10k items (a table, for example) is never going to end up with good UX unless the view is virtualized.
Yup, this is one of the major issues we face on the UI. When, for example, a customer selects all existing advisories to be remediated, there is too much data to be fetched on the frontend to make it available for the remediation app. Another example is when customers export reports. Maybe I am exaggerating a bit, but I have seen that this does indeed happen sometimes. At the same time, I should mention that no table or UI component deals with so much data on the UI.
The ideal solution for the remediation issue would be to make changes at the architectural level, but that requires a decent amount of effort. Discussions on this are taking place even now.
Yeah, I think the screens should be revisited then. I am not going to block your features, but I'd like the problem to be properly sorted out at some point. These kinds of "optimizations" should never be required. If you need any help, let me know.
Force-pushed from 326cb11 to 6054ba2
@Hyperkid123 One of the main areas where we have been using these hooks or similar solutions is exports, like CSV, JSON, and PDF. Another is batch actions, where we require data that needs to be submitted to an API later on, like a list of all host IDs.
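As a hypothetical illustration of the batch-action case, the same concurrency-limited approach could gather all host IDs before submitting them; the endpoint, response shape, and function name below are assumptions, not the released API:

```ts
import pMap from 'p-map';

// Assumed response shape, for the illustration only.
interface HostPage {
  results: { id: string }[];
}

// Hypothetical: collect every host ID across all pages with a bounded
// number of parallel requests, returning a flat list of IDs that a
// batch action (or a CSV/JSON export) can then use in one go.
const collectAllHostIds = async (total: number, pageSize = 100): Promise<string[]> => {
  const offsets = Array.from({ length: Math.ceil(total / pageSize) }, (_, i) => i * pageSize);
  const pages = await pMap(
    offsets,
    async (offset): Promise<HostPage> => {
      const response = await fetch(`/api/hosts?offset=${offset}&limit=${pageSize}`);
      return response.json();
    },
    { concurrency: 5 }
  );
  return pages.flatMap((page) => page.results.map((host) => host.id));
};
```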
Force-pushed from cfeee38 to 9fce682
@Hyperkid123 just wanted to remind you about this.
@mkholjuraev the CI is still failing. EDIT: Oh, I see. I restarted the job. See if it's happy now or whether we need some adjustment.
@mkholjuraev you did not push the lock file changes, hence the CI failure.
Force-pushed from 9fce682 to 715a242
@Hyperkid123 fixed. PTAL
LGTM. I think we can merge and release it now
Oops, there are some conflicts. Can we resolve them, @mkholjuraev? Otherwise, everything looks good.
Force-pushed from cbcf8a7 to b38e178
@gkarat thanks for having a look. Conflicts are resolved now.
:soon::shipit::octocat:
🌱 🌸 🌷 🌻 🌟 New version of the package has been released 🌟 🌻 🌷 🌸 🌱
The release is available on: 📦 @redhat-cloud-services/frontend-components-utilities/v/4.0.8 📦
💥 This feature is brought to you by probot 🚀
This adds a simple but useful helper function to the utils package. It will help us fetch huge amounts of data using paginated requests. Additionally, those paginated requests can be issued as a limited number of concurrent API calls.
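A rough sketch of what such a helper could look like; the name `fetchBatched`, its options, and the page-fetcher signature are illustrative assumptions, so check the released @redhat-cloud-services/frontend-components-utilities package for the actual API:

```ts
import pMap from 'p-map';

interface FetchBatchedOptions<T> {
  total: number;                                               // total number of records to fetch
  pageSize?: number;                                           // records per paginated request
  concurrency?: number;                                        // max API calls in flight at once
  fetchPage: (offset: number, limit: number) => Promise<T[]>;  // caller-supplied page fetcher
}

// Splits one huge fetch into paginated requests and runs them with a
// bounded number of concurrent API calls, returning the combined result.
const fetchBatched = async <T>({
  total,
  pageSize = 100,
  concurrency = 5,
  fetchPage,
}: FetchBatchedOptions<T>): Promise<T[]> => {
  const offsets = Array.from(
    { length: Math.ceil(total / pageSize) },
    (_, index) => index * pageSize
  );
  const pages = await pMap(offsets, (offset) => fetchPage(offset, pageSize), {
    concurrency,
  });
  return pages.flat();
};
```

A caller would pass its own `fetchPage` wired to the relevant API endpoint, so the helper stays agnostic of the response shape while still bounding how many requests run in parallel.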