Performance of queries that return large number of metrics #1058
Comments
Hello @JarleB, #1010 is already merged into 0.9.x btw.
Gonna mark this closed. @JarleB if you try 0.9.x and find that it doesn't address your issues, please reopen. Thanks!
@JarleB Awesome, thanks for the update.
Hi @obfuscurity. I'm working with the current master branch. Are these two patches merged into the current master branch?
@toni-moreno - not yet. Master is sufficiently different that simply cross-merging won't do it; they need to be rewritten :-(
Lately I've been building some custom graphs that fetch a larger than usual (for me, anyway) subset of metrics in our Graphite cluster. I'm observing a huge difference in the time it takes to retrieve about 1500 metrics depending on whether I go through the frontend Graphite server or directly to the backend Graphite server where I know the same data is located.
An example to illustrate:
What happens is the following:
When the query is sent through the frontend, it hits the metric cache (the same query was run just before, so the cache contains all metric locations) and immediately starts requesting single metrics from the backend server; only once the backend has served all 1678 requests does the frontend return the data to the client.
When querying the backend directly, all the work is done locally, and the same data is returned in less than 1/10 of the time.
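The gap between the two paths above can be illustrated with a toy cost model (a sketch only; the overhead and read-time figures below are assumed for illustration, not measurements from this issue). Paying per-request overhead once per metric dominates when the metric count is large:

```python
# Toy model of the two query paths: N single-metric round trips vs.
# one bulk query. The latency constants are assumptions, not measurements.

PER_REQUEST_OVERHEAD = 0.005   # seconds of connection/HTTP overhead per request (assumed)
PER_METRIC_READ = 0.001        # seconds to read one metric's datapoints (assumed)

def fan_out_time(n_metrics: int) -> float:
    """Frontend path: one backend request per metric, so overhead is paid N times."""
    return n_metrics * (PER_REQUEST_OVERHEAD + PER_METRIC_READ)

def bulk_time(n_metrics: int) -> float:
    """Direct backend path: one request, all metrics read locally."""
    return PER_REQUEST_OVERHEAD + n_metrics * PER_METRIC_READ

n = 1678
print(f"fan-out: {fan_out_time(n):.2f}s  bulk: {bulk_time(n):.2f}s")
```

With these (made-up) constants the fan-out path is several times slower, and the ratio grows with the number of metrics, which matches the order-of-magnitude difference reported above.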
I wonder what the best way to handle this would be. My thinking is that a better approach would be to pass the unmodified URLs (with their regexes) to all backends, then collect and merge the returned data.
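The proposed approach can be sketched as follows. This is a minimal illustration of the idea, not graphite-web's actual API: the backends are stood in for by plain dicts, and the function names and the prefix-based "pattern" matching are assumptions made for the sketch. The key point is one request per backend instead of one request per metric:

```python
# Sketch of the proposal above: forward the original (unmodified) target
# query to every backend, then merge the returned series by name.
# Dicts stand in for remote backends; all names here are hypothetical.

from collections import defaultdict

def query_backend(backend: dict, pattern_prefix: str) -> dict:
    """Stand-in for sending the unmodified target expression to one backend."""
    return {name: points for name, points in backend.items()
            if name.startswith(pattern_prefix)}

def fetch_merged(backends: list, pattern_prefix: str) -> dict:
    """One request per backend (not per metric); merge results by series name."""
    merged = defaultdict(list)
    for backend in backends:
        for name, points in query_backend(backend, pattern_prefix).items():
            merged[name].extend(points)
    return dict(merged)

# Two hypothetical backends holding overlapping sets of series.
backend_a = {"servers.web1.cpu": [1, 2], "servers.web2.cpu": [3]}
backend_b = {"servers.web2.cpu": [4], "servers.db1.cpu": [5]}
print(fetch_merged([backend_a, backend_b], "servers."))
```

A real implementation would also have to deal with time-aligning and deduplicating datapoints for series that exist on more than one backend, which the naive `extend` above glosses over.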