
PerspectiveAPI Quota limit #4

Open
sauc-abadal opened this issue Oct 16, 2023 · 0 comments
sauc-abadal commented Oct 16, 2023

Hi, I am trying to replicate some of the results, but I am constantly exceeding the Perspective API quota limit. I noticed that you set rate_limit to 135, but my granted quota allows a maximum of 60 queries per minute (1 QPS), so I changed the rate_limit parameter to 60.

However, when monitoring the responses to the requests (both in the Google Cloud Platform console and in the code), I see many HTTP 429 errors indicating that I am exceeding my quota, so many LLM responses end up with "null" toxicity scores.
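As a workaround for the 429s, a retry with exponential backoff before recording a null score might help. The sketch below is hypothetical: `request_fn` stands in for whatever callable issues one Perspective API query in your setup, and `QuotaExceeded` is a stand-in for however the client surfaces a 429 response.

```python
import random
import time


class QuotaExceeded(Exception):
    """Stand-in for the HTTP 429 error surfaced by the API client."""


def score_with_retry(request_fn, max_retries=5, base_delay=1.0, sleep=time.sleep):
    """Call request_fn, retrying on 429 with exponential backoff and jitter.

    Returns the toxicity score, or None if every attempt hit the quota
    (matching the "null" scores described above).
    """
    for attempt in range(max_retries):
        try:
            return request_fn()
        except QuotaExceeded:
            # Full jitter: wait between 0 and base_delay * 2**attempt seconds
            # so concurrent workers don't retry in lockstep.
            sleep(random.uniform(0, base_delay * 2 ** attempt))
    return None  # give up; caller records a null toxicity score
```

This only masks the symptom; if each element of a batch counts against the quota (see the question below), the real fix is lowering the effective per-element rate.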

How did you manage to stay below the quota limit? Did you request an increase? If so, what rate did you request?

Also, if I understand correctly, the code limits the rate to 1 batch request per second (i.e., 60 batch requests per minute), where each batch comprises up to rate_limit responses whose toxicity scores need to be computed. Does each batch count as a single HTTP request against the quota, or does each element in the batch count as an individual request?
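If each element does count individually, the pacing I have in mind would look like the minimal sketch below: space out calls so at most quota_per_minute go out per minute. The clock and sleep parameters are only there so the pacing can be tested without real waiting; they are my own addition, not part of the repository's code.

```python
import time


class RateLimiter:
    """Space requests so at most quota_per_minute are issued per minute.

    A minimal sketch, assuming the quota counts each text analyzed
    (not each batch) as one request.
    """

    def __init__(self, quota_per_minute=60, clock=time.monotonic, sleep=time.sleep):
        self.interval = 60.0 / quota_per_minute  # seconds between requests
        self.clock = clock
        self.sleep = sleep
        self.next_slot = clock()  # earliest time the next request may go out

    def wait(self):
        """Block until the next request slot opens; return the delay slept."""
        now = self.clock()
        delay = max(0.0, self.next_slot - now)
        if delay:
            self.sleep(delay)
        self.next_slot = max(now, self.next_slot) + self.interval
        return delay
```

Calling `wait()` before every individual query (rather than once per batch) would keep the per-element rate at 1 QPS, at the cost of throughput.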

Many thanks in advance,
S.
