Hi,

Every search I perform against my local instance results in "too many requests" for the Google engine.
Relevant logs:
2025-01-14 01:51:45,267 WARNING:searx.network.google: HTTP Request failed: GET https://www.google.com/sorry/index?continue=https://www.google.com/search%3Fq%3Draducanu%26hl%3Den-US%26lr%3Dlang_en%26cr%3DcountryUS%26ie%3Dutf8%26oe%3Dutf8%26filter%3D0%26start%3D0%26asearch%3Darc%26async%3Duse_ac%253Atrue%252C_fmt%253Aprog&hl=en-US&q=EgR1eAkTGLCGl7wGIjBjIll3oMMeGexN4FK7RLulYKB8hWXjKMh2-ygGDcMnTI67t2lHJ3QeMeGFhibKBysyAXJKGVNPUlJZX0FCVVNJVkVfTkVUX01FU1NBR0VaAUM
2025-01-14 01:51:45,336 WARNING:searx.engines.google: ErrorContext('searx/search/processors/online.py', 116, "response = req(params['url'], **request_args)", 'searx.exceptions.SearxEngineTooManyRequestsException', None, ('Too many request',)) False
2025-01-14 01:51:45,336 ERROR:searx.engines.google: Too many requests
Traceback (most recent call last):
  File "/usr/local/searxng/searx/search/processors/online.py", line 160, in search
    search_results = self._search_basic(query, params)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/searxng/searx/search/processors/online.py", line 144, in _search_basic
    response = self._send_http_request(params)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/searxng/searx/search/processors/online.py", line 116, in _send_http_request
    response = req(params['url'], **request_args)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/searxng/searx/network/__init__.py", line 164, in get
    return request('get', url, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/searxng/searx/network/__init__.py", line 95, in request
    return future.result(timeout)
           ^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.12/concurrent/futures/_base.py", line 456, in result
    return self.__get_result()
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result
    raise self._exception
  File "/usr/local/searxng/searx/network/network.py", line 291, in request
    return await self.call_client(False, method, url, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/searxng/searx/network/network.py", line 274, in call_client
    return self.patch_response(response, do_raise_for_httperror)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/searxng/searx/network/network.py", line 245, in patch_response
    raise_for_httperror(response)
  File "/usr/local/searxng/searx/network/raise_for_httperror.py", line 76, in raise_for_httperror
    raise SearxEngineTooManyRequestsException()
searx.exceptions.SearxEngineTooManyRequestsException: Too many request, suspended_time=3600
Interestingly, the URL in the log looks like a redirect URL rather than the original request; I'm not sure whether that's a bug. If I copy the URL as-is into my browser, I get the expected Google "sorry" page.
If I try only the first part of the URL shown on the "sorry" page, the search works; trying the whole thing also succeeds, but it downloads a text file in a format I don't understand.
What might be wrong here? I've tried turning the limiter off just in case, but I get the same behavior.
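To rule out SearXNG itself, I also tried reproducing the request with plain httpx (the HTTP client SearXNG is built on). This is just a rough probe, with a stand-in query and User-Agent rather than the exact parameters SearXNG sends:

```python
# Rough probe: ask Google directly and see whether this machine's exit IP
# gets a normal result page or the /sorry/ interstitial. The query and
# User-Agent below are stand-ins, not what SearXNG actually sends.
import httpx

resp = httpx.get(
    "https://www.google.com/search",
    params={"q": "raducanu", "hl": "en-US"},
    headers={"User-Agent": "Mozilla/5.0 (X11; Linux x86_64; rv:121.0) Gecko/20100101 Firefox/121.0"},
    follow_redirects=True,
    timeout=10.0,
)
print(resp.status_code, resp.url)
if resp.status_code == 429 or "/sorry/" in str(resp.url):
    print("exit IP looks rate-limited/flagged by Google")
```

If this probe also bounces to /sorry/, the problem would seem to be the network path rather than anything SearXNG does.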
OK, I figured out what's happening. I'm running this on my NAS, and it automatically routes through my VPN. If I try hitting Google locally with the VPN enabled, I'm blocked the same way.
If I run SearXNG on my dev machine with the VPN disabled, I do get Google results.
I tried switching my VPN to a different server and then Google loaded, so it might just be that the server closest to me has a tainted reputation...? Perhaps a silly question, and I know this isn't really a SearXNG issue, but any suggestions on what to do here? Is my only option to use a different VPN server (or to disable the VPN altogether, which I won't do)?
The vast majority of my VPN's servers are marked as malicious by Google 😢 but I found a couple that are not. I switched my NAS to use one of those (even though it adds quite a bit of latency), and now the Google engine works as expected.
I'm honestly wondering if I should switch VPN providers at this point, but perhaps this is a common problem across all providers. If anyone has any thoughts, I'm eager to hear them.
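For anyone hitting the same thing, this is roughly the check I reran after each server switch. The helper name is my own invention; it only distinguishes a normal 200 result page from a 429 or a /sorry/ bounce:

```python
# Quick check to rerun after each VPN server switch: "flagged" means Google
# answered 429 or redirected to its /sorry/ CAPTCHA page, "clean" means a
# normal result page came back. The function name is made up for this sketch.
import httpx

def google_exit_status(timeout: float = 10.0) -> str:
    resp = httpx.get(
        "https://www.google.com/search",
        params={"q": "test"},
        headers={"User-Agent": "Mozilla/5.0"},
        follow_redirects=True,
        timeout=timeout,
    )
    if resp.status_code == 429 or "/sorry/" in str(resp.url):
        return "flagged"
    return "clean" if resp.status_code == 200 else f"unexpected ({resp.status_code})"

print(google_exit_status())
```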