[Scalability] 500+ concurrent SSH proxies? #634
-
This package looks great. I am looking for a way to proxy incoming connections to many (say, up to 500) tunneled remote hosts on different ports. Can `proxy.py` handle that many connections without spawning 500 separate instances? Probably by mapping local 9xxx ports to tunneled 7xxx ports, or something similar. Thanks!
Replies: 1 comment 3 replies
-
Last benchmark for `proxy.py` says it can pull off 16k connections per second. With real-world data served and upstream proxies, throughput will depend upon the latency of the upstream connection stream, but we should still be able to achieve multiple thousands of connections per second. So for 500 clients, unless they all start making 20-30 requests per second concurrently, a single `proxy.py` instance can serve them well.

Talking about SSH proxies, there is inbuilt support for that. I remember it used to work end-to-end when I tested it last, but it wasn't well polished/integrated/documented due to `mypy` typings. If we ignore the typing warnings, SSH proxies worked fine IIRC. `proxy.py` then acts like a local `ngrok` serve…

Let me know how it goes. Happy to jump back into it if necessary.
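As a back-of-envelope check of that estimate, using the numbers from the thread (a ~16k connections/second benchmark, 500 clients at a worst case of 20-30 requests/second each):

```python
# Numbers taken from the discussion above; this is just the arithmetic
# behind the "a single instance can serve them well" claim.
benchmark_cps = 16_000          # benchmarked connections per second
clients = 500
worst_case_rps_per_client = 30  # upper end of the 20-30 req/s range

aggregate_rps = clients * worst_case_rps_per_client  # 15_000
headroom = benchmark_cps - aggregate_rps             # 1_000
```

So even at the worst case, aggregate demand stays just under the benchmarked ceiling, which is why a single instance is expected to cope below that level of concurrency.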