How do you want to prevent spamming the server list with fake servers? I added the API key method precisely because of that. I think some sort of authentication is required, or maybe you could share a secret between the game server and the master server, but that would require obfuscation.
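A shared-secret approach could look roughly like this. Purely a sketch: the announcement shape and secret handling are placeholders, and the secret would have to be obfuscated in the shipped client.

```ts
// Sketch only: the game server signs its announcement with a secret it
// shares with the master server, instead of using a per-server API key.
import { createHmac } from 'crypto'

interface ServerAnnouncement { // placeholder shape
  name: string
  port: number
  players: number
}

const SHARED_SECRET = 'would-be-obfuscated-in-the-client' // placeholder

function signAnnouncement (announcement: ServerAnnouncement): string {
  return createHmac('sha256', SHARED_SECRET)
    .update(JSON.stringify(announcement))
    .digest('hex')
}

// The master server recomputes the HMAC with the same secret and drops
// announcements whose signature doesn't match.
```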
The keep-alive check in v2 is done by having the game server send its server information to the master server every 10s. It doesn't require a lot of resources, not even on the master server, so splitting the information into basic information + detailed information might be a bit over-engineered. You would also lose extended filtering capabilities on the client side, because the client would have to gather information from all the individual servers. Imagine you wanted to add a filter like "only display servers with at least x players": that could no longer be done with a single request.
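For example, with the full info on the master server, a filter like that is one request plus a local filter. Sketch only, endpoint and field names are placeholders (assumes Node 18+ global fetch):

```ts
interface ServerEntry { // placeholder fields
  name: string
  players: number
  maxPlayers: number
}

async function listServers (minPlayers: number): Promise<ServerEntry[]> {
  const res = await fetch('https://master.example.net/api/servers') // placeholder URL
  const servers = await res.json() as ServerEntry[]
  // One request is enough, because the master already has current player
  // counts from the 10s keep alives.
  return servers.filter(server => server.players >= minPlayers)
}
```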
For 3.0 I'd like to make the whole server thing more peer-to-peer. When you start a game it automatically starts a server, and if the session is set to 'public' it automatically port forwards via UPnP. All public sessions should appear in the server browser. With the need for API keys this wouldn't really work. Maybe setting a maximum number of sessions per IP would mitigate fake servers.
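The per-IP cap could be as simple as something like this on the master server (the limit of 4 is an arbitrary placeholder):

```ts
// Sketch of a per-IP session cap: count announced sessions per source
// address and reject registrations above the limit.
const MAX_SESSIONS_PER_IP = 4

const sessionsPerIp = new Map<string, number>()

function tryRegisterSession (ip: string): boolean {
  const current = sessionsPerIp.get(ip) ?? 0
  if (current >= MAX_SESSIONS_PER_IP) {
    return false // reject: too many sessions announced from this address
  }
  sessionsPerIp.set(ip, current + 1)
  return true
}

function removeSession (ip: string): void {
  const current = sessionsPerIp.get(ip) ?? 0
  if (current > 0) sessionsPerIp.set(ip, current - 1)
}
```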
Because the server list would potentially contain a lot more entries (hosting your own public session becomes much easier), I think it would be a good idea to change a few things to lower the bandwidth and memory requirements.
The master server doesn't need to get bombarded with 1 kB per active session every 10 seconds. By reversing the roles and having the master server request those keep alives, it can do so at its own pace and perform the port check at the same time.
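Something along these lines, as a sketch; the plain TCP check and the 3s timeout are assumptions, not how it has to be done:

```ts
import { connect } from 'net'

interface Session { // placeholder shape
  ip: string
  port: number
}

// Try to open a connection to the announced address; this both confirms the
// session is still alive and doubles as the port check.
function checkSession (session: Session): Promise<boolean> {
  return new Promise(resolve => {
    const socket = connect({ host: session.ip, port: session.port, timeout: 3000 })
    socket.once('connect', () => { socket.destroy(); resolve(true) })
    socket.once('timeout', () => { socket.destroy(); resolve(false) })
    socket.once('error', () => resolve(false))
  })
}

// The master server walks its session list at its own pace instead of
// being pushed to every 10 seconds.
async function pollSessions (sessions: Session[]): Promise<void> {
  for (const session of sessions) {
    const alive = await checkSession(session)
    if (!alive) {
      // drop the session from the list here
    }
  }
}
```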
I'm not set on splitting the information up. But if the master server is getting overwhelmed, splitting the info up could reduce the bandwidth requirement from ~100 B/s to ~1 B/s per session. The client and the website can still display that information by requesting it from the game servers directly and filtering on the client side. That obviously doesn't sound like a lot, but it only covers managing active sessions. You need additional upload bandwidth to serve the list to clients and the website, which multiplies the combined keep-alive bandwidth by the number of requests. So 100 people requesting a list of 100 servers would be 100 * 100 B/s + 100 * 100 * 100 B/s = 1.01 MB/s with the full info, versus 100 * 1 B/s + 100 * 100 * 1 B/s = 10.1 kB/s with only the basic info.
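Rough sketch of the split: the master only keeps a tiny record per session, clients fetch the details from the game servers directly and filter locally. Type names and the /info endpoint are placeholders (assumes Node 18+ global fetch):

```ts
interface BasicInfo {    // held by the master server, a few bytes per session
  ip: string
  port: number
}

interface DetailedInfo { // served by each game server itself
  name: string
  players: number
  maxPlayers: number
}

async function fetchDetails (entry: BasicInfo): Promise<DetailedInfo> {
  const res = await fetch(`http://${entry.ip}:${entry.port}/info`) // placeholder endpoint
  return await res.json() as DetailedInfo
}

async function serversWithMinPlayers (basicList: BasicInfo[], minPlayers: number): Promise<DetailedInfo[]> {
  const details = await Promise.all(basicList.map(fetchDetails))
  return details.filter(info => info.players >= minPlayers)
}
```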
A server list similar to the one in use by Net64 2.0. Should / could be changed to: