Forum Discussion

DoYaSeeMe
5 years ago

Tickrate Stuff

I think it's kind of safe to say that the tickrate for Apex is around 60Hz. Clients send their updates 60 times per second, while servers send game data up to 60 times per second. 
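
A minimal sketch of what those numbers mean in practice (the 60Hz figure is this post's estimate, not an official one): at 60 ticks per second, every tick has roughly 16.7 ms of budget.

    CLIENT_UPDATE_RATE_HZ = 60   # client -> server input updates (this post's estimate)
    SERVER_SEND_RATE_HZ = 60     # server -> client data, "up to" this rate (estimate)

    tick_interval_ms = 1000 / SERVER_SEND_RATE_HZ
    print(f"Per-tick budget at {SERVER_SEND_RATE_HZ}Hz: {tick_interval_ms:.1f} ms")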

"Wait, but everyone says that the potato servers are at 20Hz, why are you lying?!?" 20Hz, or 20 times per second is the minimum limit for the server to send an entire game update to the clients. The thing is, game updates are big for Apex, so they need to be split into multiple packets and sent throughout multiple ticks.

"Raise it to 120 - 28Hz, what are you waiting for?!?" Doubling the data rate simply requires twice the bandwidth and probably twice as many servers. This doesn't just double the bill, it also makes the game more unstable, with more packets dropped or lost. All this while the majority of players don't actually have devices that support more than 60Hz (some may even be at 30Hz).

"Ok then, reduce the data, make an update fit into a single packet.or at least twice as few as it is now" I am pretty sure that devs continuously work on optimizing this, but we are asking for a 2x-5x reduction, while the game keeps adding stuff. The data that's being sent is also pretty basic: coordinates, action id's, animation states,flags. Nothing that can be cut without getting serious issues like players teleporting or stuff vanishing. Using a very powerful compression can probably make that data smaller, but all that gain gets lost an then some, as both the server and the clients need time & power to compress and then decompress.

"Damn, you're so negative! They should increase the packet size then, bam!" I believe the size limitations are set to meet the network standards. There are many tons of old and low quality hardware across the millions of miles of wire, they are to blame. Upgrading all these is probably a matter of trillions of $.

Disambiguation

As the gaming world evolves, there is more and more confusion surrounding tick rate. In the past, a full game state fit inside a single tick, and the frequency was low enough that most machines had a decent interval to process a frame, so it didn't matter whether someone said update rate, snapshot rate, simulation rate, server frame rate or tick rate.

In recent years, more complex multiplayer games appeared and, as a consequence, the amount of data making up a game state grew beyond what can be sent within a single tick. In parallel, hardware evolved and pushed frame rates well beyond data rates. Developers addressed this in various ways: they pushed tick rates higher, decoupled frame rates from tick rates, split game states across multiple ticks, and deviated from or changed their engine's state machines so that critical parts are processed faster, with priority. As a result, tick rate is rarely equivalent to game state update rate or server snapshot rate nowadays, which is what players are usually asking for, and in certain games raising it would bring no noticeable improvement, would endanger stability, or is simply impossible (it is already at the maximum the clients or the server support).
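
A minimal sketch (not Source/Apex code, all rates assumed) of why these numbers can legitimately differ: the server below simulates at one frequency but only emits a full snapshot every few ticks, so the "tick rate" and the "full update rate" are two different values.

    import time

    SIM_RATE_HZ = 60        # simulation / tick rate, assumed
    SNAPSHOT_RATE_HZ = 20   # complete game-state sends, assumed
    TICKS_PER_SNAPSHOT = SIM_RATE_HZ // SNAPSHOT_RATE_HZ

    def simulate(dt):
        pass  # placeholder for advancing the game simulation by dt seconds

    def send_snapshot():
        pass  # placeholder for serializing and sending the full game state

    def server_loop(run_seconds=1.0):
        tick_dt = 1.0 / SIM_RATE_HZ
        deadline = time.perf_counter() + run_seconds
        tick = 0
        while time.perf_counter() < deadline:
            simulate(tick_dt)                 # every tick
            if tick % TICKS_PER_SNAPSHOT == 0:
                send_snapshot()               # only every third tick here
            tick += 1
            time.sleep(tick_dt)

    server_loop()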
