Hosting high-performance game servers can be a delicate balancing act. Game servers are expected to deliver a high-quality, low-latency gaming experience to remote players over the internet.
So, how do you host game servers properly?
First, let’s take a quick look at the differences between a game server and a dedicated server.
You may have heard people discussing the hardware specifications of their game server. But when it comes to the game server CPU or server RAM for gaming, they are really talking about the internal components of the dedicated server used to run their games.
Contrary to popular belief, a game server is actually the software layer installed onto a dedicated server. More specifically, it is an application instance that runs the server side of a multiplayer game.
A dedicated server describes the physical hardware. This is the bare metal machine that usually sits in a hosted data center, ready for the installation of virtual machines or software.
Game server = software
Dedicated server = hardware
Understanding this fundamental difference will help you better choose the game server hosting option which works best for you.
Ever noticed how some games are platform specific? They can only be played on a specific console or operating system? (It’s a topic that some hardcore gamers can get pretty worked up about.) Well, the same is true of online game servers that are optimized for a specific platform – it keeps them fast and efficient.
However, there’s a fundamental problem with most game servers – their core code architecture is designed around a single, super-fast CPU core doing all the work. As ‘single-threaded’ applications, they require a very high frequency server processor to run the game properly without hampering performance.
To understand why this is such a big problem, we’ll explore how CPU design has outpaced game server architecture and why it matters.
For many years, server performance was boosted by simply increasing the frequency, or ‘clock speed’, of the CPU. By raising the clock speed, chip manufacturers were able to significantly boost performance, from hertz (Hz) in the earliest computers to the gigahertz (GHz) we now expect as standard.
But demand for speed outstripped manufacturers’ ability to keep making faster server CPUs at scale. They also faced a serious problem with heat and energy use – ultra-fast processors draw a lot of electricity and need extreme cooling solutions to maintain server stability. Simply increasing CPU clock speed was no longer a viable option.
The solution was to increase the number of cores inside each CPU, allowing more computation to take place simultaneously. By adding more cores, CPU clock speed becomes slightly less important in terms of server performance.
However, these multi-core CPUs only work effectively when applications have been properly re-architected for ‘multi-threaded’ operations. Many business applications have been re-coded to support multi-threading – but most game servers have not.
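The difference can be sketched with a toy game loop. A single-threaded server does all of its simulation work on one core, so only a faster clock (not more cores) shortens each tick. The tick rate, state shape and timings below are illustrative, not any real engine’s:

```python
import os
import time

def tick(state):
    # Advance the simulation one step: physics, AI, player input.
    # In a single-threaded server, ALL of this runs on one core.
    state["tick"] += 1
    return state

def run_server(max_ticks, tick_rate=60):
    """A minimal single-threaded game loop: one core does everything."""
    state = {"tick": 0}
    frame_budget = 1.0 / tick_rate  # ~16.6 ms per tick at 60 Hz
    for _ in range(max_ticks):
        start = time.perf_counter()
        state = tick(state)
        elapsed = time.perf_counter() - start
        # If tick() exceeds the frame budget, players see lag -- and
        # only a faster core (a higher clock speed) can fix that.
        time.sleep(max(0.0, frame_budget - elapsed))
    return state

# However many cores this machine has, the loop above uses one of them.
print(os.cpu_count())
```

Unless the engine is re-architected to split that tick work across threads, the extra cores of a modern server CPU sit idle.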
Which makes choosing the best server CPU for gaming that little bit harder.
For many reasons (such as power requirements, server cooling provisions and hardware costs), most servers hosted in data centers now use multi-core CPUs, such as Intel’s Xeon Scalable processor line, with relatively low clock speeds, typically 2.1 GHz to 2.8 GHz. These systems are typically used in virtualized infrastructure, allowing operators to pool CPU, RAM and network connectivity to improve performance.
Clearly this is unsuitable for single-threaded, power-hungry games, which tend to require a game server CPU running at 3.5 GHz or higher.
To help solve this problem, Intel has developed the Xeon E-series of high performance processors. These Intel Xeon server CPUs are capable of delivering burst speeds of up to 5.1 GHz across eight cores – making them one of the best server CPU lineups for gaming.
“But if game servers are engineered for single thread operations, aren’t I paying for another seven CPU cores that will never be used?”
The good news is that no, you’re not.
A single bare-metal machine with an Intel Xeon E-series CPU is capable of running multiple instances of a game server. In theory, you could run an unlimited number of game server instances on the same dedicated server to ensure you’re not paying for resources that go unused.
This will of course depend on the application’s efficiency, its RAM requirements, and the storage and network configuration of the virtual and physical machines. You’ll also need to consider how many simultaneous players you hope to serve and the size of the map or world used by the game.
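Running several single-threaded instances side by side is what soaks up those extra cores. As a rough sketch (the stand-in one-liner and the port numbering scheme are illustrative; a real deployment would exec the game’s actual server binary):

```python
import subprocess
import sys

def launch_instance(instance_id, port):
    """Start one (stand-in) game server instance as its own OS process.

    A real deployment would exec the game's server binary here; we use a
    tiny Python one-liner as a placeholder so the sketch is runnable."""
    return subprocess.Popen(
        [sys.executable, "-c",
         f"print('instance {instance_id} on port {port}')"],
        stdout=subprocess.PIPE, text=True,
    )

def launch_fleet(n_instances, base_port=27015):
    """One single-threaded instance per core keeps every core busy.

    Each instance gets its own port (base_port is just an example)."""
    procs = [launch_instance(i, base_port + i) for i in range(n_instances)]
    # Wait for each process and collect its output.
    return [p.communicate()[0].strip() for p in procs]
```

Because each instance is a separate OS process, the operating system’s scheduler can spread them across cores without the game itself ever being multi-threaded.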
A good game server hosting company will allow game studios to test hardware during their development phase to optimize hardware resource usage. This ensures studios get the maximum return on their hosted game server investment – and that their players have the best experience too. (You can talk to our team about development phase hardware testing here.)
When choosing a server for gaming, CPU tends to be the highest priority. However, server RAM for gaming is also important – will the dedicated machine have sufficient memory to run the application adequately?
CPU and RAM tend to correlate directly – hosted game servers typically maintain a ratio between the number of CPU cores and the RAM allocated to each. When renting a bare-metal server with a high-power Intel Xeon E-series CPU, you’ll probably want to run more game server instances simultaneously, requiring more RAM in the process.
This means that if you double the capability of your CPU, you’ll probably need to double your server RAM for gaming too. But as we said, the only way to be sure is to test configurations to ensure you are getting the best performance for your investment.
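The arithmetic behind that rule of thumb is simple. The 4 GB-per-instance figure below is purely illustrative – real memory needs vary by game, map size and player count, which is exactly why testing matters:

```python
def ram_needed_gb(cores, instances_per_core=1, gb_per_instance=4):
    """Rough sizing rule: RAM scales with the number of server
    instances, and the number of instances scales with core count.
    All three parameters are illustrative assumptions, not benchmarks."""
    return cores * instances_per_core * gb_per_instance

# Doubling the cores doubles the instances you can run,
# so the RAM requirement doubles too:
print(ram_needed_gb(8))   # 8 cores, 1 instance each, 4 GB each -> 32
print(ram_needed_gb(16))  # 16 cores -> 64
```

Swap in your own measured per-instance memory footprint and the same formula gives a first-pass RAM budget for any core count.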
If you’re weighing up your game server hosting options, researching the best server CPU for gaming, or want to arrange a test for your own game server application, please get in touch with us.