
How to host game servers

Hosting a game server can be a delicate balancing act. Game servers are expected to deliver a high-quality, low-latency gaming experience to remote players over the internet.

In this blog, we discuss how to host game servers properly and the most important considerations when choosing one.

Game servers vs dedicated servers

It’s a common misconception that a ‘game server’ is a piece of physical server hardware. For example, you may hear people discussing the hardware specifications of their game server (CPU, RAM, etc.). But what they’re really talking about are the internal components of the dedicated server used to run their game server software.

In fact, a game server is the software layer installed onto a dedicated server. More specifically, it is an application instance which runs the server side of a multiplayer game. The dedicated server is the physical hardware: the bare metal machine that usually sits in a hosted data center, ready for the installation of virtual machines or software.

Understanding this fundamental difference will help you choose the best game server hosting option.


Ever noticed how some types of gaming are platform specific? Some games can only be played on a specific console or operating system. This is also true of online game servers: many are optimized for a specific platform, which helps keep them fast and efficient.

However, there’s a fundamental problem with most game servers – their core code architecture is designed around a single, super-fast CPU core doing all the work. As ‘single-threaded’ applications, they require a very high-frequency server processor to run the game without hampering performance.
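
To make that concrete, here is a minimal sketch (in Python, purely for illustration) of the tick loop at the heart of a typical single-threaded game server. All simulation work happens on one core, so each tick must finish within its time budget – which is why raw clock speed matters so much. The 64-tick rate and the empty `simulate` placeholder are assumptions, not any particular engine’s code.

```python
import time

TICK_RATE = 64            # ticks per second; a common value, assumed here
TICK_LEN = 1.0 / TICK_RATE

def run_server(ticks=3, simulate=lambda: None):
    """Minimal single-threaded game loop sketch.

    Everything (physics, AI, player input) runs on one core, so
    finishing each tick within TICK_LEN depends on clock speed.
    """
    for _ in range(ticks):
        start = time.monotonic()
        simulate()                                  # do this tick's work
        elapsed = time.monotonic() - start
        time.sleep(max(0.0, TICK_LEN - elapsed))    # hold the tick rate

run_server()
```

If `simulate()` ever takes longer than `TICK_LEN`, the loop falls behind and players see lag – a faster core is the only fix, because no second core can help.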

To understand why this is such a big problem, we need to explore how CPU design has outpaced game server architecture and why it matters.

Game server CPUs – a brief history

For many years, server performance was boosted by simply increasing the frequency, or ‘clock speed’, of the CPU. By raising the clock speed, chip manufacturers were able to significantly boost performance, from hertz (Hz) in the earliest computers to the gigahertz (GHz) we now expect as standard.

But demand for speed eventually outstripped manufacturers’ ability to produce faster server CPUs at scale. They also faced a serious problem with heat and energy use – ultra-fast processors draw a lot of electricity and need extreme cooling solutions to maintain server stability. Simply increasing CPU clock speed was no longer a viable option.

The solution was to increase the number of cores inside each CPU, allowing more computation to take place simultaneously. By adding more cores, CPU clock speed becomes slightly less important in terms of server performance.

However, these multi-core CPUs only work effectively when applications have been properly re-architected for ‘multi-threaded’ operations. Many business applications have been re-coded to support multi-threading – but most game servers have not.

Which makes choosing the best server CPU for gaming that little bit harder.

What is the best game server CPU?

For many reasons (such as power requirements, server cooling provisions and hardware costs), most servers hosted in data centers today use multi-core CPUs, such as the Intel Xeon Scalable processor line, with relatively low clock speeds (typically 2.1GHz to 2.8GHz). These systems are usually used in virtualized infrastructure environments, allowing operators to pool CPU, RAM and network connectivity to improve performance.

However, this setup is unsuitable for single-threaded, power-hungry games, which tend to require a game server CPU running at 3.5GHz or higher. To help solve this problem, Intel developed the Xeon E-series of high-performance processors. These Intel Xeon server CPUs are capable of delivering burst speeds of up to 5.1GHz across eight cores – making them one of the best server CPU lineups for gaming.

How can I get maximum value from my game server?

“But if game servers are engineered for single thread operations, aren’t I paying for another seven CPU cores that will never be used?”

The good news is that no, you’re not.

A single, bare-metal machine with an Intel Xeon E-series CPU is capable of running multiple game server instances. In theory, you could run an unlimited number of game servers on the same dedicated server to ensure you’re not paying for resources that go unused. 

This will of course depend on application efficiency, server RAM requirements, storage and the network configuration of the virtual and physical machines. You’ll also need to consider how many simultaneous players you hope to serve, and the size of the map or world used by the game.
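
One common way to use every core is to pin one single-threaded instance to each of them. The sketch below is a dry run that only prints the launch commands it would use with Linux’s `taskset`; the `./game_server` binary, its `--port` flag and the base port number are hypothetical placeholders, not a real product’s interface.

```python
import os

def plan_instances(cores, base_port=27015):
    """Plan one pinned, single-threaded instance per CPU core.

    "./game_server" and "--port" are hypothetical placeholders;
    taskset -c <n> restricts a process to core n.
    """
    return [f"taskset -c {i} ./game_server --port {base_port + i}"
            for i in range(cores)]

# Dry run: print the commands instead of launching anything.
for cmd in plan_instances(os.cpu_count() or 1):
    print(cmd)
```

Each instance gets its own core and its own port, so an eight-core machine can serve eight separate game sessions without the instances competing for a single thread.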

A good game server hosting company will allow game studios to test hardware during their development phase to optimize hardware resource usage. This ensures studios get the maximum return on their investment – and that their players have the best experience too.

Why is server RAM important for gaming?

When hosting a game server, CPU tends to be the highest priority. However, server RAM is also important: it determines whether the dedicated machine has enough memory to run every application instance smoothly.

CPU and RAM tend to have a direct correlation. Hosted game servers typically maintain a ratio between the number of CPU cores and the RAM allocated to each. When renting a bare metal server with a high-power Intel Xeon E-series CPU, you’ll probably want to run more game server instances simultaneously, requiring more RAM in the process.

This means that if you double the capability of your CPU, you’ll probably need to double the server RAM too. But like we said, the only way to be sure is to work with your bare metal hosting provider to run test configurations and ensure you are getting the best performance from your game servers.
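
As a back-of-the-envelope illustration of that ratio, the sketch below assumes a hypothetical 2GB of RAM per game server instance plus a few gigabytes of OS overhead; the real figures should come from the test configurations you run with your provider.

```python
def required_ram_gb(instances, ram_per_instance_gb=2, os_overhead_gb=4):
    """Total server RAM needed at a fixed RAM-per-instance ratio.

    The per-instance and overhead figures are assumptions to be
    measured during testing, not universal constants.
    """
    return instances * ram_per_instance_gb + os_overhead_gb

# Doubling the instance count roughly doubles the RAM requirement:
print(required_ram_gb(4))  # prints 12
print(required_ram_gb(8))  # prints 20
```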
