iGaming platforms are growing, and online gambling options are diversifying. Fueled by increased legalization, access to mobile devices, and secure online payment gateways, the industry has come a long way from the humble online poker game. Sports betting has quickly risen to prominence and now enjoys a 49% market share. All the while, new and emerging online gambling experiences continue to expand.
And whilst users’ experiences on iGaming platforms might be getting more seamless, the work that goes on behind the scenes is increasingly complex. Like a duck’s feet beneath the water, behind every iGaming application there’s a whole lot of paddling going on.
And most of that paddling takes place in a database, where odds must be constantly updated and every transaction, wager, or bet stored, analyzed, and retrieved.
In this blog post, we take a look behind the scenes to find out what databases are, why they’re critical to the iGaming industry, and the key infrastructure factors platforms need to consider.
Simply put, a database is an organized store of information. Database software is the program used to manage, update, and extract this information. The software is hosted on a business’s underlying hardware, and together the two form a ‘database server’.
For iGaming platforms, the database is central to critical processes like financial transactions, query processing, user profiling, in-app messaging, logins, and managing connected devices. Online gambling platforms rely heavily on these databases for the delivery of seamless player experiences. For bets to be made, odds to be calculated, and transactions to be processed in real time, the database lies at the epicenter.
The first computerized databases emerged in the 1960s. These early databases took two forms: hierarchical database models (where data is organized in a parent-child structure supporting one-to-one and one-to-many relationships) and network database models (a more flexible iteration of the hierarchical model supporting many-to-many relationships).
By the 1970s, the relational database was born. This model would come to define the standardized principle for database systems and by the 1980s, relational databases saw mainstream commercial uptake with Structured Query Language (SQL) becoming the standard. A decade later in 1998 the first non-relational database, Carlo Strozzi’s ‘Strozzi NoSQL’ database, was born. Since then, database technology has continued to evolve, with emerging players in the NoSQL and accessible database space continuing to innovate.
There are various types of database applications that organizations can use, and different industries tend to gravitate towards different database technologies (or combinations thereof) depending on their process needs.
The main distinction lies between relational and non-relational databases, but there is now also a third type, distributed SQL, which combines attributes of both. For context, SQL (Structured Query Language) is the programming language used to query relational databases.
Relational databases, or SQL databases like MySQL, organize data into tables (a bit like an Excel spreadsheet) with all data points related to each other. Each row holds a record with a unique identifier (the key), each column holds a data attribute, and each field holds a value.
Relational databases provide high levels of consistency and serializability and hold ACID (Atomicity, Consistency, Isolation, and Durability) credentials, but, since they’re typically built as single-instance databases, they can fall short when it comes to scalability and availability.
Relational databases provide a simple model for managing information and are most suited to dealing with data points that don’t change often – processing ecommerce transactions or tracking inventories, for instance.
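To make the tabular model concrete, here’s a minimal sketch in Python using the built-in sqlite3 module. The bets table, its columns, and the sample values are all hypothetical, purely for illustration; a production platform would run a server-based RDBMS such as MySQL.

```python
import sqlite3

# In-memory database for illustration only
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE bets (
        bet_id INTEGER PRIMARY KEY,   -- each row's unique identifier
        player_id INTEGER NOT NULL,   -- relates the bet back to a player record
        event TEXT NOT NULL,          -- each column holds one data attribute
        stake REAL NOT NULL,
        odds REAL NOT NULL
    )
""")
conn.execute(
    "INSERT INTO bets (player_id, event, stake, odds) VALUES (?, ?, ?, ?)",
    (42, "Team A vs Team B", 10.0, 2.0),
)
conn.commit()

# Query related data points: the potential payout on bet 1
payout = conn.execute("SELECT stake * odds FROM bets WHERE bet_id = 1").fetchone()[0]
print(payout)  # 20.0
```

Every record has to fit the schema declared up front, which is exactly what makes this model so predictable for data that doesn’t change shape often.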
Non-relational databases, or NoSQL databases, do not use a tabular model to store and retrieve data. In non-relational databases, data is stored in ‘document’ structures, enabling a wider range of information types to be stored in a variety of formats.
NoSQL is built for scalability, low latency, and availability, but struggles with the consistency and serializability of a traditional SQL database.
Since they’re generally faster and more flexible than their relational counterparts, non-relational databases are typically used for handling complex data.
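As a hypothetical sketch of the document model, here’s an accumulator bet represented as a nested document in Python. The field names are illustrative; document databases typically store and exchange records like this as JSON.

```python
import json

# A hypothetical accumulator bet as one self-contained document: nested and
# variable-length fields are fine here, which a fixed table schema can't offer
bet_doc = {
    "bet_id": 1,
    "player": {"id": 42, "country": "UK"},
    "event": "Team A vs Team B",
    "legs": [
        {"market": "match_winner", "selection": "Team A", "odds": 1.95},
        {"market": "total_goals", "selection": "over 2.5", "odds": 1.80},
    ],
    "stake": 10.0,
}

# Document stores typically serialize records to JSON
serialized = json.dumps(bet_doc)

# Combined odds across however many legs this particular document has
combined = 1.0
for leg in bet_doc["legs"]:
    combined *= leg["odds"]
print(round(combined, 2))  # 3.51
```

Another bet document could have three legs, or none, without any schema change — the flexibility that makes NoSQL a natural fit for complex, varied data.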
The newest iteration of database software is distributed SQL (DistSQL), though at present the number of these applications is still limited.
Distributed SQL platforms like YugabyteDB, MariaDB, and CockroachDB are relational databases that combine features from both SQL and NoSQL frameworks allowing for a combination of SQL consistency and serializability and the scalability, low latency, and high availability of NoSQL.
In February, open-source distributed SQL database, Yugabyte, announced that it achieved ISO/IEC 27001 certification, validating the company’s compliance and commitment to data and database security.
Off the back of the announcement, Yugabyte’s Director of Information Security and Compliance commented that the company remains “committed to delivering the necessary features and processes to ensure YugabyteDB can safely manage large amounts of critical data in production environments”. Looking to the future, as distributed SQL gains reputability, it could become another viable option for the iGaming industry.
Database requirements are largely dependent on individual use cases. For example, what a gaming platform needs from its database (the ability to store player data, game states, and environments) will be very different from a bank (which needs a fast ledger to facilitate simultaneous transactions). Each platform, and each application, will have different needs.
Even within the iGaming industry itself, different processes can call for different database models. For example, what’s needed to facilitate financial transactions is different from what’s needed for analytical processing.
When we think of industries that rely heavily on online transaction processing (OLTP), the financial sector (trading in particular) is typically the first that comes to mind. But today’s fast-paced iGaming industry is under just as much pressure to deliver seamless payments. For that reason, the OLTP side of the database is incredibly important.
Within the OLTP part of the software “SQL technology is mostly used”, according to Masis Yepremyan, Head of IT Operations Unit at BetConstruct. SQL is the most common database model used for OLTP database clusters. That’s largely because SQL can run queries against data incredibly quickly to facilitate huge volumes of simultaneous transactions in real-time.
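Here’s a minimal sketch of what an OLTP-style transaction looks like, again using Python’s built-in sqlite3 module for illustration (the wallets table and the place_bet helper are hypothetical): the funds check and the debit either commit together or roll back together, which is the atomicity that real-money transactions demand.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE wallets (player_id INTEGER PRIMARY KEY, balance REAL)")
conn.execute("INSERT INTO wallets VALUES (1, 100.0)")
conn.commit()

def place_bet(conn, player_id, stake):
    """Debit a player's wallet atomically; refuse the bet if funds are short."""
    try:
        with conn:  # the connection context manager wraps one transaction
            cur = conn.execute(
                "UPDATE wallets SET balance = balance - ? "
                "WHERE player_id = ? AND balance >= ?",
                (stake, player_id, stake),
            )
            if cur.rowcount == 0:
                # No row matched: insufficient funds, roll the whole thing back
                raise ValueError("insufficient funds")
        return True
    except ValueError:
        return False

print(place_bet(conn, 1, 30.0))   # True
print(place_bet(conn, 1, 500.0))  # False, and nothing was committed
print(conn.execute("SELECT balance FROM wallets WHERE player_id = 1").fetchone()[0])  # 70.0
```

A real platform would run thousands of such short transactions concurrently, which is exactly the workload OLTP clusters are tuned for.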
The analytical processing (OLAP) side of the database is responsible for analyzing aggregated historical data from the OLTP side using complex queries, scanning vast volumes of data at high speed. Broadly speaking, NoSQL databases are designed to provide high read and write throughput on large objects and are not intended for use with atomic-level data. Because of this, SQL is generally the better choice for OLAP workloads. However, there are exceptions to this rule, and some NoSQL-based platforms, such as MongoDB and Apache HBase, can support analytical processing.
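To sketch the contrast: where OLTP handles many small individual writes, an OLAP-style query aggregates across the whole history at once. The wagers table and its values below are hypothetical, for illustration only.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE wagers (player_id INTEGER, sport TEXT, stake REAL)")

# OLTP-style workload: many small individual writes as bets come in
conn.executemany(
    "INSERT INTO wagers VALUES (?, ?, ?)",
    [(1, "football", 10.0), (2, "football", 25.0), (1, "tennis", 5.0)],
)
conn.commit()

# OLAP-style query: aggregate across all historical rows with GROUP BY
totals = dict(
    conn.execute("SELECT sport, SUM(stake) FROM wagers GROUP BY sport ORDER BY sport")
)
print(totals)  # {'football': 35.0, 'tennis': 5.0}
```

At real scale these aggregations run over millions of rows, which is why platforms try to keep them off the transactional cluster entirely.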
One of the biggest challenges when choosing databases is finding a way to separate OLTP and OLAP database parts at the application level and handling the cleanup of old and outdated data within the OLTP part.
Such is the nature of real-time, real-money games, users of iGaming platforms expect nothing short of seamless performance. And since the database stores all the critical data essential for delivering these experiences, it’s important that the underlying infrastructure hosting that database doesn’t let you down.
The better your infrastructure performs, the better your database will perform. With that in mind, these are the major infrastructure factors the iGaming industry needs to weigh.
The central processing unit (CPU) is, for all intents and purposes, the brain of a computer. It’s responsible for executing instructions, processing inputs, and outputting results. The more cores a CPU has, the more it can multitask and compute multiple clusters of data at any single point in time.
It’s important to consider the number of cores in the server, and the frequency of each core to find the right balance between scale and performance. Typically, the higher the number of cores, the lower the performance of each. Single socket CPUs, for example, tend to offer the highest frequency cores.
The maximum RAM of a server is generally linked to the CPU, meaning that the maximum capacity of a server is, ultimately, defined by the CPU. Your infrastructure provider should offer a flexible amount of RAM that can be adjusted up and down, up to the physical limitations of the server. And the same goes for storage.
The type of disks used can have a huge impact on speed and performance. The difference between hard disk drive (HDD), solid state drive (SSD), and nonvolatile memory express (NVMe) can be huge. There’s already a massive leap in performance from hard disk to SSD level and an even bigger step up when it comes to NVMe.
To put this into numbers, an HDD can provide read/write speeds of up to around 150 MB/s. This increases to roughly 500 MB/s for SSDs and 3,500 MB/s for NVMe drives. A server with NVMe drives will, therefore, process information significantly faster, meaning that the database installed on it will run phenomenally quickly.
Industries like iGaming that require lightning-fast processing speeds will benefit from choosing an infrastructure provider offering the flexibility to choose custom disk specifications, including options for HDD, SSD, and NVMe disks. For the most part, the best fit for the industry involves low-latency, enterprise-grade SSD storage.
There’s one thing that always remains true, whether we’re talking about a database server, web server, or application server: low network capacity will restrict the information pipelines within them. So, it’s essential to choose infrastructure that supports fast download, upload, and port speeds.
Such is the nature of real-time, real-money games that users of iGaming platforms have zero tolerance for interruption, and downtime equates to losses for platforms and players alike, so prioritizing high-availability infrastructure with a good level of redundancy is a must. After all, if your server goes down, so does your database. The stakes are, quite literally, high. Lag, delays, and outages are all the more damaging in this industry, as we observed during the 2023 Super Bowl when William Hill U.S. experienced a “devastating systems outage during the second quarter”.
Even within the parameters of the iGaming industry the variation in infrastructure specifications required can be huge. It’s important to choose an infrastructure provider who will work with you to determine and execute exactly what your database application needs from its underlying infrastructure to work optimally. To achieve this, a good vendor will provide you with a solutions architect to assess your database, processor, and network and propose an optimized solution.
Behind every bet, wager, and transaction lies a database. And in the iGaming industry, where odds are constantly changing and players’ real money is on the line, choosing the right database technology on the right infrastructure is critical to success.
Given the massive variety in online gambling experiences offered by iGaming platforms, from online casinos to slots to live-betting and, more recently, micro-betting (betting on individual moments in a game), there isn’t a one-size-fits-all solution.
As the iGaming industry continues to grow, hosting your database on optimized infrastructure that balances performance, availability, and customization will become even more important.
To find out more about our custom iGaming hosting solutions visit our industry page. If you’d like to chat about this blog, iGaming, or my extensive hat collection, I’d love to hear from you.
Jamie Daniel is our iGaming expert and understands the unique challenges faced by the industry. When not cultivating a magnificent beard, he’s working from our UK office.