Streaming has become a mainstay of modern life. Over-the-top (OTT) streaming services and cloud communications are omnipresent in our daily routines – so much so that combined revenues for advertising-based (AVOD) and subscription-based (SVOD) video-on-demand platforms are projected to reach $20 billion by 2029.
Global mobile device penetration is driving demand for streaming services like never before. But to understand the streaming industry today, we’ve got to look back to the 1990s and the development of the first dedicated streaming protocols.
In this blog post we’ll take a whistle-stop tour through the history of streaming protocols to better understand the foundational technologies holding up the streaming industry as we know it today.
A streaming protocol is a set of rules dictating the way data gets communicated between devices over the internet. In simple terms, streaming protocols segment data into small chunks so that files can be transmitted over the internet more easily.
Programs called codecs compress and decompress the media files themselves, enabling content to be delivered efficiently and played back on the end user’s device.
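The chunking idea at the heart of these protocols can be sketched in a few lines of Python. This is a simplified illustration only – real protocols segment along frame or time boundaries and wrap chunks in container formats rather than splitting raw bytes:

```python
def segment(data: bytes, chunk_size: int = 4096) -> list[bytes]:
    """Split a media byte stream into fixed-size chunks for transmission."""
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

def reassemble(chunks: list[bytes]) -> bytes:
    """A receiver rebuilds the stream by joining chunks in order."""
    return b"".join(chunks)

stream = bytes(10_000)   # stand-in for encoded media data
chunks = segment(stream)
print(len(chunks))       # 10,000 bytes split into 3 chunks of up to 4,096
```

Each chunk can be requested, transmitted, and buffered independently, which is what lets playback begin before the whole file has arrived.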
Protocols have evolved significantly over the last couple of decades, but the very first technologies emerged in the 1990s. And that’s where our chronology begins.
The 90s was a decade of many streaming firsts. In 1993, a band called Severe Tire Damage executed the first successful live stream. In 1995, RealNetworks created RealPlayer - the first media player capable of live streaming. And in 1999, George Washington University held the world’s first ever presidential webcast.
But most importantly, this era produced two formative protocols: Real Time Streaming Protocol (RTSP) and Real Time Messaging Protocol (RTMP).

RTSP was developed through a partnership between Netscape, RealNetworks, and Columbia University and published as a standard in 1998, described by Netscape as “an application-level protocol for control over the delivery of data with real-time properties”. The protocol was innovative because it enabled users to play, pause, and rewind media streams – just like using a VCR.

Macromedia, which had acquired the Flash technology in 1996, followed with RTMP, releasing it alongside Flash Communication Server MX in 2002. Built on top of the Transmission Control Protocol (TCP), RTMP’s objective was to enable live streaming over the internet by connecting the SWF-based (Small Web Format) Flash Player to dedicated media servers, delivering video, audio, and text data across devices. This game-changing development meant that data could be sent from the streaming server to the video player in real time.
It was in the 2000s that some of the biggest hitters in the streaming industry today were born. In 2005, Macromedia was acquired by Adobe and Adobe Flash Player entered the scene. That same year YouTube was launched, shortly followed by justin.tv (soon to become Twitch) in 2006, and the holy grail of streaming services, Netflix, in 2007.
But the decade also marked an important crossroads in streaming protocol development: in 2010, Steve Jobs made clear in his “Thoughts on Flash” letter that Apple’s devices would never support Flash Player. Unsurprisingly, Apple had been working on its own proprietary format – HTTP Live Streaming (HLS).
First released in 2009, HLS is an HTTP-based adaptive streaming protocol. It works by breaking media into short segments, encoding them at multiple quality levels, and delivering them over ordinary HTTP to the viewer’s device. Initially the protocol was only supported on Apple devices, but it has since become one of the most widely used protocols in the industry, supported by virtually every modern browser and device.
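In practice, an HLS client begins playback by downloading a plain-text playlist (an .m3u8 file) that lists the media segments in order. A minimal illustrative playlist might look like this (the filenames and durations here are hypothetical):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:6.0,
segment0.ts
#EXTINF:6.0,
segment1.ts
#EXTINF:6.0,
segment2.ts
#EXT-X-ENDLIST
```

The player simply fetches each listed segment over HTTP in turn – which is why HLS works with ordinary web servers and CDNs rather than dedicated streaming servers.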
The streaming protocol wars had begun and, around the same time, Adobe was developing its own HTTP-based protocol – HTTP Dynamic Streaming (HDS).
Adobe’s HDS protocol was developed as an adaptive bitrate (ABR) streaming protocol, delivering MP4 content over HTTP to facilitate on-demand streaming. HDS was designed for use with Adobe products only and was never supported by Apple. Its proprietary nature, combined with the eventual discontinuation of Flash Player, meant that HDS never achieved wide adoption and eventually fell out of mainstream use.
Microsoft was also riding the HTTP-based train: in 2008, as the Apple–Adobe rivalry was brewing, Microsoft announced the release of its own HTTP-based adaptive streaming extension – Microsoft Smooth Streaming (MSS).
MSS offered several unique features. Most importantly, the protocol could monitor bandwidth conditions, CPU utilization, and playback window resolution to adapt the quality of delivered content in real time. That made MSS the perfect technology to trial at major sporting events such as the 2008 Beijing Olympics and the 2009 Wimbledon Championships.
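The adaptive logic behind protocols like MSS can be sketched with a toy heuristic: pick the highest-bitrate rendition that fits within the measured bandwidth, with a safety margin. This is a simplified illustration with made-up bitrate values, not Microsoft’s actual algorithm:

```python
# Available renditions, in bits per second (illustrative values)
RENDITIONS_BPS = [800_000, 1_500_000, 3_000_000, 6_000_000]

def pick_rendition(measured_bandwidth_bps: float, headroom: float = 0.8) -> int:
    """Choose the highest bitrate fitting within a safety margin of the
    measured bandwidth; fall back to the lowest rendition otherwise."""
    budget = measured_bandwidth_bps * headroom
    candidates = [r for r in RENDITIONS_BPS if r <= budget]
    return max(candidates) if candidates else min(RENDITIONS_BPS)

print(pick_rendition(4_000_000))  # 3_000_000 fits the 3.2 Mbps budget
print(pick_rendition(500_000))    # no rendition fits; falls back to 800_000
```

Real players refine this with buffer occupancy, throughput smoothing, and switch-frequency penalties, but the core idea – re-evaluating rendition choice per segment – is the same.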
It was in the noughties that some of the most seminal streaming giants first emerged, but it was in the 2010s that live streaming claimed its true moment of glory. Throughout the decade video consumption grew exponentially, and with public appetite for on-demand streaming services surging, the chaotic web of competing technologies was crying out for simplification.
In 2010, a group of streaming and media companies including Netflix, Google, Microsoft, and Adobe united behind an industry streaming standard – Dynamic Adaptive Streaming over HTTP (MPEG-DASH), ratified by ISO in 2012. MPEG-DASH works much like HLS: data is segmented into chunks, described in a manifest, and fetched by the player for decoding and playback. The key difference is that, unlike HLS, MPEG-DASH was developed as an open, codec-agnostic standard.
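Instead of an .m3u8 playlist, a DASH client fetches an XML manifest called a Media Presentation Description (MPD). Here is a heavily trimmed, illustrative example – the URLs, durations, and bitrates are hypothetical:

```xml
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="static"
     mediaPresentationDuration="PT30S" minBufferTime="PT2S">
  <Period>
    <AdaptationSet mimeType="video/mp4">
      <!-- Two bitrate renditions; the client switches between them -->
      <Representation id="720p" bandwidth="3000000" width="1280" height="720">
        <BaseURL>video_720p.mp4</BaseURL>
      </Representation>
      <Representation id="360p" bandwidth="800000" width="640" height="360">
        <BaseURL>video_360p.mp4</BaseURL>
      </Representation>
    </AdaptationSet>
  </Period>
</MPD>
```

Because the manifest only describes representations and never mandates a codec, the same mechanism works whether the segments are encoded with H.264, VP9, AV1, or anything else – the codec-agnosticism that set MPEG-DASH apart.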
MPEG-DASH marked a major shift in the direction of standardization, but it certainly didn’t bring further innovation to a halt. After acquiring Global IP Solutions in 2010, Google developed its own open-source project – Web Real-Time Communications (WebRTC).
WebRTC was designed to enable real-time, peer-to-peer communication between browsers. The open-source, plugin-free framework lets developers build real-time audio, video, and data features directly into browser-based software, with media flowing between peers rather than through an intermediary media server (signaling and NAT-traversal servers are still needed to establish the connection). It has since become one of the most widely used technologies for online communications, and the global WebRTC market is expected to reach roughly $46.3 billion by 2027.
By 2012 another streaming protocol had emerged. Secure Reliable Transport (SRT) was developed by Haivision and first demoed at the International Broadcasting Convention (IBC) in 2013. The video transport protocol was designed to deliver low-latency, optimized streams over unpredictable networks. In 2017, Haivision released SRT as an open-source protocol on GitHub and teamed up with Wowza to create the SRT Alliance.
At the close of the decade, in 2019, a new type of HTTP-based protocol emerged. High Efficiency Streaming Protocol (HESP) was first introduced by THEO Technologies at the 2019 National Association of Broadcasters (NAB) Show.
The adaptive HTTP-based protocol replaced the standard segmentation-based streaming approach with an innovative frame-based approach. The objective was to deliver lower latency and better viewer experiences at scale over standard HTTP infrastructure and content delivery networks (CDNs). In 2020, THEO Technologies partnered with Synamedia to form the HESP Alliance – an organization dedicated to accelerating the adoption and wider standardization of the HESP streaming protocol.
Over the last couple of years, streaming services for entertainment and communications have gone from novelty to necessity. And now an even wider range of products have started to emerge. Products like live and on-demand fitness, live e-commerce, video game streaming, and even telehealth.
In 2022, the worldwide virtual fitness market was valued at $16.4 billion. In China, live commerce is forecast to outperform traditional commerce by the end of 2023. And 40% of video development professionals believe that streaming will show the most growth out of any gaming platform by 2025.
We might only be three years into the decade, but streaming protocol development has continued at a healthy pace. In December 2022, YoloLiv released the Ardent Streaming Protocol (ASP). Though still in beta, the smart streaming protocol works by storing and transmitting larger volumes of redundant data. In so doing, ASP reduces quality loss even under unstable internet conditions.
Another recent development is Adaptive Reliable Transport (ART). A joint venture between Teradek and Amimon, the streaming protocol was launched in 2022 and supports real-time, bi-directional streaming between two ART systems – something that Michael Gailing, General Manager of Live Production at Teradek, believes makes ART “the most impressive video transport protocol available today”.
There’s also Media Relay Version 2 (MRV2). Developed by Triton Digital, this proprietary streaming transport protocol was designed for use between Triton Digital’s master control room and CDN to expedite and improve server-side ad insertion.
The streaming industry as we know it today has been heavily influenced by protocol development over the last thirty years.
Now, as demand for seamless content delivery mounts further, the pressure is on to develop streaming protocols capable of ever-lower latency at ever-greater scale. Low-latency variants (Low-Latency HLS and Low-Latency DASH, for instance) have already hit the market. And last year TheoPlayer announced the release of HESP.live – a new HTTP-based real-time video streaming infrastructure capable of sub-second latency at scale.
But the landscape is still highly saturated. And as Erik Otto, Chief Executive of Compliance at Mediaproxy, writes, “the inherent challenge in this for broadcasters and service providers is to cope with many variations of current streaming protocols and technical standards”.
To learn even more about the history of streaming protocols, including additional industry insights, read our whitepaper.