Streaming has become a mainstay of modern-day life. Over-the-top (OTT) streaming services and cloud communications are omnipresent in our daily lives. So much so that, by 2030, advertising-based video on demand (AVOD) and subscription video on demand (SVOD) market values are projected to reach $63.1 billion and $237.4 billion, respectively.
Mobile device penetration globally is driving demand for streaming services like never before. But to understand the streaming industry today, we’ve got to look back to the 1990s and the development of the very first streaming protocol – RTMP.
In this blog post we’ll take a whistle-stop tour through the history of streaming protocols to better understand the foundational technologies holding up the streaming industry as we know it today.
A streaming protocol is a set of rules dictating the way data gets communicated between devices over the internet. In simple terms, streaming protocols segment data into small chunks so that files can be transmitted over the internet more easily.
Programs called codecs are used to compress and decompress media files enabling content to be delivered and played back to the end-user.
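The segmentation idea at the heart of most streaming protocols can be sketched in a few lines of Python. This is a generic illustration only; the chunk size and payload here are arbitrary and don't reflect any particular protocol's actual values:

```python
def segment(data: bytes, chunk_size: int = 4096) -> list[bytes]:
    """Split an encoded media payload into fixed-size chunks for transmission."""
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

payload = bytes(10_000)               # stand-in for compressed media data
chunks = segment(payload)             # 3 chunks: 4096 + 4096 + 1808 bytes
assert b"".join(chunks) == payload    # reassembly on the receiving end is lossless
```

Each chunk can then be transmitted independently, which is what lets a player start playback before the whole file has arrived.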
Protocols have evolved significantly over the last couple of decades, but the very first technologies emerged in the 1990s. And that’s where our chronology begins.
| 1996 | 2008 | 2009 | 2011 | 2012 | 2017 | 2019 | 2022 |
|---|---|---|---|---|---|---|---|
| RTMP, RTSP | MSS | HLS, HDS | WebRTC | MPEG-DASH | SRT | HESP | ASP, ART |
The 90s was a decade of many streaming firsts. In 1993, a band called Severe Tire Damage executed the first successful live stream. In 1995, RealNetworks created RealPlayer - the first media player capable of live streaming. And in 1999, George Washington University held the world’s first ever presidential webcast.
But most importantly, the 90s saw the release of two formative protocols: Real Time Messaging Protocol (RTMP) and Real Time Streaming Protocol (RTSP).
Macromedia developed RTMP in 1996 as a protocol built on the Transmission Control Protocol (TCP). The objective was to enable live streaming over the internet by connecting the original Flash Player to dedicated media servers. Files were sent in Small Web Format (SWF) to deliver video, audio, and text data across devices. This game-changing development meant that data could be sent from streaming servers to video players in real time.
Shortly after the release of RTMP, a first draft for another protocol was developed through a partnership between Netscape, RealNetworks, and Columbia University. RTSP was published in 1998, described by Netscape as “an application-level protocol for control over the delivery of data with real-time properties”. The protocol was innovative because it enabled users to play, pause, and rewind media streams – just like using a VCR.
It was in the 2000s that some of the biggest hitters in the streaming industry today were born. In 2005, Macromedia was acquired by Adobe and Adobe Flash Player entered the scene. That same year YouTube was launched, shortly followed by justin.tv (soon to become Twitch) in 2006, and the holy grail of streaming services, Netflix, in 2007.
But the decade also marked an important crossroads in streaming protocol development after Steve Jobs announced that Apple would cease to support Flash Player in 2010. Unsurprisingly, Apple had been working on its own proprietary format – HTTP Live Streaming (HLS).
First released in 2009, HLS was an HTTP-based streaming protocol. HLS works by breaking down media data into chunks, compressing these chunks, and transmitting the data over HTTP to a viewer’s device. Initially the streaming protocol was only supported by iOS, but it has since become a widely used protocol, supported by many different browsers and devices.
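The chunked structure HLS relies on is visible in its playlist format: the player fetches a plain-text manifest over HTTP, then downloads each listed segment in order. A minimal media playlist might look like this (segment names and durations are illustrative only):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:6.0,
segment0.ts
#EXTINF:6.0,
segment1.ts
#EXTINF:4.5,
segment2.ts
#EXT-X-ENDLIST
```

Because the whole exchange is ordinary HTTP, HLS traffic passes through standard web servers, caches, and CDNs without special infrastructure.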
The streaming protocol wars had begun and, that same year, Adobe was also working on its own HTTP-based protocol – HTTP Dynamic Streaming (HDS).
Adobe’s HDS protocol was developed as an adaptive bitrate streaming (ABS) protocol, delivering MP4 content over HTTP to facilitate on-demand streaming. HDS was designed for use with Adobe products only and was not supported by Apple. Its proprietary nature, combined with the eventual discontinuation of Flash Player, meant that HDS was never widely adopted and eventually fell out of mainstream use.
Microsoft was also riding the HTTP-based train. In 2008, whilst the Apple-Adobe feud was raging on, Microsoft announced the release of a new HTTP-based adaptive streaming extension – Microsoft Smooth Streaming (MSS).
MSS offered a lot of unique features. Most importantly, the streaming protocol could detect bandwidth conditions, CPU utilization, and playback window resolution to target the delivery of HD content. And that meant that MSS was the perfect technology to trial at major sporting events such as the 2008 Beijing Olympics and the 2009 Wimbledon Championships.
It was in the noughties that some of the most seminal streaming giants first emerged, but it was in the 2010s that live-streaming claimed its true moment of glory. Throughout the decade video consumption grew exponentially. And with public appetite for on-demand streaming services surging, the chaotic web of technologies was crying out for simplification.
In 2010, a group of streaming and media companies including Netflix, Google, Microsoft, and Adobe united to develop an industry streaming standard – Dynamic Adaptive Streaming over HTTP (MPEG-DASH). The way MPEG-DASH works is very similar to HLS: data is segmented into chunks and decoded for playback. The key difference is that, unlike HLS, MPEG-DASH was developed as an open-source, codec-agnostic standard.
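The "adaptive" part of adaptive bitrate streaming, common to HLS and MPEG-DASH alike, can be sketched simply: the player measures its throughput and picks the highest-bitrate rendition it can sustain. The bitrate ladder and safety margin below are made-up illustrative values, not taken from any real manifest:

```python
# Illustrative bitrate ladder (bits per second), like the renditions a
# DASH MPD or HLS master playlist might advertise -- values are made up.
RENDITIONS = [400_000, 1_200_000, 3_500_000, 8_000_000]

def pick_rendition(measured_bandwidth: int, safety: float = 0.8) -> int:
    """Choose the highest rendition that fits within a safety margin of
    measured throughput; fall back to the lowest rendition otherwise."""
    budget = measured_bandwidth * safety
    viable = [r for r in RENDITIONS if r <= budget]
    return max(viable) if viable else min(RENDITIONS)

print(pick_rendition(5_000_000))  # 3500000: 8 Mbps exceeds the 4 Mbps budget
print(pick_rendition(300_000))    # 400000: nothing fits, take the lowest
```

Real players re-run a decision like this for every segment, which is why a stream can shift quality mid-playback as network conditions change.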
MPEG-DASH marked a major shift in the direction of standardization, but it certainly didn’t bring further innovation to a halt. After acquiring Global IP Solutions in 2010, Google developed its own open-source project – Web Real-Time Communications (WebRTC).
WebRTC was designed to enable real-time, peer-to-peer communication between browsers. The open-source, plugin-free framework allowed developers to facilitate real-time communication within browser-based software without intermediary servers. It has since become one of the most widely used streaming protocols for online communications, with a market valued at $9.56 billion in 2025 and expected to reach $94.07 billion by 2032.
By 2012 another streaming protocol had emerged. Secure Reliable Transport (SRT) was developed by Haivision and first demoed at the International Broadcasting Convention in 2013. This video transport protocol was designed to deliver low-latency, optimized streams over unpredictable networks. By 2017, SRT had been released as an open-source protocol on GitHub and Haivision teamed up with Wowza to create the SRT Alliance.
At the close of the decade, in 2019, a new type of HTTP-based protocol emerged. High Efficiency Streaming Protocol (HESP) was first introduced by THEO Technologies at the 2019 National Association of Broadcasters (NAB) Show.
The adaptive HTTP-based protocol replaced the standard segmentation-based streaming approach with an innovative frame-based approach. The objective was to maintain lower latency and better viewer experiences at scale over standard HTTP infrastructure and content delivery networks (CDNs). By 2020, THEO Technologies had partnered with Synamedia to form the HESP Alliance – an organization dedicated to accelerating the adoption and wider standardization of the HESP streaming protocol. Following the acquisition of THEO Technologies by Dolby in 2024, the alliance has since been disbanded, but the HESP protocol remains in use.
Over the last couple of years, streaming services for entertainment and communications have gone from novelty to necessity. And now an even wider range of products have started to emerge. Products like live and on-demand fitness, live e-commerce, video game streaming, and even telehealth.
The worldwide virtual fitness market is projected to reach a value of $106.4 billion by 2030. In China, live commerce already outperforms traditional commerce and is set to surpass the one-trillion-dollar valuation mark by the end of 2026. And, fueled by increased adoption of video streaming apps on mobile devices, game streaming platforms like Twitch are going strong (with billions of hours watched on the platform in 2025 alone).
Since the start of the decade, streaming protocol advancements have continued at a healthy pace, with the development of the following new protocols:
Ardent Streaming Protocol (ASP) | 2022: Released as a proprietary technology by YoloLiv, ASP is a smart streaming protocol that works by storing and transmitting larger volumes of redundant data. In so doing, ASP reduces quality loss even under unstable internet conditions.
Adaptive Reliable Transport (ART) | 2022: A joint venture by Teradek and Amimon, ART supports both real-time and bi-directional streaming between two ART systems.
Media Relay Version 2 (MRV2) | 2022: Developed by Triton Digital, this proprietary streaming transport protocol was designed for use between Triton Digital’s master control room and CDN to expedite and improve server-side ad insertion.
High Efficiency Streaming Protocol Live (HESP.live) | 2022: Released by THEO Technologies (now Dolby OptiView), HESP.live uses the HESP protocol to enable live interactivity. It was the first HTTP-based real-time video streaming infrastructure capable of achieving sub-second latency at scale.
WebRTC-HTTP Ingestion Protocol (WHIP) | 2025: Developed in collaboration with Millicast (now part of Dolby Laboratories), WHIP is an open IETF-defined standard. A modern alternative to older ingestion protocols like RTMP, WHIP simplifies how live media streams are ingested into streaming platforms using WebRTC.
Arguably the most impactful protocol of recent years, however, is Media over QUIC (MoQ). MoQ is a next-generation streaming protocol that leverages modern technologies including QUIC, WebTransport and HTTP/3 to overcome the challenges associated with traditional streaming protocols based on TCP. Using QUIC’s UDP-based multiplexing, low-latency connection setup, and congestion control, MoQ delivers media efficiently and achieves sub-second latency (even across unpredictable network conditions). With global audiences expecting increasingly seamless streaming experiences, MoQ marks a significant milestone for developers and streaming platforms.
The protocol is still being developed as an open standard by the Internet Engineering Task Force (IETF) and a final standard is expected to be released in 2026. In 2025, Cloudflare launched the first global MoQ relay network on its existing infrastructure, marking an important milestone toward practical adoption. The same year, real-time video streaming platform, nanocosmos, unveiled an upgraded global streaming infrastructure based on MoQ, marking the first MoQ-based real-time streaming solution on a CDN foundation that is ready for worldwide deployment.
The streaming industry as we know it today has been heavily influenced by protocol development over the decades.
Now, as demand for seamless content delivery mounts, the industry continues to strive towards higher levels of video quality. Latency reduction remains a key priority in 2026 and MoQ is currently the leading player - redefining the industry standard for ultra-low-latency streaming at scale.
Today, the influence of AI is also at play, with AI-powered algorithms being used to optimize content delivery with predictive analytics. New challenges are also present. Tariffs introduced by the Trump administration targeting hardware and networking equipment risk impairing wider adoption of cutting-edge codecs, protocols, and low-latency standards. But despite these unknowns, one thing is clear: streaming protocols have come a long way since the 90s and innovation isn’t slowing down just yet.
To learn even more about the history of streaming protocols, including additional industry insights, read our whitepaper.

Frances is proficient in taking complex information and turning it into engaging, digestible content that readers can enjoy. Whether it's a detailed report or a point-of-view piece, she loves using language to inform, entertain and provide value to readers.