
The history of streaming told through protocols

A whitepaper exploring how the first protocol paved the way for a multi-billion-dollar industry

Streaming is the lifeblood of modern-day entertainment and communications. But streaming’s ubiquity is a relatively recent phenomenon. To understand where the streaming industry stands today, we must look back to the 90s and the development of the first streaming protocols.

In this whitepaper we’ll walk through a chronology of the streaming protocols that have paved the way for this multi-billion-dollar industry, tracing the origins of each protocol and the latest innovations fueling the streaming industry today.

Streaming industry overview

The global video-on-demand (VoD) market is forecast to grow from $230.6 billion in 2026 to $855.9 billion in 2035, at a compound annual growth rate of 15.7%.

One of the largest segments in streaming is subscription video-on-demand (SVOD). The SVOD market is projected to achieve a market volume of $123.68 billion and attract 1.78 billion users by the end of 2030.

Likewise, research points towards a surge in ad-supported video-on-demand (AVOD) revenues in the coming years, with the market expected to reach $226.573 million by 2030. The global live streaming market is also growing rapidly, forecast to reach a value of $345.13 billion by 2030.

Low barriers to entry, ease of accessibility, and global mobile device penetration are driving demand for streaming services like never before. A testament to the industry’s success, prominent players in the streaming industry (think Netflix, YouTube, Twitch, and communications platforms like Skype and Zoom) quickly became household names.

Now, the market is becoming increasingly monopolized. In December 2025, Netflix announced its planned acquisition of Warner Bros., including its film and television studios, HBO, and HBO Max. With a deal valuation of $82.7 billion, this is a massive consolidation of streaming media power, embedding industry hierarchies and signaling that highly scaled content libraries and production capacities matter more than ever.

But none of this would have been possible without the foundational frameworks that allow devices to communicate over the internet: streaming protocols.

What are streaming protocols?

Streaming protocols define the standards and methods for transmitting multimedia content over the internet from one device to another. They function as regulatory blueprints, standardizing the rules for how media files are encoded, transmitted, and decoded, ensuring a seamless delivery and playback experience for the end user.

For example, most video and audio streaming protocols segment data into small chunks that can be transmitted more easily. This process relies on codecs (programs that compress or decompress media files) and is how content is delivered and played back to end-users.
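The segmentation step can be sketched in a few lines of Python. This is an illustrative model only, not any specific protocol’s chunking logic; the segment duration and bitrate constants are hypothetical.

```python
# Illustrative sketch (not any specific protocol): split an encoded media
# payload into fixed-duration "segments" the way HTTP-based protocols do.
SEGMENT_SECONDS = 4          # hypothetical target segment duration
BYTES_PER_SECOND = 250_000   # hypothetical encoded bitrate (~2 Mbps)

def segment(payload: bytes, seconds=SEGMENT_SECONDS, rate=BYTES_PER_SECOND):
    """Yield (index, chunk) pairs sized to roughly `seconds` of media."""
    size = seconds * rate
    for i in range(0, len(payload), size):
        yield i // size, payload[i:i + size]

media = bytes(3_100_000)  # stand-in for ~12.4 s of encoded video
chunks = list(segment(media))
print(len(chunks))  # 4 segments: three full-sized, one partial
```

Each chunk can then be requested, delivered, and decoded independently, which is what makes HTTP-based delivery and mid-stream quality switching possible.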

There are many different streaming protocols, each with unique strengths and weaknesses. This whitepaper explores each of the major streaming protocols chronologically, piecing together a comprehensive history of the industry’s impressive evolution.

It will cover the following streaming protocols:

  • Real Time Messaging Protocol (RTMP)

  • Real Time Streaming Protocol (RTSP)

  • HTTP Live Streaming (HLS)

  • HTTP Dynamic Streaming (HDS)

  • Microsoft Smooth Streaming (MSS)

  • Dynamic Adaptive Streaming over HTTP (MPEG-DASH)

  • Web Real-Time Communications (WebRTC)

  • Secure Reliable Transport (SRT)

  • High Efficiency Streaming Protocol (HESP)

  • WebRTC-HTTP Ingestion Protocol (WHIP)

  • Media over QUIC (MoQ)

Major streaming protocols by release date

The 90s - RTMP and RTSP

It was during the 90s that the key technologies that would change the trajectory of the streaming industry first emerged. This was a decade of many firsts: the first live video stream, the first live streaming media player, and even the first presidential webcast.

1993 - The first live stream
1995 - The first live streaming media player
1999 - The first presidential webcast

Today, live streaming is part and parcel of everyday life. But it wasn’t until 1993 that the first live video and audio stream was successfully executed. It all started with a group of computer scientists who doubled as musicians: on June 24, 1993, their band Severe Tire Damage streamed one of its gigs over a network called the multicast backbone (Mbone).

In 1995, RealNetworks (formerly Progressive Networks) created RealPlayer, the first media player capable of live streaming. It played host to the very first public live stream and webcast: a baseball game between the New York Yankees and the Seattle Mariners. By 1997, the company had commercialized live video streaming with the launch of its RealVideo program.

On November 8, 1999, George Washington University held the world’s first presidential webcast. Bill Clinton participated in the event, entitled “Third Way Politics in the Information Age”, which set a precedent for the popular uptake of live streaming.

But the most notable development of the decade was the release of two seminal streaming protocols: Real Time Messaging Protocol (RTMP) and Real Time Streaming Protocol (RTSP).

Real Time Messaging Protocol (RTMP)

In 1996, Macromedia (the multimedia software company later acquired by Adobe) developed RTMP. Built on the Transmission Control Protocol (TCP), the protocol was designed to connect the original Flash Player to dedicated media servers and facilitate live streaming over the internet.

How does RTMP work?

RTMP sent files in an Adobe Flash format called Small Web Format (SWF) and could deliver audio, video, and text data across devices. The ability to transmit data from a server to a video player in real time was game changing.

After the discontinuation of Flash in 2020, RTMP fell out of favor for general use. But the protocol is still used due to some notable advantages including low latency and reduced overheads. These days the protocol is used principally for delivering encoded content to media servers, streaming platforms, and social media platforms.
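An RTMP session begins with a fixed handshake before any media flows. The sketch below builds the client’s opening bytes (C0 and C1) as laid out in Adobe’s RTMP specification; the actual socket connection and the server’s reply are omitted.

```python
import os
import struct
import time

# Sketch of the client side of the RTMP handshake (C0 + C1).
# Socket I/O and the server's S0/S1/S2 replies are omitted.
def rtmp_c0_c1() -> bytes:
    c0 = b"\x03"                                              # protocol version 3
    epoch = struct.pack(">I", int(time.time()) & 0xFFFFFFFF)  # 4-byte timestamp
    zero = b"\x00" * 4                                        # 4 zero bytes
    random_fill = os.urandom(1528)                            # 1528 random bytes
    c1 = epoch + zero + random_fill                           # C1 is exactly 1536 bytes
    return c0 + c1

packet = rtmp_c0_c1()
print(len(packet))  # 1537: a 1-byte C0 followed by a 1536-byte C1
```

Only after the server echoes the handshake does the connection carry RTMP’s multiplexed audio, video, and command chunks.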

Real Time Streaming Protocol (RTSP)

Around the same time that Macromedia was developing RTMP, a first draft for another protocol was being submitted. RTSP was developed under a partnership between RealNetworks, Netscape, and Columbia University, offering something truly innovative at the time: VCR-like capabilities that let users control the viewing experience with the ability to play, pause, and rewind media streams.

The official 1998 IETF Standards Track memo for RTSP (RFC 2326) defines the protocol as “an application-level protocol for control over the delivery of data with real-time properties”.

The memo continues:

“RTSP provides an extensible framework to enable controlled, on-demand delivery of real-time data, such as audio and video. Sources of data can include both live data feeds and stored clips. This protocol is intended to control multiple data delivery sessions, provide a means for choosing delivery channels such as UDP, multicast UDP and TCP, and provide a means for choosing delivery mechanisms based upon RTP”.

How does RTSP work?

After being published in 1998, RTSP quickly became the leading protocol for audio and video streaming. And though it has since been eclipsed by HTTP-based technologies, it is still widely adopted across use cases including IP cameras, robotics, and IoT devices due to its simple design, compatibility across devices, and low latency.
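The VCR-like control described above is expressed as plain-text requests (DESCRIBE, SETUP, PLAY, PAUSE, TEARDOWN) with a CSeq sequence header. The sketch below builds a few of these requests; the stream URL and session ID are hypothetical placeholders.

```python
# Minimal sketch of the VCR-style control requests an RTSP client sends.
# The URL and session ID below are hypothetical placeholders.
def rtsp_request(method: str, url: str, cseq: int, **headers) -> str:
    lines = [f"{method} {url} RTSP/1.0", f"CSeq: {cseq}"]
    lines += [f"{k}: {v}" for k, v in headers.items()]
    return "\r\n".join(lines) + "\r\n\r\n"

url = "rtsp://example.com/stream"   # placeholder camera/stream URL
print(rtsp_request("DESCRIBE", url, 1))
print(rtsp_request("SETUP", url + "/track1", 2,
                   Transport="RTP/AVP;unicast;client_port=8000-8001"))
print(rtsp_request("PLAY", url, 3, Session="12345678"))
```

Note that RTSP only controls the session; the media itself typically travels separately over RTP, as the memo quoted above describes.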

RTMP and RTSP were leading protocols of their time but have been largely replaced by HTTP-based and adaptive bitrate streaming technologies which are easier to scale to the demands of large broadcasts.

What is HTTP?

HTTP stands for Hypertext Transfer Protocol. HTTP is a method for encoding and transferring information between a client and a web server and is the principal protocol for the transmission of information over the internet.

HTTP streaming involves push-style data transfer where a web server continuously sends data to a client over an open HTTP connection.

HTTP 1.0 was first introduced in 1996, with subsequent updates HTTP 1.1, 2, and 3 brought out in 1997, 2015, and 2022, respectively.

HTTP/3 remains the most recent update, with no HTTP/4 on the horizon in 2026. This is due in large part to the parallel development of several new, advanced technologies including Media over QUIC (MoQ).

What is adaptive bitrate streaming?

First created in 2002, adaptive bitrate streaming (ABS) is a technology for improving HTTP streaming.

ABS dynamically adjusts the compression level and video quality of streams to match bandwidth availability so that streams can be delivered efficiently and at the highest usable quality.
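The core of that adjustment is a simple decision: pick the highest-quality rendition whose bitrate fits within the bandwidth you have measured. The sketch below shows one common flavor of this logic; the bitrate ladder and safety factor are hypothetical values, not any platform’s actual configuration.

```python
# Sketch of the core adaptive-bitrate decision: pick the highest rendition
# whose bitrate fits within a safety fraction of measured throughput.
# The bitrate ladder below is a hypothetical example.
LADDER_KBPS = [400, 800, 1600, 3200, 6000]   # low -> high quality

def choose_rendition(measured_kbps: float, safety: float = 0.8) -> int:
    budget = measured_kbps * safety          # leave headroom for variance
    usable = [b for b in LADDER_KBPS if b <= budget]
    return max(usable) if usable else LADDER_KBPS[0]  # never stall entirely

print(choose_rendition(5000))   # 3200: 6000 kbps exceeds the 4000 kbps budget
print(choose_rendition(300))    # 400: fall back to the lowest rung
```

Real players re-run this decision on every segment, which is why a stream can degrade gracefully and recover as network conditions change.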

The 00s - HLS, HDS, and MSS

At the turn of the millennium live streaming and VoD started gaining even more traction. This was the decade that some of the biggest names in the industry were born. YouTube hit the scene in 2005, followed by Twitch (then justin.tv) in 2006, and Netflix in 2007. As access to the internet and mobile technology widened, internet video streaming hit the masses.

In 2005, Adobe acquired Macromedia and Adobe Flash fast became a mainstay of video streaming. But in 2010, three years after the iconic release of the first iPhone, Steve Jobs announced that Apple would cease to support the platform.

In his open letter, Thoughts on Flash, Jobs writes:

“I wanted to jot down some of our thoughts on Adobe’s Flash products so that customers and critics may better understand why we do not allow Flash on iPhones, iPods and iPads. Adobe has characterized our decision as being primarily business driven – they say we want to protect our App Store – but in reality it is based on technology issues. Adobe claims that we are a closed system, and that Flash is open, but in fact the opposite is true.”

Needless to say, Apple had been working on its own proprietary format - HLS.

HTTP Live Streaming (HLS)

HLS is an adaptive HTTP-based streaming protocol that was first released in 2009. HLS breaks video and audio data into chunks which are compressed and transmitted over HTTP to the viewer’s device. The protocol was initially only supported by iOS but has since become widely supported across different browsers and devices. When Adobe Flash was phased out in 2020, HLS became the streaming protocol of choice.

How does HLS work?
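The chunked delivery described above is coordinated by a plain-text M3U8 playlist that lists each segment and its duration. Here is a minimal sketch of generating a video-on-demand playlist; the segment filenames and durations are hypothetical.

```python
# Sketch of the media playlist HLS serves to a player: a plain-text M3U8
# listing segment URIs and durations (segment names are hypothetical).
def media_playlist(durations, target=6):
    lines = ["#EXTM3U", "#EXT-X-VERSION:3",
             f"#EXT-X-TARGETDURATION:{target}",
             "#EXT-X-MEDIA-SEQUENCE:0"]
    for i, d in enumerate(durations):
        lines += [f"#EXTINF:{d:.3f},", f"segment{i}.ts"]
    lines.append("#EXT-X-ENDLIST")           # VoD playlist; omitted for live
    return "\n".join(lines)

print(media_playlist([6.0, 6.0, 4.5]))
```

For live streams the server simply keeps appending new segment entries and omits the `#EXT-X-ENDLIST` tag, so the player polls the playlist for fresh chunks.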

Industry spotlight

HLS is still going strong today, with new platforms adopting the protocol. In December 2025, Zeno Media (a streaming and podcasting company that connects diaspora communities with content from their home countries), expanded its offerings using the HLS protocol to better enable flexible streaming and on-demand delivery.

HTTP Dynamic Streaming (HDS)

Adobe was also working on its own HTTP-based protocol. HDS, also developed in 2009, is a type of adaptive bitrate streaming protocol.

How does HDS work?

HDS delivers MP4 content over HTTP for live or on-demand streaming. The protocol was developed for use with Adobe products like Flash Player and Air and is not supported by Apple devices. The adoption of the HDS protocol has been less widespread than HLS and, after Flash was discontinued and the number of people with HDS compatible devices declined, the protocol fell out of mainstream use.

Microsoft Smooth Streaming (MSS)

While the Apple-Adobe debacle raged, another protocol was entering the fray. In October of 2008, Microsoft announced that a new HTTP-based adaptive streaming extension would be added to its Internet Information Services (IIS) web server. MSS facilitated the targeted delivery of HD content by detecting bandwidth conditions, CPU utilization, and playback window resolution.

Microsoft put its new streaming technology to the test at a series of major sporting events including the 2008 Beijing Olympics, the 2009 Wimbledon Championships, and the 2010 Vancouver Winter Olympics, during which NBC Sports used the technology to deliver streams to 15.8 million users - a testament to the merits of HTTP streaming for large-scale live event streaming.

How does MSS work?

The 2010s - MPEG-DASH, SRT, HESP, and WebRTC

It was in the 2010s that live-streaming finally had its moment of glory. Video consumption on mobile devices was up and the appetite for live video started to grow exponentially.

In 2011, Twitch as we know it today was founded. In October 2012, a YouTube live stream of Felix Baumgartner’s skydive drew 8 million simultaneous viewers. In March 2015, Periscope (a dedicated video streaming app) launched, allowing users to share live videos directly from their smartphones. And by 2016, Facebook Live had rolled out to all users.

Alongside the growth of live streaming, this was the decade when on-demand streaming services grew to prominence. With ever more new kids on the block, it became clear that the battle between proprietary streaming protocols had to be resolved. In 2010, a group of major streaming and media companies, including Microsoft, Netflix, Google, and Adobe, got together to create an industry streaming standard. And by 2012, MPEG-DASH was born.

Dynamic Adaptive Streaming over HTTP (MPEG-DASH)

MPEG-DASH is a type of adaptive bitrate streaming that enables the streaming of media content delivered from HTTP web servers. It was designed by the Moving Picture Experts Group as an alternative to proprietary streaming protocols like HLS. The aim was to create an industry standard for adaptive streaming.

From a technical standpoint, the protocol works in a similar way to HLS (by segmenting content into small chunks and decoding the data for playback). The key difference is that MPEG-DASH is not proprietary, meaning it operates as an open-source standard. The protocol is also codec agnostic (unlike HLS which specifies the use of certain codecs).

How does MPEG-DASH work?

What is an MPD file?

MPD stands for Media Presentation Description. An MPD file contains the metadata used by the DASH Client to (a) create the HTTP-URLs needed to access Segments, and (b) deliver the streaming service to the end-user.
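To make that concrete, here is a minimal, hand-written MPD fragment and the client-side parse that recovers the available renditions. The resolutions and bitrates are hypothetical example values, and this is a much-reduced sketch of a real manifest.

```python
import xml.etree.ElementTree as ET

# Minimal hand-rolled MPD sketch: one Period with one video AdaptationSet
# carrying two Representations at different bitrates (values hypothetical).
MPD = """<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="static"
     mediaPresentationDuration="PT30S">
  <Period>
    <AdaptationSet mimeType="video/mp4">
      <Representation id="low" bandwidth="800000" width="640" height="360"/>
      <Representation id="high" bandwidth="3200000" width="1920" height="1080"/>
    </AdaptationSet>
  </Period>
</MPD>"""

ns = {"d": "urn:mpeg:dash:schema:mpd:2011"}
root = ET.fromstring(MPD)
reps = root.findall(".//d:Representation", ns)
# A DASH client reads these entries to build segment URLs per rendition.
print([(r.get("id"), int(r.get("bandwidth"))) for r in reps])
```

The codec-agnostic design shows up here: the manifest only declares MIME types and bitrates, leaving the choice of codec to the content packager.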

Web Real-Time Communications (WebRTC)

After Google acquired Global IP Solutions in 2010, the company developed WebRTC, an open-source project for real-time communication in the browser. The first implementation of WebRTC was built by Ericsson Labs in 2011.

The framework was designed to facilitate real-time peer-to-peer communication between browsers enabling voice, video, and data to be shared. However, even though WebRTC was designed as a one-to-one communication solution, many streaming platforms are now trying to achieve one-to-many broadcasts using the protocol.

The WebRTC framework was foundational in advancing real-time voice, text, and video communications. Prior to its creation, platforms like Skype and Google Hangouts relied on native applications and plugins.

WebRTC implemented an open-source, plugin-free framework for in-browser communication consisting of a JavaScript API and a suite of communication protocols. It enabled developers to build real-time communication into browser-based software, with media flowing peer-to-peer rather than through intermediary media servers.

How does WebRTC work?
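Before any media flows, the two peers exchange session descriptions (SDP) in an offer/answer handshake; WebRTC deliberately leaves the delivery of these messages to the application. The sketch below simulates that handoff in plain Python - the `Peer` class and the abbreviated SDP bodies are illustrative stand-ins, not the real API or valid SDP.

```python
# WebRTC leaves signaling to the application; this sketch simulates the
# SDP offer/answer exchange two peers perform before media can flow.
# The Peer class and SDP strings are illustrative placeholders.
class Peer:
    def __init__(self, name):
        self.name, self.remote_sdp = name, None
    def create_offer(self):
        return f"v=0 (offer from {self.name})"
    def create_answer(self):
        return f"v=0 (answer from {self.name})"
    def set_remote(self, sdp):
        self.remote_sdp = sdp

alice, bob = Peer("alice"), Peer("bob")
offer = alice.create_offer()   # 1. caller creates an offer
bob.set_remote(offer)          # 2. delivered via any signaling channel
answer = bob.create_answer()   # 3. callee answers
alice.set_remote(answer)       # 4. both sides now know each other's session
print(alice.remote_sdp, "/", bob.remote_sdp)
```

In a real deployment the signaling channel is typically a WebSocket or HTTP service, and the descriptions also carry ICE candidates so the peers can find a direct network path.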

And although adoption was slow to start (Microsoft didn’t begin supporting WebRTC until 2015 and Apple held off until 2017), it has since become one of the most widely used streaming protocols for high-quality online communications. 

Today the protocol is used by tech giants like Google, Microsoft, and Apple, and WebRTC technology is employed by many popular communication applications (WhatsApp, Discord, Google Chat, etc.). A testament to its success, the global WebRTC Solutions market is forecast to grow from $9.56 billion in 2025 to $94.09 billion by 2032.

Industry spotlight

In July 2025, Meta introduced a new WhatsApp Business Calling API, allowing businesses to integrate voice calls directly into WhatsApp conversations using voice-over-internet-protocol (VoIP) and WebRTC. WebRTC allows business applications to establish direct, real-time audio connections with Meta’s servers to handle the call.

Secure Reliable Transport (SRT)

SRT is an open-source video transport protocol designed to deliver low latency streams over public and private networks and to optimize streaming over unpredictable networks.

Instead of sending data packets over the public internet using Transmission Control Protocol (TCP), SRT uses a packet-based, “connectionless” protocol called User Datagram Protocol (UDP). UDP doesn’t need to establish a network connection before sending data, making it faster than TCP.

Transmission Control Protocol (TCP)

  • Connection-based protocol

  • Reliable but slower than UDP

  • Sequences data

  • Guarantees data delivery

  • Does not support broadcasting

  • Robust error-checking

  • Best suited for file transfers, web browsing, and email

User Datagram Protocol (UDP)

  • Connectionless protocol

  • Less reliable but faster than TCP

  • Does not sequence data

  • Does not guarantee data delivery

  • Supports broadcasting

  • Minimal error-checking

  • Best suited for live streaming, online gaming, and video communications
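The trade-off in the comparison above can be seen in a minimal loopback sketch: a UDP sender needs no handshake before firing a datagram, but delivery and ordering become the application’s problem (which is exactly the gap SRT fills on top of UDP).

```python
import socket

# Minimal loopback demonstration of UDP's connectionless model: a datagram
# is sent immediately with no handshake; nothing guarantees its delivery.
recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv.bind(("127.0.0.1", 0))                # let the OS pick a free port
recv.settimeout(2)
port = recv.getsockname()[1]

send = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send.sendto(b"frame-0001", ("127.0.0.1", port))  # no connection setup needed

data, _ = recv.recvfrom(2048)
print(data)  # b'frame-0001'
send.close(); recv.close()
```

A TCP equivalent would require `listen`/`accept`/`connect` round trips before the first byte moves, which is the latency cost UDP-based protocols avoid.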

The SRT protocol maintains video quality in spite of bandwidth fluctuation and packet loss and helps to keep streams secure by using 128/256-bit AES encryption to ensure that content is protected end-to-end from unauthorized parties.

The protocol was designed by Haivision in 2012 as an alternative to RTMP and was first demoed at the International Broadcasting Convention in 2013. In 2017, SRT was released on GitHub as an open-source protocol, and that same year Haivision partnered with streaming-engine vendor Wowza to create the SRT Alliance, a community of industry leaders working towards the continuous improvement of SRT.

By 2023, YouTube joined the SRT Alliance, marking an important milestone for the protocol and a catalyst for more widespread adoption. Today, SRT continues to be endorsed by major technology partners and broadcasters.  

How does SRT work?

Industry spotlight

In July 2025, software-based video streaming solution vendor, StreamVX, joined the SRT Alliance in a bid to help advance low-latency video streaming innovation. As part of the initiative, the platform has integrated SRT into its core software stack, using the technology to enable smooth real-time video streaming and interoperability with existing video systems.

High Efficiency Streaming Protocol (HESP)

HESP is an adaptive HTTP-based video streaming protocol developed by THEO Technologies (now Dolby OptiView). The protocol was first introduced at the 2019 National Association of Broadcasters (NAB) show. Achieving low latency, scalability, and best-quality viewer experiences had traditionally been challenging for streaming platforms (typically, achieving one came at the expense of another). But HESP offered a way for streaming platforms to achieve all three, even over standard HTTP infrastructure and content delivery networks (CDNs).

Instead of a segment-based approach, HESP uses a frame-based streaming approach. This involves using two different streams, an initialization stream containing key frames (used when a new stream is started) and a continuation stream (an encoded stream which continues playback after each initialization stream image). Using two streams makes it possible to start playback at the most recent position in the video stream. And because the initialization stream’s images are all key frames, viewers can instantaneously start streams. 
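The two-stream idea can be modelled in a few lines. This is an illustrative simulation of the join logic only - the frame numbering, stream contents, and function are hypothetical, not HESP’s wire format.

```python
# Illustrative model of HESP's two-stream idea: join on the newest keyframe
# from the initialization stream, then continue on the continuation stream.
# Frame numbering and stream contents are hypothetical.
init_stream = {0: "key0", 30: "key30", 60: "key60"}     # all keyframes
continuation = {t: f"delta{t}" for t in range(0, 90)}   # encoded stream

def start_playback(join_time: int):
    key_time = max(t for t in init_stream if t <= join_time)
    first = init_stream[key_time]                # instant start on a keyframe
    rest = [continuation[t] for t in range(key_time + 1, join_time + 1)]
    return first, len(rest)

frame, caught_up = start_playback(62)
print(frame, caught_up)  # key60 2: one keyframe, then 2 continuation frames
```

Because every image in the initialization stream is a keyframe, a joining viewer never waits for the next keyframe to arrive - hence the near-instant start-up and channel-change times HESP advertises.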

In 2020, THEO Technologies partnered with Synamedia to form the HESP Alliance, with the intention of accelerating the adoption and wider standardization of HESP. The alliance has since been disbanded following the acquisition of THEO Technologies by Dolby in 2024 but the HESP protocol remains in use.  

How does HESP work?

2020 and beyond - ASP, ART, MRV2, HESP.live, WHIP, MoQ

Since 2020, streaming has become firmly embedded as a mainstay of modern life. On-demand streaming services have gone from novelty to an entertainment necessity for most, live streaming for work and education has become the norm, and the video game streaming market is thriving. Streaming has well and truly made it to the mainstream and new use cases such as digital fitness, live commerce, and telehealth are taking the world by storm.

Live and on-demand fitness

Exercise in the home is a growing trend with live and on-demand fitness platforms like Les Mills, ClassPass, and Wellbeats making it easier to work out from home. 

The global virtual fitness market is projected to reach $106.4 billion by 2030, with the on-demand market (pre-recorded videos) segment accounting for the largest market share. 

Live commerce

Livestream e-commerce involves the promotion and selling of products and services in real time on digital platforms such as TikTok. 

Asia is a strong market in this sector. In China, live commerce is on track to surpass one trillion USD by the end of 2026. Major platforms include Taobao Live, Douyin, and Kuaishou.

Telehealth 

Technology is meeting healthcare head on and more people are choosing to access healthcare services remotely via virtual check-ups using their computers, tablets, or smartphones. 

The global telehealth market size was estimated at $123.26 billion in 2024 and is expected to reach $455.27 billion by the end of 2030.  

At present, web-based delivery dominates the market, with an overall revenue share of 48.5% in 2025.

Video game streaming 

Fueled by increased adoption of video streaming apps on mobile devices, game streaming platforms are going strong. 

Twitch remains the largest platform, logging billions of hours watched and more than 21 million active streamers globally in 2025.

Esports (multiplayer gaming played competitively for spectators by pro gamers) is growing especially rapidly. Korea’s League of Legends Championship (LCK) was the most watched Esports event of 2025, racking up over 161 million hours watched.  

In January 2021, Flash was officially discontinued after years of declining usage among companies like YouTube, Netflix, and Facebook. Since then, broadcasters and content creators have turned to newer streaming protocols to share their content. The development of new streaming protocols has continued since HESP in 2019 to meet the changing demands of today’s on-demand and live streaming platforms.

Ardent Streaming Protocol (ASP)

On 25 December 2022, the live streaming and online video hosting platform YoloLiv released ASP. ASP is a smart streaming protocol that sends stream data to YoloLiv’s own CDN (called YoloCast). ASP stores and transmits more redundant data than other streaming protocols like RTMP, thereby reducing loss of quality even over unstable internet connections. ASP is a proprietary technology compatible with specific YoloLiv hardware such as YoloBox Pro (YoloLiv’s all-in-one video switcher, encoder, streamer, monitor, recorder, and audio mixer).

According to YoloLiv, the protocol guarantees “complete and error-free content delivery”. This is possible because video content is stored in a special buffering area within YoloBox Pro for 60-90 seconds. YoloCast ASP then retransmits any media data that is out of order due to network issues so that network disruptions aren’t noticeable by audiences.

How does ASP work?
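The buffer-and-retransmit behaviour described above amounts to holding packets briefly and re-emitting them in sequence. The sketch below is a generic reorder buffer illustrating that idea - it is not YoloLiv’s implementation, and the packet format is hypothetical.

```python
import heapq

# Generic sketch of the buffer-and-reorder idea described above (not
# YoloLiv's actual implementation): packets may arrive out of order, and
# a holding buffer re-emits them in playback sequence.
def reorder(arrivals):
    heap, expected, out = [], 0, []
    for seq, payload in arrivals:
        heapq.heappush(heap, (seq, payload))
        while heap and heap[0][0] == expected:   # drain any in-order run
            out.append(heapq.heappop(heap)[1])
            expected += 1
    return out

# Packets 2 and 1 arrive swapped by the network; playback order is restored.
print(reorder([(0, "a"), (2, "c"), (1, "b"), (3, "d")]))  # ['a', 'b', 'c', 'd']
```

The cost of this approach is latency: the longer the holding buffer (ASP’s described 60-90 seconds), the more disruption it can absorb before the audience notices.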

ASP isn’t the only new protocol to have hit the scene recently. Adaptive Reliable Transport (ART) and MRV2 are two new streaming protocols currently in their infancy.

Adaptive Reliable Transport (ART)

In 2022 a new streaming protocol called ART was launched. The protocol was developed jointly by Teradek and Amimon and supports real-time and bi-directional streaming between two ART systems over LAN or WAN. The protocol also supports:

  • Point-to-point streaming between ART encoders and decoders

  • Multicast over LAN

  • Distribution from a single ART encoder to multiple ART decoders

One of the most innovative features of ART is that it optimizes for video quality and network characteristics simultaneously. According to Teradek, this enables “lifelike bi-directional streaming interaction between anchors and guests”. 

“ART is without a doubt the most impressive video transport protocol available today [and] no other protocol can deliver broadcast quality, real-time performance over WAN like it” comments Michael Gailing, General Manager of Live Production at Teradek. 

Media Relay Version 2 (MRV2) 

Digital audio technology and advertising company Triton Digital developed the MRV2 protocol as a proprietary streaming transport protocol for use between the station’s master control room and CDN. The objective is to make ad stitching (server-side ad insertion) more accurate and maintain high-quality listening experiences. In April 2023, Triton Digital announced that Wheatstone would be adding the MRV2 protocol to its streaming products and software.

High Efficiency Streaming Protocol Live (HESP.live)

In 2022, THEO Technologies (now Dolby OptiView) announced the release of HESP.live, the first HTTP-based real-time video streaming infrastructure capable of achieving sub-second latency at scale. It uses the HESP protocol to enable live interactivity and remains one of the fastest HTTP-based live video streaming solutions.

WebRTC-HTTP Ingestion Protocol (WHIP)

Developed in collaboration with Millicast (now part of Dolby Laboratories) and published in March 2025, WHIP is an open standard defined by the Internet Engineering Task Force (IETF). It simplifies how live media streams are ingested into streaming platforms using WebRTC. Traditionally, using WebRTC for broadcast ingestion required custom signaling logic. Using WHIP, it’s much simpler: an encoder posts a session description to a server and receives an answer, after which media flows over WebRTC to the server. This makes it a modern alternative to older ingestion protocols such as RTMP, enabling near-instant, TLS-protected live stream publishing with minimal setup.

Media over QUIC (MoQ)

Media over QUIC (MoQ) is a next-generation streaming protocol that leverages modern technologies including QUIC, WebTransport and HTTP/3 to overcome the challenges associated with traditional streaming protocols based on TCP.

The protocol was developed as an open standard by the Internet Engineering Task Force (IETF), with a working group formed in 2022 to standardize the protocol and address limitations in both scalable broadcast streaming and low-latency real-time communication.

What is WebTransport?

WebTransport is a communication protocol that provides real-time, bidirectional communication between web applications and servers. It’s this framework that enables client-server communication over QUIC.

The specification draws on contributions from engineers across the industry, and in 2025 Cloudflare launched the first global MoQ relay network on its existing edge infrastructure, marking an important milestone toward practical adoption. The protocol is still being developed as an open standard by the IETF, and a final standard is expected in 2026.

Using QUIC’s UDP-based multiplexing, low-latency connection setup, and congestion control, MoQ delivers audio, video, and timed metadata efficiently and securely across unpredictable network conditions. Unlike traditional HTTP-based streaming approaches (such as HLS and DASH) that rely on chunked delivery and often incur multi-second delays, MoQ can achieve sub-second latency.
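One reason MoQ can stay low-latency under congestion is that media objects are grouped (for example, one group per group-of-pictures), and a relay can bias delivery toward the newest group instead of backfilling stale data. The sketch below is a simplified model of that scheduling bias; the track/group layout and `schedule` function are illustrative, not the MoQ wire format.

```python
# Simplified model of a freshness-biased scheduler: media objects belong
# to groups, and under congestion a relay prefers the newest group rather
# than delaying everything behind old data. Layout is illustrative only.
backlog = [  # (group_id, object_id) pairs queued at a relay
    (7, 0), (7, 1), (8, 0), (9, 0), (9, 1), (9, 2),
]

def schedule(queue, budget):
    newest_first = sorted(queue, key=lambda go: (-go[0], go[1]))
    return newest_first[:budget]   # older groups are dropped, not delayed

print(schedule(backlog, 3))  # [(9, 0), (9, 1), (9, 2)]
```

Contrast this with TCP-based delivery, where every stale byte must be delivered in order before fresher media can reach the player.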

What is QUIC?

Formerly Quick UDP Internet Connection, QUIC is a modern transport-layer network protocol based on UDP. QUIC offers several benefits compared to traditional UDP and TCP networking connection protocols including reduced latency, error correction, encryption, and reduced head-of-line blocking.

How does QUIC work?

With global audiences expecting streaming platforms to work seamlessly and 24/7, MoQ marks a significant milestone for developers and streaming platforms by enabling reliable, scalable, ultra-low latency live and interactive media.

Industry spotlight

In September 2025, real-time video streaming platform nanocosmos unveiled an upgraded global streaming infrastructure based on MoQ. The new architecture is the first MoQ-based real-time streaming solution on a CDN foundation that is ready for worldwide deployment. MoQ enables nanocosmos to provide sub-second latency at global scale, seamless playback, and enhanced reliability.

What’s next for streaming? 

Streaming protocols have been changing the way we consume video content for three decades now. With the rise of streaming services, online video consumption has surpassed traditional broadcast television. But the evolution of streaming protocols has never slowed down.

As demand for seamless content delivery mounts, and streaming platforms diversify into new markets, the streaming industry (and the technology that drives it) has had to prioritize continuous innovation. The industry has continued to strive towards higher levels of interoperability, improved video delivery, and the ability to serve a landscape increasingly dominated by ubiquitous streaming across interconnected devices. 

Latency reduction continues to be one of the most significant innovation drivers, with many protocols having developed low latency iterations (Low-Latency HLS and Low-Latency DASH, for example). MoQ is currently the leading player in this arena, redefining the industry standard for ultra-low-latency streaming at scale.

The influence of AI is also at play. AI-powered algorithms enhance streaming protocols by optimizing how content is delivered over varying network conditions - for example, by predicting future bandwidth availability and network congestion ahead of time, or by analyzing real-time network loads across geographies to route data intelligently.
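One concrete (and deliberately simple) flavor of bandwidth prediction is an exponentially weighted moving average over recent throughput samples - a stand-in here for the proprietary models platforms actually deploy; the sample values and smoothing factor are hypothetical.

```python
# Simple stand-in for bandwidth prediction: an exponentially weighted
# moving average over recent throughput samples. Real platforms use far
# richer models; values and smoothing factor here are hypothetical.
def ewma_forecast(samples_kbps, alpha=0.3):
    estimate = samples_kbps[0]
    for s in samples_kbps[1:]:
        estimate = alpha * s + (1 - alpha) * estimate  # weight recent samples
    return estimate

recent = [4000, 4200, 2500, 2600, 2400]   # throughput dips mid-session
print(round(ewma_forecast(recent)))       # estimate trails the dip smoothly
```

Feeding such a forecast into the rendition-selection logic shown earlier in this paper lets a player downshift before a stall occurs rather than after.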

At the time of writing, trade policy and tariffs are also exerting their influence. Tariffs introduced during the Trump administration have targeted networking equipment and data center hardware. Because streaming protocols are software standards tightly coupled to hardware, tariffs have a significant indirect impact, creating an environment in which the adoption of cutting-edge codecs, protocols, and low-latency standards is impaired. At the same time, internet fragmentation and ‘tech decoupling’ are pushing protocol design towards more modular architectures, and data localization laws are starting to dictate protocol design, calling for a move towards stateless designs.

The full impact of these changes is not yet known, but one thing’s for sure: streaming protocols have come a long way since the 1990s. As technology continues to improve, we can expect to see the development of new streaming protocols and even faster streaming experiences.