Exposing the Inner Workings of Streaming Video

What many people outside of the industry don’t understand is how complicated it is to deliver streaming video. Unlike broadcast television, which is built on standardized, interoperable technology, much of the streaming video stack has to be connected together in an ad-hoc fashion through APIs or custom integrations. And there are numerous pieces of technology in the streaming stack that aren’t even present in QAM-based workflows: watermarking, content delivery networks, caches.

The Complexity Is More Than Just Technology

All of those technology pieces within the streaming stack represent something else as well: dozens of discrete sources of data, all of which are needed to understand the cause of quality-related issues. This is another key distinction between streaming and broadcast. Because broadcast technology is interoperable, operators can trace issues from the source all the way to the set-top box through a single system that manages all of that data. Not so in streaming video. The lack of standardization and interoperability between streaming components (something the Alliance and other groups are working to address) means that each component may have its own system for monitoring its data. Depending on the number of components and technologies involved, that could mean dozens of individual data sources and dashboards, none of them connected. This makes it difficult to ensure a high QoE, especially when you don’t have a clear picture of all the players involved and the data they provide.

Building Your Streaming Video Datatecture

Datazoom (an Alliance grant member) has recently launched the Streaming Video Datatecture. This market landscape of technologies and providers, spread across dozens of categories in Operations, Infrastructure, and Workflow, catalogs the various components of a streaming video technology stack. Of course, not every category will be employed by every streaming platform, but together they largely cover the spectrum of possible components.

But this project and diagram aren’t just an attempt at another Lumascape, and they aren’t really about cataloging the market players. The focus is on the role each component plays within the larger data picture. That allows the streaming operator to understand what part a technology plays in ensuring a broadcast-like experience, in monitoring the overall impact of any component on latency and QoE, and in root-cause analysis.

And the ultimate power of this creation isn’t in what’s presented on the Streaming Video Datatecture website. It’s that this is a model. Streaming operators can use it as a map to create their own datatecture, their own model of the data components within their streaming technology stacks. When combined with the flexibility of a monitoring harness, a programmatic approach to monitoring that can accept any component with API hooks, changes to the technology stack, such as swapping one component out for another, don’t have to interrupt a consistent and reliable service (see the sketch below).
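To make the idea of a monitoring harness more concrete, here is a minimal sketch in Python of what such a layer might look like, assuming each component’s API hook can be wrapped in a small adapter that emits a common metric record. The component names, adapters, and metric fields below are purely illustrative; they are not part of the Datatecture or any particular vendor’s API.

```python
"""Minimal sketch of a monitoring harness: a thin layer that pulls metrics
from any stack component exposing an API hook and normalizes them into one
record format. All names and values here are hypothetical."""

from __future__ import annotations

import time
from dataclasses import dataclass
from typing import Protocol


@dataclass
class MetricSample:
    """One normalized data point, regardless of which component produced it."""
    source: str       # e.g. "cdn", "player", "encoder"
    metric: str       # e.g. "cache_hit_ratio", "rebuffer_ratio"
    value: float
    timestamp: float


class Collector(Protocol):
    """Any component with API hooks can join the harness by implementing collect()."""
    def collect(self) -> list[MetricSample]: ...


class ExampleCdnCollector:
    """Hypothetical adapter around a CDN's reporting API."""
    def collect(self) -> list[MetricSample]:
        # In practice this would call the vendor's API; here we stub a value.
        return [MetricSample("cdn", "cache_hit_ratio", 0.93, time.time())]


class ExamplePlayerCollector:
    """Hypothetical adapter around client-side player analytics."""
    def collect(self) -> list[MetricSample]:
        return [MetricSample("player", "rebuffer_ratio", 0.012, time.time())]


class MonitoringHarness:
    """Aggregates samples from every registered collector into one stream."""
    def __init__(self) -> None:
        self._collectors: list[Collector] = []

    def register(self, collector: Collector) -> None:
        self._collectors.append(collector)

    def poll(self) -> list[MetricSample]:
        samples: list[MetricSample] = []
        for collector in self._collectors:
            samples.extend(collector.collect())
        return samples


if __name__ == "__main__":
    harness = MonitoringHarness()
    harness.register(ExampleCdnCollector())
    harness.register(ExamplePlayerCollector())
    for sample in harness.poll():
        print(f"{sample.source}.{sample.metric} = {sample.value}")
```

The point of the pattern is the seam: dashboards and alerting read the normalized records, so replacing a CDN or player vendor only means writing a new adapter rather than re-plumbing the entire monitoring setup.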

Transparency Is Critical to Success

Streaming operators need the kind of transparency that standardized, interoperable technologies provide. They need a technology stack like broadcast has: known and definable. Unfortunately, we are not there yet in the streaming industry (and it’s questionable whether we will ever get to a point of standardization, which is why the Alliance focuses on best practices, guidelines, and specifications). So we need ways to make the inner workings of streaming video more transparent to facilitate a more broadcast-like, or better-than-broadcast, experience. That’s why the Alliance applauds this effort and Datazoom for giving it to the industry. We need more free, open resources like this (and those that the Alliance provides) if we want to ensure the streaming services of the future are better than the television broadcasts of today.


Jason is the Executive Director of the Streaming Video Technology Alliance, the international technical association for streaming video, which brings companies from across the streaming ecosystem together to collaborate on technical solutions for delivering high-quality video at scale. In this role, he runs day-to-day operations, finances, member recruitment, and strategy, and evangelizes the organization at events around the world. He is also the co-founder of a big data startup, datazoom.io. Jason is a contributing editor at Streaming Media Magazine and has written several books.