Connectivity in the Cloud
While many aspects of cloud-based live production resemble their counterparts in traditional broadcast production, one difference is unavoidable: you need to connect to those elements over a network, namely the internet. The user interfaces you use to produce a show may look much the same in the cloud as they do in a hardware-based workflow, but the signal flow is different. And broadcast engineers understandably have some questions about that.
Whether it’s how to get video from sources to production systems, what to expect in terms of delay, or how to manage synchronization, these are big questions. Fortunately, they are questions the industry has largely answered, through advances in using IP networks for real-time media delivery and advances in cloud technology, making cloud-based live production practical.
Choosing the Right Streaming Protocol
With the move to a cloud-based production platform, the big shift that engineers and other technical staff need to make is all about connectivity. If you can’t run a cable from a camera to the truck anymore, how do you get your video source into the cloud? Today, you have a variety of streaming protocols at your disposal, and you need to select the one that best addresses your network conditions and other technical and performance requirements. The main options are described below, followed by a short selection sketch.
- Secure Reliable Transport (SRT), for example, is an open-source video streaming transport protocol that delivers secure low-latency streaming performance over noisy or unpredictable (lossy) networks, such as the public internet, and enables easy firewall traversal. In other words, SRT can deliver high-quality video over the most problematic networks. Imagine using your mobile phone to send video from the middle of a field somewhere. You might choose SRT for that.
- Reliable Internet Stream Transport (RIST) is another open-source, open-specification transport protocol designed for reliable transmission over lossy networks; it recovers lost packets through retransmission to deliver a low-latency, high-quality video stream over the public internet.
- Real-time Transport Protocol (RTP), by contrast, is designed for end-to-end, real-time transfer of streaming media and is widely regarded as the primary standard for audio/video transport in IP networks. Because it includes no built-in packet recovery, it performs best on consistent, well-managed networks. When you know you can count on exceptional and consistent network performance, as you might in a controlled facility setting, you can go with RTP.
- The Zixi protocol adjusts dynamically to fluctuating network conditions, applying error correction as it goes. As a result, Zixi is often the preferred choice for error-free video streaming over IP, especially over long distances or on less-than-ideal networks such as public Wi-Fi.
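To make those trade-offs concrete, here is a minimal, illustrative sketch of how the selection logic described above might be captured in code. The thresholds and categories are assumptions chosen for illustration only, not recommendations from any vendor or specification.

```python
from dataclasses import dataclass

@dataclass
class NetworkProfile:
    """Rough description of the link between a source and the cloud platform."""
    packet_loss_pct: float     # observed or expected packet loss
    managed_network: bool      # True for a controlled facility LAN/WAN
    long_distance: bool        # True for high-RTT or intercontinental paths
    firewall_restricted: bool  # True if firewall traversal is a concern

def suggest_protocol(net: NetworkProfile) -> str:
    """Return a candidate transport protocol for the given network profile.

    The decision points mirror the descriptions above; real deployments
    should be validated against actual network measurements.
    """
    if net.managed_network and net.packet_loss_pct < 0.01:
        return "RTP"          # consistent, high-end network: minimal overhead
    if net.long_distance and net.packet_loss_pct > 1.0:
        return "Zixi"         # dynamic error correction over long, lossy paths
    if net.firewall_restricted:
        return "SRT"          # easy firewall traversal plus loss recovery
    return "RIST or SRT"      # reliable delivery over the public internet

# Example: a mobile contribution feed from the middle of a field somewhere
print(suggest_protocol(NetworkProfile(2.5, False, False, True)))  # -> "SRT"
```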
Many video capture devices already support these protocols and formats natively. You can find streaming-capable PTZ cameras and high-end broadcast cameras ready to go; all you need to do is adjust a few configuration settings to connect them to the network and to the cloud production environment. Beyond cameras with built-in streaming functionality, you’ll also find plenty of proven (and continually improving) software and hardware encoders you can use to stream video from legacy cameras.
Rather than run your video to the truck, you run it to a box that sends it over the internet to a cloud production platform. Depending on your budget, technical requirements, and the number of inputs you need, you can find a competitively priced, professional-grade streaming encoder, with options for handling synchronization and helping to minimize delay.
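As a sketch of what that box does, the snippet below uses Python to invoke FFmpeg and push a source to a cloud ingest point over SRT. The host name, port, and stream ID are hypothetical placeholders, and the example assumes an FFmpeg build with SRT (libsrt) support; a hardware encoder would expose the same kinds of parameters through its own configuration interface.

```python
import subprocess

# Hypothetical cloud ingest endpoint -- replace with the values your
# production platform provides.
INGEST_HOST = "ingest.example.com"
INGEST_PORT = 9000
STREAM_ID = "camera-1"

# Build an FFmpeg command that encodes a test source and pushes it over SRT
# in caller mode. Assumes FFmpeg is on PATH and built with libsrt support.
cmd = [
    "ffmpeg",
    "-re",                                                   # read at native frame rate
    "-f", "lavfi", "-i", "testsrc2=size=1280x720:rate=30",   # stand-in for a real camera
    "-c:v", "libx264", "-preset", "veryfast", "-tune", "zerolatency",
    "-f", "mpegts",
    f"srt://{INGEST_HOST}:{INGEST_PORT}?mode=caller&streamid={STREAM_ID}",
]

subprocess.run(cmd, check=True)
```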
Managing Delay and Synchronization
One good thing to remember about live streaming is that while there are things you can do to optimize the path and reduce the total delay, there will always be some inherent delay. So, if your production use case absolutely requires near-real-time output, on the order of the two frames of delay you get from a hard-wired setup, then cloud-based live production platforms likely aren’t for you. (If you’re in a sports venue trying to feed a large video wall for fans as they watch the live game, it’s not ideal!)
But for so many other streaming applications (and the pandemic really proved this to be true), online audiences have shown that easy access to high-quality content takes precedence over a small delay. One way to keep that delay small is to use the best network available to you. If you can push content from a studio that has ultra-high-speed internet with plenty of bandwidth and direct Ethernet connections to all your cameras, and you use RTP or a similar protocol, you’ll be able to optimize your connectivity. If, on the other hand, you have to rely on your mobile phone and a protocol such as SRT, you’ll probably be OK; you’re still getting the benefit of a protocol designed to make the most of the network you’ve got.
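To put the two-frames benchmark in perspective, the short calculation below compares a hard-wired path with an illustrative streaming path. All of the streaming figures are assumptions chosen only to show the order of magnitude; real numbers depend on your encoder, protocol settings, and network.

```python
FRAME_RATE = 60.0  # frames per second (59.94 behaves almost identically)
frame_ms = 1000.0 / FRAME_RATE

# Hard-wired (SDI) reference: roughly two frames of processing delay.
sdi_delay_ms = 2 * frame_ms
print(f"Hard-wired delay: ~{sdi_delay_ms:.0f} ms")  # ~33 ms

# Illustrative streaming budget (values are assumptions, not measurements).
streaming_budget_ms = {
    "capture + encode": 50,
    "network transit": 40,
    "protocol buffer (e.g., SRT latency window)": 120,
    "decode + processing in the cloud": 50,
}
total_ms = sum(streaming_budget_ms.values())
print(f"Streaming path: ~{total_ms} ms "
      f"(~{total_ms / frame_ms:.0f} frames at {FRAME_RATE:.0f} fps)")
```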
When it comes to synchronizing your video feeds, your best and most reliable option is upstream synchronization at the encoder. The Matrox Monarch Edge is a great example of a streaming encoder that takes multiple video inputs (up to four) into one box and then creates a time code so that everything is streamed out in sync. As a result, all four streams enter the production environment synchronized. (Problem solved!) An alternative approach is to rely on the production platform itself to detect the delay on the incoming streams (based on the network connection) and then synchronize all streams so they are essentially aligned with the slowest (or latest) incoming stream.
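The platform-side approach can be pictured as a simple alignment step: measure each stream’s delay, find the slowest one, and buffer the others until they match. The sketch below is a simplified illustration of that idea using made-up delay figures, not a description of how any specific platform implements it.

```python
# Measured (or estimated) network delay per incoming stream, in milliseconds.
# These figures are made up for illustration.
stream_delays_ms = {
    "camera-1": 180,
    "camera-2": 240,   # slowest stream sets the alignment point
    "camera-3": 205,
    "camera-4": 190,
}

# Align everything to the slowest (latest-arriving) stream by adding
# just enough buffer to each of the faster streams.
alignment_point = max(stream_delays_ms.values())
extra_buffer_ms = {
    name: alignment_point - delay for name, delay in stream_delays_ms.items()
}

for name, buffer_ms in extra_buffer_ms.items():
    print(f"{name}: add {buffer_ms} ms of buffer")
# camera-2 needs 0 ms; the others are delayed so all four line up at 240 ms.
```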
Ultimately, the approach you choose should match your performance requirements. You may not need a streaming encoder that can synchronize your streams, but you should nevertheless try to use the same type of devices and aim for a “same pipe, same network” approach to sending video to your cloud-based production platform.
Why You Should Try the Cloud
If conventional broadcast workflows have served you well until now, why change? Why concern yourself with streaming protocols, optimizing your network, and so on?
In short, cloud-based live production gives you the opportunity to do more with less. You don’t need to show up with tons of heavy cameras and hundreds of feet of SDI cabling to run everything back to a truck. You have more flexibility to use new devices, such as mobile phones and smaller streaming cameras, and you can put existing camera equipment to work driving remote and streaming productions.
Once you wrap your head around streaming your cameras into a cloud environment, you can think about live production without sending a huge mobile truck somewhere. You don’t need to ship fly packs all over the country or to different continents. You just send cameras and people to operate them — and with some cameras, you can control them remotely or let them automatically track the action. When you add it all up, you can do a lot more with a lot less, and spend a lot less to do a lot more.