Live video technology innovation leader sees more than a 200% increase in SaaS usage, as well as accelerated shifts in IP network delivery and remote production ...
By Paul Shen, TVU Networks
September 3rd, 2020
There’s a common misperception about producing live remote television in the cloud: latency makes multi-camera synchronization impossible.
While latency can be a serious problem, it's too broad a claim to say that the cloud creates delays that make switching between camera sources, and everything else needed for live remote TV production, too imprecise to be practical.
I’m here to tell you that not only is zero delay remote production possible, but also that it is the key to unlocking the true potential of at-home or REMI (Remote Integration Model) production. That in turn makes it possible for broadcasters and other video producers to create more content more affordably and thus compete and win as viewers evolve their video consumption habits.
Defeating Delay With Delay
It might seem like a paradox, but the way to defeat delay when producing live television in the cloud is with delay.
To be sure, transporting video IP packets across a LAN and ultimately via the internet to the cloud can introduce jitter. Network congestion, or its equivalent at router interfaces, is a likely culprit. The fact that multiple camera sources are in play for most live video productions further compounds the issue.
Not only can each live source encounter its own network jitter, but live cameras can also drift in time relative to one another, making normal switching between shots difficult or even impossible in the cloud.
But there is a rather straightforward way to defeat this delay: introduce a sufficiently large delay between, for instance, the action shot on the field of play and the moment that shot is played out for distribution, so that all of the network jitter and other delays are absorbed.
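The idea can be sketched in a few lines of code. Below is a minimal, illustrative playout buffer (not TVU's actual implementation): frames arrive with jitter and possibly out of order, each carrying the capture timestamp assigned at the source, and playout is deferred by a fixed delay chosen to exceed the worst-case jitter, so frames are released in capture order at steady timing.

```python
import heapq

class JitterBuffer:
    """Illustrative fixed-delay playout buffer.

    Frames may arrive late or out of order; a min-heap keyed on the
    capture timestamp reorders them, and playout is deferred by a fixed
    delay large enough to absorb worst-case network jitter.
    """

    def __init__(self, playout_delay_ms):
        self.playout_delay = playout_delay_ms
        self.heap = []  # min-heap of (capture_ts_ms, frame)

    def receive(self, capture_ts_ms, frame):
        # Arrival order doesn't matter; the heap keeps capture order.
        heapq.heappush(self.heap, (capture_ts_ms, frame))

    def frames_due(self, now_ms):
        # Release every frame captured at least `playout_delay` ago;
        # anything newer keeps buffering to absorb remaining jitter.
        due = []
        while self.heap and self.heap[0][0] <= now_ms - self.playout_delay:
            due.append(heapq.heappop(self.heap)[1])
        return due

# With a 100 ms playout delay, a frame captured at t=0 is held until
# t=100 even if it arrived immediately, giving late frames time to land.
buf = JitterBuffer(100)
buf.receive(40, "frame-B")   # arrives first despite later capture time
buf.receive(0, "frame-A")
early = buf.frames_due(90)    # nothing due yet
first = buf.frames_due(120)   # only frame-A has aged past the delay
```

The delay is a tunable trade-off: a bigger buffer tolerates worse networks but pushes the playout further behind live action.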
Achieving Multi-Camera Synchronization
If you’ve been involved with video production long enough, you may remember a device that was once critical to A-B roll editing called a time base corrector (TBC). I’ll spare you all of the details, but at a high level, a TBC took in a video signal from a videotape player (one TBC for each player). Those tape machines could not be relied upon to play out video with precise timing.
The TBC served as a digital buffer for incoming frames of video. Its precision internal clock enabled a buffered video frame to be read out line by line with perfect timing so that video from the A source and the B source tape players could be in perfect sync and switched between by a production switcher without creating anomalies like rolling bands in the video.
Fast forward to today. Different technology, but the same strategy. To defeat the time anomalies that result from jitter, drifting cameras and other sources, it’s necessary to create a time buffer of sufficient duration to accommodate these problems. In other words, intentionally create a delay.
Further, by assigning a time stamp to each frame of video and its associated audio, it’s possible to align the time stamps from multiple cameras inside this intentionally created delay buffer, thereby enabling in the cloud all of the video and audio production processes a show requires: switching, slow-motion replay, rolling in pre-recorded clips and more.
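To make the alignment step concrete, here is a hypothetical helper (names and the tolerance parameter are assumptions for illustration; production systems typically align sources against a shared reference clock such as PTP). For each camera's delay buffer, it selects the frame whose capture timestamp is closest to the program clock tick, so every camera presents the same moment and the switcher can cut cleanly between them.

```python
def aligned_frames(camera_buffers, program_ts_ms, tolerance_ms=20):
    """Pick, per camera, the buffered frame nearest the program clock tick.

    camera_buffers maps a camera name to its buffered (capture_ts_ms, frame)
    pairs. A camera whose nearest frame is further than `tolerance_ms` from
    the tick is omitted rather than shown out of sync.
    """
    selected = {}
    for cam, frames in camera_buffers.items():
        ts, frame = min(frames, key=lambda f: abs(f[0] - program_ts_ms))
        if abs(ts - program_ts_ms) <= tolerance_ms:
            selected[cam] = frame
    return selected

# Two cameras at ~30 fps whose capture clocks have drifted a few ms apart:
buffers = {
    "cam1": [(0, "a0"), (33, "a1"), (66, "a2")],
    "cam2": [(5, "b0"), (38, "b1"), (71, "b2")],
}
# At program tick 33 ms, cam1's a1 and cam2's b1 depict the same moment.
synced = aligned_frames(buffers, 33)
```

Because the delay buffer holds several frames per camera, there is always a candidate near the tick even when sources drift relative to one another.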
Viewers at home are none the wiser that a delay was intentionally introduced to accommodate the realities of producing video in the cloud. And compared with all of the other delays introduced by the traditional production and distribution chain on the way to the home, this cloud-production delay buffer is inconsequential.
The Opportunities
What cloud-based live video production offers is the ability to take at-home (REMI) production to a new level. Rather than transporting remote camera feeds as IP packets from a venue like a sports stadium over the internet to a centralized production center (hence the term “at-home”), the cloud-based approach enables the directors and technical directors, graphic artists, slow-motion operators, audio engineers and everyone else at that centralized production facility to be at home, literally.
The benefits for broadcasters and other video producers are obvious: lower-cost production, minimal travel to remote sites, more remote events covered, more content produced and, perhaps most important of all, the ability to enlist the best production talent, wherever they are, to produce a show.
All of this is made possible by moving live event production into the cloud, and that in turn is made possible by applying this strategy to make zero-delay remote production a reality.