A Look Back at Moving Forward: How We Kept Canadians Informed & Managed the 2020 Lockdown
This article originally appeared in the October 2020 issue of Professional Sound.
By Francois Turenne, Roberto Capretta, Ron Searles, TJ Heideman, Dinu Cebzan, Andrew Roberts & Chris Jackson
Since the beginning of the COVID-19 pandemic lockdown, the Canadian Broadcasting Corporation has kept Canadians informed while adapting to new and innovative technological workflows.
CBC’s Broadcasting Centre in downtown Toronto is a hub for the public broadcaster’s English Service news coverage. It is home to our 24-hour News Network channel and the centralized control rooms that support key English local television news broadcasts across the country. In normal times, we have a sizeable open-concept newsroom where dozens of writers, producers, journalists, and support staff assemble the stories of the day, while we broadcast live using two open-concept studio floors, serving our news channel and the main network.
Starting in early March, our flagship program, The National, asked the operation to rethink the way we produce daily news, and to focus on enabling as many newsroom staff as possible to work safely from home. When COVID-19 hit, we had to fast-track a communication system that would allow an undetermined number of people to accomplish tasks from home which they normally would do from their desks, or from any of the eight TV news control rooms broadcasting from regions across Canada.
Our news operations rely on six Toronto-based RTS ADAM intercom matrixes, totaling 2,344 ports, which are linked through RTS’s Trunk Master Infrastructure with RVON (RTS Voice Over Network) to additional RTS matrixes at CBC facilities across Canada.
Our staff are well versed in communicating over RTS’s KP key-panels in the control rooms or at their desks. From there, they can interact with on-air talent and anyone else who is part of our giant RTS network. When the work-at-home model became an ongoing necessity, we needed to find a quick solution utilizing our existing intercom infrastructure.
Over the last few years, we’ve had success integrating Unity Intercom – an app-based intercom system that uses the regular internet for transport – into our larger RTS intercom, allowing individuals in remote locations to communicate with the production crew. Unity is an attractive option because it can run on smartphones and laptops that are already in use by staff. The installation and set-up is easily done in the field with only a few instructions. Best of all, it sounds great!
Our typical set-ups with Unity had been relatively small-scale, with a maximum of six intercom channels and rarely more than a dozen users. The question was: could Unity be scaled up to handle a radical shift by moving hundreds of news staff to their home offices? CBC Studios and our partners in Engineering were able to upgrade a license for Unity to provide us with unlimited user accounts and an accessible work-from-home solution.
Currently, our Unity system includes a 64-channel Dante interface device, which is able to integrate through a secure network with any of our RTS OMNEO/Dante cards. We can build up to 64 discrete communication paths with our existing intercom, plus users can privately call anyone logged in on the Unity server. An early challenge was to create a way for Unity users to effectively talk to hosts on IFB.
Normally this is a key pressed on an intercom panel that temporarily interrupts the host’s program sound foldback so that they may hear instructions from the control room. The solution was to dedicate specific channel paths for IFBs and, within the RTS system, program a logic statement that tells the intercom: “When you hear sound on channel X, turn on IFB Y.” Our housebound producers can now have two-way conversations with on-air hosts in real time.
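The logic statement described above is essentially an audio-follow (VOX-style) gate: while audio is detected on a given channel, the matching IFB key is held on. A minimal sketch of that behaviour in Python – the threshold, hold time, and block-based processing are illustrative assumptions, not the actual RTS logic programming:

```python
import math

def level_db(samples):
    """RMS level of one audio block, in dBFS (floor of -120 dB for silence)."""
    if not samples:
        return -120.0
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms) if rms > 0 else -120.0

class AudioFollowIFB:
    """While channel X carries audio above a threshold, hold IFB Y keyed on.

    A hold counter keeps the key open across short pauses so the host's
    program foldback is not chopped mid-sentence. Values are illustrative.
    """
    def __init__(self, threshold_db=-40.0, hold_blocks=10):
        self.threshold_db = threshold_db
        self.hold_blocks = hold_blocks
        self._hold = 0

    def process_block(self, samples):
        if level_db(samples) > self.threshold_db:
            self._hold = self.hold_blocks   # audio present: re-arm the hold
        elif self._hold > 0:
            self._hold -= 1                 # silence: let the hold decay
        return self._hold > 0               # True => interrupt host foldback
```

When the producer stops talking, the hold counter releases the key a beat later rather than instantly, which is the behaviour talent expects from a hardware IFB key.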
With over six months of use and over 150 active users, we have crafted a flexible and cost-effective solution for this very sudden shift in operations. We are expanding our use of this type of remote intercom system on an ongoing basis, and now that so many users have grown to depend on it, it will remain a key part of our communications far beyond the current pandemic.
The COVID-19 lockdown had an enormous impact on content production everywhere. In response, CBC launched a new show very early in the lockdown called What’re You At?, hosted by Tom Power from CBC Radio’s q. The show, which aired on the main network and online on CBC Gem with interviews edited later for CBC Radio, talked to a variety of celebrities, musicians, and first responders about their lives and how they were dealing with the pandemic. With the use of our Unity technology, our host had the ability to interview guests using various social platforms such as Zoom, Skype, and FaceTime, right from his living room.
The control room utilized one CPU per guest, to a max capacity of five systems. A resources producer would call each guest and walk them through basic and advanced settings for their video and audio. This required knowledge of all the possible devices, OSs, and versions so we could guide everyone through the settings we required. Every effort was made to shut off most of the compression technology and disable anything “auto” in the applications. For musicians, we disabled everything in the advanced settings, since normal app settings are generally quite poor for broadcast.
Back in the control room, we used a CEDAR DNS8 Live for noise reduction, which gave us hands-on control to help clean up unwanted sounds. Getting the guests to use an external microphone, if available, always sounded better. When possible, we sent some guests a microphone in advance. Of course, the audio team always strives for the best possible sound quality, while directors often want a wider shot with guests distanced from their screens – not ideal for sound quality when using a laptop’s built-in microphone.
RTS RVON was installed in Tom’s home and connected via a port into the CBC network with a Fortinet Gateway and a VPN tunnel to the Toronto Broadcasting Centre network. A small mixer, two microphones, and a return video monitor for Tom were installed along with robotic cameras that we could frame and shade remotely. Our video specialist was provided an RVON KP intercom panel to communicate with the control room from their home, where they also controlled the cameras. Through this VPN gateway, they could see full-resolution video and hear the audio. Tom monitored audio mainly through RVON, with LiveU as a backup and his cell phone as a third backup through a telephone hybrid.
LiveU transmission introduces approximately two seconds of latency, which creates issues when communicating with the guests. This is why an IP-based solution such as RVON was used for Tom to monitor and also to return his mic with almost zero latency to our control room, which we could then bus to the guests. This meant Tom and his guests communicated to each other with the smallest possible delay.
Our Calrec Apollo allows four different direct outputs on every channel, each with its own delay, making it very flexible – especially as we were also isolating feeds for CBC Radio, which did not reference any picture. The mics were each delayed to sync with the picture, but at a point further down the I/O path so that Tom and his guests did not hear those delays; furthermore, sync would drift on many of these apps. The audio engineer would find the ideal sync early and then leave it untouched during the interview, so our colleagues in post would not have to blindly chase whatever sync alterations had been made.
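The trick here is that one mic feeds two direct outputs with independent delays: the program path is delayed to match picture, while the foldback path the talent hears stays undelayed. A simple Python model of that routing – the sample rate is real, but the video latency figure is illustrative (in practice it is measured per app):

```python
class DelayLine:
    """Fixed integer-sample delay, modeling one direct-output delay."""
    def __init__(self, delay_samples):
        self._buf = [0.0] * delay_samples
        self._idx = 0

    def process(self, x):
        if not self._buf:
            return x                      # zero delay: pass through
        y = self._buf[self._idx]          # oldest sample comes out
        self._buf[self._idx] = x          # newest sample goes in
        self._idx = (self._idx + 1) % len(self._buf)
        return y

SAMPLE_RATE = 48000
VIDEO_LATENCY_MS = 80   # illustrative; measured against picture in practice

program_delay = DelayLine(SAMPLE_RATE * VIDEO_LATENCY_MS // 1000)

def route(mic_sample):
    """One mic feeding two direct outputs with independent delays."""
    return {
        "program": program_delay.process(mic_sample),  # delayed to match picture
        "foldback": mic_sample,                        # undelayed path the talent hears
    }
```

Because the delay is inserted further down the I/O path than the foldback tap, lip sync on air and near-zero-latency conversation for the host can coexist on the same channel.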
Musical performances were recorded in advance, but the technical quality varied even during a performance, so we decided early on to ask the artist to record themselves and send us the file, which the audio post team would sync up later. The goal was always to achieve the best possible live performance and the guest artists did a great job recording themselves. For those reasons, the show was sent to post-production for a final edit, as it could not be aired live.
A note: The Juno Awards, postponed in March due to COVID-19, were later produced virtually in a similar fashion, with the same challenges present.
CBC’s post audio engineers have been continuing their work through the pandemic, working in suites that are continuously cleaned and sanitized between shifts. Audio engineers Don Dickson and Chris Welsh have been tag-teaming in-studio mixing of CBC’s communications work (on-air program highlights and promos) while their colleagues Rick Starks, Jody Ellis, and Ron Searles have been working remotely from their home studios, mixing for The National, What’re You At?, CBC Arts, and CBC Kids. Much of the in-the-field work they receive is shot and recorded on iPhones or other simple setups. iZotope RX 7 has been important for some audio cleanup, but it’s been a surprisingly smooth workflow.
Files are exchanged through secure Google Drive links from CBC’s work-from-home video editors, and approvals are also done by work-at-home producers with access to the same drives.
Well before COVID-19 affected the way we work, CBC Studios had been on a steady path towards fully remote production facilitation. Our history of remote productions dates back to the 2008 Summer Olympic Games in Beijing. For those games, all our technical operations were centralized in Toronto for the English broadcast, providing us an incredibly cost-efficient way to produce the games from Canada. Since that time, CBC has continued to develop workflows and invest in technologies that allow us to produce content, even when we can’t all be together in a single location.
CBC Studios and CBC Sports needed to deploy new tools that would allow our producers and colleagues to view feeds and program outputs with very low latency and high quality. To achieve this, we relied on the Haivision-developed technology, SRT. We deployed Makito encoders to allow producers to view feeds at sub-second latency at home. Once partnered with our Unity Intercom for communications, it all provided great results! We’re also experimenting with WebRTC solutions for both incoming and outgoing feeds to get even lower latency and wider device compatibility.
One of the most potent tools we’ve been able to deploy is the LU-Smart and LU-Lite apps from LiveU. To optimize our workflow, we’ve recently installed several new LiveU decoders, which have allowed us to bring in a multitude of remote signals from a wide variety of devices and provided us with greater flexibility.
A recent comedy special is a perfect example of the flexibility and power these tools have given us. Our host was at her home with a LiveU LU600 transmitter, two Panasonic PTZ cameras, and a Behringer X32 mixer. Then, we mixed in over a dozen iPhones and Android devices from other comedians all across North America, each of whom shot with two smartphones. The PTZs were controlled remotely, and all control of the X32, including IFB, was operated from Studio 42 at the Broadcasting Centre, where all sources were cut live. Having multiple PTZ cameras and smartphones capturing different angles is something we did differently from most broadcasters on all the COVID-produced shows we aired.
A group effort was organized between CBC and TFO, where we produced Le bal (dé) masqué, a cutting-edge live graduation that featured two hosts, multiple live guests on smartphones, a DJ, and live performances. This was a fully-produced, French-language show that streamed live and was later re-cut and packaged for a radio show and one-hour TV special that aired on SRC.
Other technology we will use for similar shows in the future includes the Calrec RP-1. This is a remote DSP console and Hydra network “in a box.” It can physically be thousands of miles away and be controlled via a web GUI or the host console it is partnered with. Control data latencies are dependent on internet speeds and physical distance, but are routinely well within acceptable parameters. As an example, latency between Toronto and Ottawa was 7 ms.
We first used this setup for our domestic Canada Day 2019 broadcast. We also utilized it for our remote Dragons’ Den locations in Vancouver and Edmonton for the upcoming season, as well as for the next Olympics, where the RP-1 will control the microphones, the IFBs (monitoring for hosts), and all controls on the Apollo. All mics and IFBs will be muxed/demuxed into the SDI camera feeds on location by the RP-1, eliminating the potential for latency between picture and sound at the source.
For larger setups, any of Calrec’s Hydra I/O can be used with the RP-1, including analog, MADI, Dante, AES3, and their upcoming 2110-30 AoIP offering. The virtual faders and mutes of the RP-1 are controlled in real-time (plus latency mentioned above) by the host broadcast console, giving the talent at the remote site an experience equal to that of being in the broadcast studio.
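Conceptually, the host console drives the remote DSP by sending it control messages (fader levels, mutes) that are applied to the audio at the far end. A hypothetical Python sketch of that relationship – the JSON message format here is invented purely for illustration; Calrec’s actual control protocol is proprietary:

```python
import json

def db_to_gain(db):
    """Convert a fader level in dB to a linear gain multiplier."""
    return 10 ** (db / 20)

def apply_control(sample, message_json):
    """Apply one hypothetical fader/mute control message to an audio sample."""
    m = json.loads(message_json)
    if m["mute"]:
        return 0.0                         # mute wins regardless of fader level
    return sample * db_to_gain(m["level_db"])

# Host console side: serialize one control change for fader 3 (-6 dB, unmuted)
msg = json.dumps({"fader": 3, "level_db": -6.0, "mute": False})
```

Only the small control messages cross the internet link; the audio itself is processed locally at the remote site, which is why the round-trip control latency (the 7 ms Toronto-to-Ottawa figure above) never touches the sound the talent hears.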
There’s always a learning curve when working with talent unfamiliar with broadcasting, but this has been compounded with the need to collaborate remotely. We’ve had to get creative with shoe boxes, dictionaries, and other items around the house to help guests frame up their shots, explain the ins-and-outs of all the apps that we’ve deployed, and also frequently explain how we can’t break the laws of physics when it comes to latency. This brings us to the number-one piece of advice we’ve given over the past few months for anyone doing video at home: Turn off your Wi-Fi and plug into ethernet!
We are building as the technology expands, and we have learned a tremendous amount as we go. While the CBC has been using the REMI (Remote Integration) model for a number of broadcasts since the 2014 Sochi Olympics, the technology itself is approximately 10 years old. The concept behind the REMI model is to extend your broadcast centre to the remote venue by using a fly pack or mobile truck. The purpose is twofold: one production crew can produce multiple events while staying in the broadcast facility, and costs drop – the fewer crew needed to travel, the less expensive a production becomes.
The REMI truck was designed essentially as a patch bay extension from the Broadcasting Centre, connecting to CBC’s internal data network. Initially, it was going to be a “mini truck” with two operators – one patch person and one engineer. We were able to repurpose a truck from Radio-Canada and redesign it for television. Since we were re-building the interior, we included traditional mobile production positions, to allow for three different modes of use: local production as a traditional mini mobile, a full REMI production with two engineers and the rest of the control room and production crew at the Broadcast Centre, or a hybrid of the two.
For the past two years, the Remembrance Day Ceremonies in Ottawa were produced using the hybrid model. The 10-pool camera production “host feed” (the ceremony images all networks broadcast) was handled locally as a mobile production, while CBC’s domestic feed with on-air booth with three cameras and two hosts was produced remotely in Toronto. All of this was possible thanks to the design and the flexibility of the technology onboard.
When we were presented with the challenge of COVID-19, the REMI model was in regular production for CBC Toronto. We only require two people to run it, so we can create a bubble and socially distance the crew. For example, we are able to produce a show by isolating crew outside the truck (camera ops, field audio), with the two engineers in the truck and the rest of the crew in the Broadcast Centre. These isolated “zones” not only protect staff from crossover, but ensure required distancing and safety. The REMI model gives us the ability to manage all production requirements while maintaining standards and maximum quality.
During the COVID-19 period, the entire industry has reexamined its approach to production, finding ways to produce content safely and with social distancing. There are many factors that need to be accounted for with each production, and a great deal of time and planning was needed to ensure our teams were safe and successful as we resumed production. CBC Studios worked in partnership with our productions to find innovative ways to support content for our formatted and established shows, including Dragons’ Den and Family Feud Canada. This resulted in a few seamless changes to how we operate these shows in order to achieve the desired results.
Maintaining social distancing while making television is not easy. Studio load-ins and crew set-ups are quite different during the pandemic. Crew are pre-assigned an elevator they must take within a designated time, and overall traffic in our production spaces is tightly managed. Once in their assigned position, crew are encouraged not to leave their positions unless absolutely necessary to keep everyone in the studio safe.
Shows that would normally host an audience of 200 people and more than a dozen people in the control room will change for the foreseeable future. At the time of writing this article, no audiences will be allowed for Family Feud Canada, which may change the energy and dynamics of the show during tapings.
Dragons’ Den presents its own challenges. The Dragons will now be six feet apart, not side by side. With up to 12 pitches shot per day, the pitching entrepreneurs who would normally be on-site will be handled much differently, with less interaction with crew and production. All the props for the pitches will be thoroughly cleaned once they enter the building, before they can go on set. Only pitchers from Ontario and Quebec will physically be here; the rest will be done as remotes.
Our future path is committed to AES67/SMPTE ST 2110-30 infrastructure and solutions. Our planning is based on a future where more devices can communicate with each other in an all-IP world. Through all of this, our goal is to keep Canadians informed and entertained with the best possible production quality and – most importantly – to keep everyone safe.
This article features contributions from CBC’s Francois Turenne, Roberto Capretta, Ron Searles, TJ Heideman, Dinu Cebzan, Andrew Roberts, and Chris Jackson.
PS: Give us an idea of how you and your colleagues initially came together in light of the COVID-19 pandemic and figured out how you’d be able to continue working to the best of your abilities while keeping people safe. What did that process initially look like?
David Allmark: From a technical standpoint, we had been pushing for the last three years or so to do a lot more IP-based production and REMI production, so this really was an excuse to jump in with both feet and really get it going. We had a lot of gear that we’d purchased and were trying to move our production groups towards it. When the pandemic came, there was no choice; we had to do it, and we were lucky to be fairly ready. We knew very early it was going to be virtual, and we didn’t want anyone in the building at all if possible, so we toyed with all kinds of options in terms of, could we control all of the gear from people’s houses and that kind of thing.
Actually getting the gear out and relying on [staff members’] internet at home seemed a bit daunting, so we tried to create as small of a technical crew as possible [at HQ], so we pretty much got our crews down to three people – an audio person, switcher or technical director, and myself – and kept all of our production staff at home. We were able to do that by streaming multi-views to their homes, so all the producers ran with at least two streaming feeds to see everything we were seeing in the control room.
A lot of our initial content was pre-recorded and we would send it off to be edited, so our editors were working at home and we had some co-productions with other outside companies, so we were gathering all the pieces and then sending them to be edited via FTP transports that are part of our workflow anyway. What was interesting was we had a lot of issues with people’s internet dropping – and these are people with good, high-speed internet, so we moved towards using Wi-Fi with a little bit of cellular to boost it, and that’s how those systems worked best.
PS: Tell us more about that process of working with external guests or performers that you were bringing into your world. Did you get settled into a workflow where you could walk people through best practices to ensure you were getting the type of content you needed?
Allmark: Yeah. We learned really quickly what wasn’t working well [laughs] and that was the audio quality from Bluetooth headsets. It seems the compression with Bluetooth is not very good. Take the AirPod microphone compared to the standard Apple wired headset – night and day in terms of sound quality. So we asked people to dig out their old white headphones because they sound way better. We had to do a lot of post, of course, but a lot of those raw recordings were coming from Apple headsets and actually sounded really good.
PS: Looking ahead to, say, the rescheduled 2021 Summer Olympics in Tokyo, considering things are still changing almost daily, how are you and your colleagues approaching your plans for this and other such events that still have so many unknowns?
Allmark: The pandemic has taught us that nothing’s going to be normal and we can never assume we’ll be able to do things like we used to. For the most part, we’re still waiting on a lot of answers from [Olympic Broadcast Services] on how we’re going to move forward, so we’ve had to put together multiple different scenarios so we have them ready to go, depending on what we’re told and the information we get. People are also worried about a second wave. Is that going to come? Maybe. Maybe not. If it does, we’ll have to come up with different plans; if it doesn’t, we can probably move ahead with some of the plans we have now.
PS: Which new components of your pandemic workflow do you think might be useful to operations going forward?
Allmark: One of the big things has been finding encoders and decoders that are extremely low-latency. Moving forward, our big focus is getting latency down as low as possible. It helps everybody, especially when you’re using multiple encoders and decoders to bring content together.
PS: What are one or two of the biggest lessons that aren’t specifically technical that you or your colleagues have taken away from this experience that you think will help you to do your jobs better even after we’re on the other side of this ordeal?
Allmark: A big one for us just recently is we’ve temporarily put a whole bunch of computers in all of our control rooms to be able to accept Zoom calls, Skype calls – anything you can imagine. We had an option to use OBS [Open Broadcaster Software] and had to figure out how to integrate that into our regular control rooms where we’d never used OBS before. Basically, nothing’s off the table; we’re looking at everything and have to be as flexible as possible.