Multi-camera editing is a challenge in any application. I’m familiar with several video editing packages, but the one I thought best suited to the job of editing my interviews was Premiere Pro.

In this article I’ll explain some of the pre-production considerations and thoughts I went through on Creative Minds. It covers the software and hardware I’m using, and the overall workflow for editing the episodes. Just like with web design, one thing determines another, and you have to trust that it all comes together in the end – even if you don’t yet know the exact solution to a problem.

History

I’ve been editing video since the mid-nineties, when I started working in television post production. Those were tape-to-tape environments, both analogue and digital, and their cost and complexity do not compare with the consumer-grade tools we have available today. These days, anyone with a five-year-old mobile phone is a video editor. Oddly enough, you have to be in order to draw attention to your Instagram feed.

Having the right tools, however, doesn’t automatically make you good at using them. If you’ve ever bought a guitar or a pencil, you’ll know that it doesn’t turn you into Pat Metheny or Pablo Picasso. You need to know how to use the tools, and you need to know what the right tool for the job is to get it done properly.

Editing my Creative Minds episodes presented a similar challenge, and a complex one at that: I needed the right multi-camera editing tool to intercut a live interview on my PC, and powerful enough hardware to do it. I also needed the right tools for recording multiple streams in the first place. Beyond that, I needed to know how to use those tools, as well as what to do with them to end up with something people want to watch.

Project Premise

I knew that my interviews would involve multiple camera angles, no matter whether we’d record them at the same location or remotely. In a live television gallery such angles would be switched during the interview, using a Grass Valley or Sony vision mixer. The person doing this would be either the director or a dedicated vision mixer.

Thankfully I have some experience doing this, having been involved in many a live TV production during my broadcast television career. I used to vision mix for the Financial Times back in 2001, when they were covering shareholder events of large companies in London. Those would be streamed to investors around the globe, and also recorded for distribution on VHS and DVD. That was four years before YouTube arrived, and long before streaming video was as commonplace as it is today, due to the lack of fast enough data lines.

My setup was going to be very similar to that of a live gallery, with the exception that all video feeds would be played back during the edit, rather than switched live. I needed a tool that would support this, and luckily for me, I knew of one that did.

The Tools

I had an old copy of Adobe Premiere Pro CS5.5 from 2011. I’ve been using this app since the late nineties, including professionally on several freelance jobs back in London. It’s been my constant companion, but my copy is Mac-only, a perpetual license made in 2011, and the macOS operating system has moved on dramatically since then. Shortly after that release, Adobe introduced the Creative Cloud subscription, at which point I decided to stop upgrading to save some cash. Sadly my ageing Mac Mini from 2012 is not up to the challenge of editing multi-camera sequences, so I needed a new copy of Premiere Pro that would work on my much more powerful Windows hardware.

I decided to take this project as an incentive to subscribe for $20.99 a month, on top of my $9.99 Photoshop subscription. While that sounds expensive, it is quite remarkable that we can now afford tools that, in the nineties and beyond, were pretty much impossible for ordinary people to consider buying.

Editing Workflow

After some tests during the trial period, I was satisfied that my hardware was in good shape for the workflow I had in mind. For remote interviews, this would entail several camera streams recorded simultaneously, which on playback would let me switch to any feed I needed.

Next I set up a date with Brian Cramer for a technical test, so that I’d have the kind of footage a real interview would produce, and could see if the live editing process worked as I had envisioned it. As luck would have it, we were both free only three days later and taped the interview. Although I had forgotten to ask a few things I only thought of after the fact, it turned out well enough to become part of the series.

When we were finished, I ended up with essentially four separate feeds of video and one feed of high-quality audio. All of these needed to be synced up to play at the same time. This used to be a cumbersome process, involving first finding a visual point of reference (like a clapperboard or slate, or a simple hand clap), and matching it up with the audio spike it produces. Timecode is another way of synchronising the feeds, but that only works if it’s frame accurate, which it wasn’t in my case.

Premiere offers a neat feature in which it can look at all audio feeds associated with the video and sync the footage up that way. It only works, though, when each video feed has a matching audio feed that is itself in sync, which isn’t always the case. Some Skype feeds are out of sync, and some GoPro feeds are off by a few frames, so I had to shift those appropriately before having Premiere match all the feeds together.
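To give a sense of how audio-based syncing works under the hood (Adobe doesn’t document Premiere’s implementation, so this is purely illustrative): the offset between two recordings of the same event can be estimated by cross-correlating their waveforms and finding the lag where they line up best. Here is a minimal Python sketch of that idea; the file names are hypothetical.

    # Illustrative sketch of audio-based sync: estimate the offset between two
    # recordings of the same event by cross-correlating their waveforms.
    # This shows the principle only, not Premiere's actual implementation.
    from scipy.io import wavfile
    from scipy.signal import fftconvolve

    def estimate_offset(reference_wav, other_wav):
        """Return how many seconds `other_wav` starts after `reference_wav`."""
        rate_a, a = wavfile.read(reference_wav)  # e.g. the high-quality audio feed
        rate_b, b = wavfile.read(other_wav)      # e.g. a camera's scratch audio
        assert rate_a == rate_b, "resample first if the sample rates differ"

        # Mix down to mono and normalise, so loudness differences don't matter.
        a = a.mean(axis=1) if a.ndim > 1 else a.astype(float)
        b = b.mean(axis=1) if b.ndim > 1 else b.astype(float)
        a = (a - a.mean()) / (a.std() or 1.0)
        b = (b - b.mean()) / (b.std() or 1.0)

        # Cross-correlate; the peak marks the lag where the waveforms align.
        correlation = fftconvolve(a, b[::-1], mode="full")
        lag = correlation.argmax() - (len(b) - 1)
        return lag / rate_a

    # Hypothetical usage:
    # offset = estimate_offset("mixer_audio.wav", "gopro_scratch.wav")
    # A positive offset means the GoPro track started that many seconds later.

In practice you’d round the result to whole frames at the project’s frame rate and nudge the clip by that amount – essentially the manual shifting described above for the feeds that were out of sync.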

Once all that prep work is over, the next step is to watch the interview in real time, while switching camera feeds at the appropriate points.

As if the above workflow weren’t complicated enough, sound presents yet another challenge. I’ll talk more about it, and about the recording process during remote interviews, in the next article.
