Synchronizing an editing table or a projector with digital sound

Here are a few methods for synchronizing an editing table or a projector with a digital sound device, whether to project a finished film or to edit sound on a computer, using sound editing software, alongside the workprint.

First option : using Film-o-sync to link the projector and the computer directly.


This MAX-based freeware runs on Macs and, once you add a little electronics to the projector, lets you play a QuickTime audio file at the projector's effective running speed.

This option makes it possible to show an original film or a workprint along with stereo digital sound using specific but minimal equipment : a modified projector and a computer. The software looks good (we haven't tested it). The electronic device added to the projector is a bit rustic and could probably be improved (especially in terms of the types of projectors you can modify) using components like IR sensors, while remaining very simple to build. If anyone tries it out…

Second option : using a direct to disk rack like AKAI DR4, DR8 or DR16.


This is another way to project a film image with synchronized digital sound. The AKAI DR series was designed for audio editing and is largely obsolete now, and therefore cheap on the second-hand market. These units can be slaved to SMPTE and even, with an optional card, to a bi-phase signal. This is where things become interesting.

A bi-phase signal is a simple electronic signal that counts frames, whether the film is going forward or backward. The system producing that signal is generally made of a black metal disc with notches around the outside and IR opto-interrupters. Editing tables with built-in electronic counters necessarily have that sort of system inside them. Sometimes the manufacturer (e.g. modern CTM tables) thought of including an output for it ; you just need the right plug. If not, the signal going to the electronic counter can be tapped. And if there is no bi-phase signal at all, one has to build the device that produces it.

Then one makes a sync mark on the film to define the starting point, and each frame is identified by its position relative to that mark.
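The notched disc and the two opto-interrupters form what electronics people call a quadrature encoder : two pulse trains 90° out of phase, whose order of transitions encodes the direction of travel. As a rough illustration, here is a minimal, hypothetical Python sketch of how such a bi-phase signal could be decoded into a signed step count relative to the sync mark (all names, and the one-cycle-per-frame assumption, are ours, not taken from any real interface) :

```python
# Hypothetical sketch of bi-phase (quadrature) decoding : two IR
# opto-interrupter channels, A and B, 90 degrees out of phase.
# Which channel changed, and the state of the other one at that
# moment, tells you whether the film moved one step forward or back.

# Transition table : the key is (previous A, previous B, new A, new B),
# the value is the signed step.
_STEPS = {
    (0, 0, 1, 0): +1, (1, 0, 1, 1): +1, (1, 1, 0, 1): +1, (0, 1, 0, 0): +1,
    (0, 0, 0, 1): -1, (0, 1, 1, 1): -1, (1, 1, 1, 0): -1, (1, 0, 0, 0): -1,
}

def count_frames(samples):
    """Accumulate a signed step count from (A, B) logic-level samples.

    `samples` is an iterable of (a, b) pairs ; the count is relative to
    the first sample, i.e. to the sync mark on the film.  If the disc
    gives one full quadrature cycle per frame, 4 steps = 1 frame.
    """
    it = iter(samples)
    prev = next(it)
    position = 0
    for a, b in it:
        position += _STEPS.get((prev[0], prev[1], a, b), 0)
        prev = (a, b)
    return position

# One full forward quadrature cycle = 4 steps.
forward = [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]
print(count_frames(forward))   # 4

# The same cycle read in the other order counts backward.
backward = [(0, 0), (0, 1), (1, 1), (1, 0), (0, 0)]
print(count_frames(backward))  # -4
```

Running the film backward simply reverses the order of the transitions, so the same table yields negative steps ; that is all the "counts frames, whether going forward or backwards" property amounts to.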


Note : If the AKAI DR doesn't have the optional bi-phase card, it is possible to use a BIF interface to produce time-code and set up something similar to what we are about to describe now.

Third option : Produce time-code and trigger sound editing software



To do this, one sends the bi-phase signal to an interface like Rosendahl's BIF, which converts it into a signal the computer can understand, namely MIDI Time Code (MTC).
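For reference, the positional information carried by MTC is simply the frame count expressed as hours:minutes:seconds:frames. A small illustrative conversion at 24 fps (our own sketch of the arithmetic, not the BIF's actual internals) :

```python
def frames_to_smpte(frame_count, fps=24):
    """Convert an absolute frame count (from the sync mark) to the
    hh:mm:ss:ff position that MIDI Time Code carries."""
    ff = frame_count % fps
    total_seconds = frame_count // fps
    ss = total_seconds % 60
    mm = (total_seconds // 60) % 60
    hh = total_seconds // 3600
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

# One hour, one minute, one second and five frames past the sync mark:
print(frames_to_smpte(3661 * 24 + 5))  # 01:01:01:05
```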

Even the simplest Digidesign Pro Tools setups can be triggered that way : you start playing the film on the editing table and the sound starts from exactly the right position. Over time, however, the table plays the film at its own speed (not necessarily an exact 24 fps) and the computer also plays the sound at its own particular speed (not necessarily exact either) : therefore, there is a drift between the two devices. This drift can be quite acceptable if you can tune the speed of the editing table to match the computer's. Other tables, without offering any way to change the running speed, can be very precise. For example, we have used a Steenbeck ST 901 at 24 fps for years, and the drift between that unit and a very basic Pro Tools rig was only about one second per hour of continuous playing. And again, any time you stop and start playing the image again, the computer starts at exactly the right place, so you can always check image/sound sync by going just before that point, stopping, and then pressing play.
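To put that Steenbeck figure in perspective, one second of drift per hour corresponds to a relative speed error of roughly 278 parts per million, or about 24 frames of audio offset after an hour of uninterrupted playing :

```python
# Back-of-the-envelope drift check, using the figure quoted above :
# about 1 second of drift per hour of continuous play at 24 fps.
drift_seconds = 1.0
play_seconds = 3600.0

ppm = drift_seconds / play_seconds * 1e6   # relative speed error
frames_off = drift_seconds * 24            # audio offset at 24 fps

print(f"{ppm:.0f} ppm")                    # 278 ppm
print(f"{frames_off:.0f} frames after one hour")  # 24 frames
```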

Fourth option : Perfectly synching up image and sound using time clock and clock reference



To go one step further in synchronization, you need to send, on top of the time code giving the starting point, a clock reference signal that the computer will use to varispeed the sound so that it follows exactly whatever the table / projector running speed is (within a certain range around the nominal speed…). In this case, you just tell the sound software to use an external source, rather than its internal clock, as the speed reference.

With such a system, sound and image are perfectly locked together. The MTC gives the positional information and the Wordclock signal the speed.
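To see in numbers what "the Wordclock gives the speed" means : if the table runs off its nominal speed, the sample clock derived from it is scaled by the same ratio, so the audio is varispeeded to stay locked. A hypothetical example (the 24.2 fps figure is made up for illustration) :

```python
# If the table/projector runs at `actual_fps` instead of the nominal
# 24 fps, a Wordclock derived from it pulls the audio sample rate by
# the same ratio, so sound and image stay locked.
nominal_fps = 24.0
actual_fps = 24.2          # table running slightly fast (assumed)
nominal_rate = 44100.0     # Hz, nominal session sample rate

actual_rate = nominal_rate * actual_fps / nominal_fps
print(f"{actual_rate:.1f} Hz")  # 44467.5 Hz
```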

On the above schematic, two interfaces are used to achieve that goal : the LTC time-code is sent to another Rosendahl interface, the WIF, which produces a Wordclock reference signal. As the MBox 1 doesn't understand Wordclock, another interface is needed : the BEHRINGER ULTRAMATCH, which converts the Wordclock into an S/PDIF signal that can be sent to the MBox as a sync reference. (The ULTRAMATCH SRC 2000 works at 44.1 kHz ; the more recent but more expensive SRC 2496 can work at 44.1 kHz, 48 kHz and 96 kHz.)

It's also good to know that some audio interfaces can take Wordclock directly as a clock reference. This is the case of the MBox 2 Pro, for example (to stay with Digidesign, but there are other brands !). The signal coming out of the WIF can then be sent directly to the sound interface.

Another solution is to use a Digidesign USD to produce both MTC and an LTC SMPTE code, the latter being used to generate Wordclock through a MOTU MIDI Timepiece. MTC and Wordclock then feed a DIGI 003 sound interface. This works well too, but make sure the bi-phase signal you are using delivers at least 2 pulses per frame, which is not the case of the bi-phase feeding the counters of a Steenbeck, for instance. On the diagram, a CTM-Debrie HDC table.
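The 2-pulses-per-frame requirement translates into a minimum pulse frequency at a given film speed. A trivial check, where only the minimum of 2 pulses per frame comes from the text and everything else is assumed for illustration :

```python
def pulse_rate_hz(pulses_per_frame, fps=24):
    """Pulse frequency the converter must track at a given film speed."""
    return pulses_per_frame * fps

MIN_PULSES_PER_FRAME = 2   # minimum quoted above

# A counter feed with only 1 pulse per frame (as on a Steenbeck,
# per the text) is too coarse ; 2 or more per frame is fine.
for ppf in (1, 2, 4):
    ok = ppf >= MIN_PULSES_PER_FRAME
    print(ppf, "pulse(s)/frame ->", pulse_rate_hz(ppf), "Hz,",
          "ok" if ok else "too coarse")
```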

Setup of the MOTU MTP

CURSOR DIGI (= 256x)