pyrotechnician
Member
Hello,
I was reading lately about the use of timecode in movie production. I was surprised to learn that external timecode does not actually synchronize cameras; it only provides the starting time. Once the camera is rolling, frames are counted up from that starting point, based purely on the camera's own recording speed, without any influence from the timecode. So, on longer takes, cameras may drift apart from each other. The same goes for external audio recording: the starting timecode is written into the file as metadata, but once it's recording, it's free-running.
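To put a rough number on that drift, here is a back-of-the-envelope sketch. The ppm clock errors are made-up but plausible values; this is just to illustrate the mechanism, not any real camera spec:

```python
# Two free-running cameras: timecode only stamps the start, so each one
# counts frames on its own crystal oscillator. The ppm errors below are
# hypothetical but in the range of ordinary (non-disciplined) clocks.

NOMINAL_FPS = 24.0
TAKE_SECONDS = 30 * 60          # a 30-minute take

def actual_fps(ppm_error: float) -> float:
    """Frame rate of a camera whose clock is off by ppm_error parts per million."""
    return NOMINAL_FPS * (1.0 + ppm_error / 1_000_000)

cam_a_frames = actual_fps(+20) * TAKE_SECONDS   # camera A runs 20 ppm fast
cam_b_frames = actual_fps(-15) * TAKE_SECONDS   # camera B runs 15 ppm slow

drift = cam_a_frames - cam_b_frames
print(f"Drift after {TAKE_SECONDS / 60:.0f} min: {drift:.2f} frames")
# -> about 1.5 frames apart, which is already visible on a cut.
```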
To solve this issue, movie productions do not just use timecode for longer takes; they also genlock the cameras to make sure they run at exactly the same speed. Audio recorders are then fed a wordclock signal derived from the same source, so that everything stays in sync.
That got me thinking about media servers. I have the impression that the same basic problem exists: a media server can start a video clip at exactly the right timecode, but once it is running, it comes down to the playback speed of the graphics card. For longer video clips, you would probably have to genlock the media servers and the timecode generator to ensure sync. Is this ever done? So far, I have only heard about genlocking/framelocking media servers when multiple outputs feed the same LED wall, but never in the context of staying in sync with the timecode or with audio.
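For what it's worth, my mental model of the alternative to genlock is a software "chase" loop: the server periodically compares its playback position against the incoming timecode and corrects when the error gets too big. A minimal sketch of that idea, with all names and thresholds invented for illustration:

```python
# Hypothetical timecode-chase loop: compare playback position against
# incoming timecode and resync only when the error exceeds a threshold.
# A real server might slew the playback speed slightly instead of
# jumping, to avoid a visible skip.

RESYNC_THRESHOLD_S = 2 / 25.0   # tolerate up to 2 frames of error at 25 fps

def chase(incoming_tc_seconds: float, playback_position_seconds: float) -> float | None:
    """Return a corrected playback position, or None if we are close enough."""
    error = playback_position_seconds - incoming_tc_seconds
    if abs(error) > RESYNC_THRESHOLD_S:
        return incoming_tc_seconds   # hard jump back onto the timecode
    return None
```

Whether real-world servers rely on something like this or actually get genlocked to the timecode source is exactly what I'm hoping someone here can clarify.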