This is the first post in a series of articles on synchronization and SMPTE written by lighting designer Roman Vakulyuk and translated into English:
Any show or production can be divided into the content parts that make it up. For concerts, those would be musical numbers; for theater, scenes or acts. In either case, each part has a beginning and an end and runs for a certain amount of time. Inside each section is a set of changes to the lighting, sound, stage machinery, etc. One way to synchronize all of those systems is time synchronization. For example, a cue is created on a lighting console with an exact start time specified in its settings; at the exact moment assigned to that cue, the lighting desk launches the light sequence. It takes a special language – or ‘Time Code’ – for all the devices to “understand” what moment it is in a light show.
Time Code came to the show industry from broadcasting. It’s called SMPTE, after the Society of Motion Picture and Television Engineers, the association that adopted a unified time-code standard in 1971. Later, the EBU (European Broadcasting Union) joined them.
Years ago, sound for television was recorded on magnetic tape with several audio tracks. One of those tracks carried Time Code, written as an analog signal. This was necessary so that the audio and video could be synchronized during editing. This type of SMPTE recording is called Linear (or Longitudinal) Time Code (LTC), because the information is written sequentially, step by step.
SMPTE is time that starts at zero. Just like real time, SMPTE has hours, minutes and seconds, and its maximum value is 24 hours, as in a day. But there is one difference from everyday clock time: frames. The term comes from the broadcast industry. Just as video has a certain number of frames per second, time code is subdivided into video frames. The number of frames per second varies with the video format, and SMPTE varies with it. There are several types of SMPTE: 24 frames per second (fps), 25 fps, 30 fps, and 29.97 fps (also called ‘30 drop’, or drop-frame). These rates grew out of the broadcast standards – PAL, SECAM and NTSC – which in turn are based on 50 Hz and 60 Hz mains frequencies.
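To make the rates concrete, here is a minimal Python sketch (the names `SMPTE_RATES` and `frame_duration_ms` are my own, not from any standard library) that shows how long a single frame lasts at each rate; note that drop-frame is the NTSC rate 30000/1001 ≈ 29.97 fps:

```python
# Duration of one frame for each SMPTE rate. "30drop" is the NTSC
# drop-frame rate, exactly 30000/1001 frames per second.
SMPTE_RATES = {"24": 24.0, "25": 25.0, "30drop": 30000 / 1001, "30": 30.0}

def frame_duration_ms(rate_name):
    """Milliseconds covered by a single SMPTE frame at the given rate."""
    return 1000.0 / SMPTE_RATES[rate_name]

print(frame_duration_ms("25"))                # 40.0 ms per frame at 25 fps
print(round(frame_duration_ms("30drop"), 3))  # 33.367 ms at 29.97 fps
```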
The basic content unit in LTC is a data block transmitted during each frame of real time, which is therefore called a SMPTE frame. A scheme called Bi-Phase Mark is used to code the bits into the LTC signal: the level changes at every bit boundary, and a ‘one’ is marked by an additional change in the middle of the bit, while a ‘zero’ has none. An LTC frame is 80 bits long. The structure of the frame is shown in this diagram.
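The Bi-Phase Mark rule can be sketched in a few lines of Python (the function name is my own, for illustration). Each bit is represented as two half-bit signal levels: the level flips at every bit boundary, and a ‘one’ flips again mid-bit:

```python
def biphase_mark_encode(bits, start_level=0):
    """Encode a bit sequence as Bi-Phase Mark.

    The level always flips at a bit boundary; a '1' also flips mid-bit.
    Returns two half-bit levels (0 or 1) per input bit.
    """
    level = start_level
    halves = []
    for bit in bits:
        level ^= 1              # transition at every bit boundary
        halves.append(level)
        if bit:                 # a '1' adds a mid-bit transition
            level ^= 1
        halves.append(level)
    return halves

# A '0' yields two equal half-bits, a '1' yields two different ones:
print(biphase_mark_encode([0, 1, 0]))  # [1, 1, 0, 1, 0, 0]
```

Because a transition happens at every bit boundary regardless of the data, the receiver can recover the clock from the signal itself, which is why LTC survives analog tape recording.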
SMPTE time is coded by the BCD (Binary Coded Decimal) method, in which four bits are allocated for each decimal digit. In each frame, 26 bits carry the time information and 32 bits carry user data (the two are interleaved in groups of four bits), and the frame concludes with the synchronization word (the last 16 bits). The synchronization word is used to identify the frame’s boundaries and is always set to: 0011 1111 1111 1101.
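As an illustration of the BCD idea, here is a small Python sketch (the names are my own): each two-digit time component is split into tens and units digits, and each digit occupies its own group of bits in the frame; the sync word is a fixed constant:

```python
# The fixed 16-bit pattern that closes every 80-bit LTC frame.
SYNC_WORD = 0b0011111111111101

def bcd_digits(value):
    """Split a two-digit time component into (tens, units) digits,
    as BCD stores them in separate bit groups of the LTC frame."""
    return value // 10, value % 10

# 38 seconds becomes the digits 3 and 8, each coded in its own nibble:
print(bcd_digits(38))  # (3, 8)
```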
Over time, even in the broadcast industry, SMPTE came to be carried in other formats – VITC, CTL, BITC, Keykode. But the analog LTC format still remains in use.
Now I’ll talk about a different format for working with SMPTE. This interface came from the music world: MIDI (Musical Instrument Digital Interface). MIDI is a broad interface that carries many kinds of data, but here we’ll only discuss its use for carrying time, which is called MTC (MIDI Time Code).
The first difference between MTC and LTC is that MTC is a completely digital format, coded in the hexadecimal number system. The second difference is that the format is never recorded to a storage medium like LTC; it’s generated under hardware or software control. The third difference is that the full time code is split into eight pieces, with one MTC message sent every quarter frame.
We’ll explore that third difference a bit more. We’re not going to go too deeply into the expansive theory of MIDI. We’ll only touch on the basic functions of MTC.
The MIDI Time Code time value is made up of 32 bits, of which only 24 are used; the remaining eight are always equal to zero. Each Time Code component is coded in one byte:
0rrhhhhh: Rate (0–3) and hour (0–23).
rr = 00: 24 fps
rr = 01: 25 fps
rr = 10: 29.97 fps (drop frame)
rr = 11: 30 fps
00mmmmmm: Minute (0–59)
00ssssss: Second (0–59)
000fffff: Frame (0–29)
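The byte layout above can be sketched in Python (the function name and rate table are my own, for illustration): the two rate bits sit above the five hour bits in the first byte, and the other components each fill their own byte:

```python
# Rate codes from the MTC spec: 00=24, 01=25, 10=29.97 drop, 11=30 fps.
RATE_BITS = {24: 0b00, 25: 0b01, 29.97: 0b10, 30: 0b11}

def mtc_time_bytes(hours, minutes, seconds, frames, fps=25):
    """Pack a SMPTE time into the four MTC bytes:
    0rrhhhhh, 00mmmmmm, 00ssssss, 000fffff."""
    rr = RATE_BITS[fps]
    return [(rr << 5) | hours, minutes, seconds, frames]

print([f"{b:08b}" for b in mtc_time_bytes(1, 2, 3, 4, fps=25)])
# ['00100001', '00000010', '00000011', '00000100']
```

Counting the significant bits confirms the statement above: 2 (rate) + 5 (hour) + 6 (minute) + 6 (second) + 5 (frame) = 24 bits used out of 32.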
When Time Code is transmitted, the 32-bit time code is divided into eight four-bit pieces, and one piece is sent every quarter frame. Each quarter-frame message consists of the status byte 0xF1 followed by a data byte with seven significant bits: three bits identify the quarter frame (the piece number) and four bits carry the time nibble.
The eight MIDI Time Code messages carrying one SMPTE time:

| Piece # | Data byte | Significance        |
|---------|-----------|---------------------|
| 0       | 0000 ffff | Frame number lsbits |
| 1       | 0001 000f | Frame number msbit  |
| 2       | 0010 ssss | Second lsbits       |
| 3       | 0011 00ss | Second msbits       |
| 4       | 0100 mmmm | Minute lsbits       |
| 5       | 0101 00mm | Minute msbits       |
| 6       | 0110 hhhh | Hour lsbits         |
| 7       | 0111 0rrh | Rate and hour msbit |
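The table can be turned directly into code. Here is a Python sketch (the function name is my own) that splits one SMPTE time into the eight quarter-frame messages, each a 0xF1 status byte plus a data byte whose top three bits are the piece number and whose low four bits are the time nibble:

```python
def quarter_frame_messages(hours, minutes, seconds, frames, rate_bits=0b01):
    """Split a SMPTE time into the eight MIDI quarter-frame messages.

    rate_bits defaults to 0b01 (25 fps). Each message is (status, data):
    status is always 0xF1; data = piece number (3 bits) << 4 | nibble.
    """
    nibbles = [
        frames & 0x0F,  frames >> 4,        # pieces 0-1: frame ls/ms bits
        seconds & 0x0F, seconds >> 4,       # pieces 2-3: second ls/ms bits
        minutes & 0x0F, minutes >> 4,       # pieces 4-5: minute ls/ms bits
        hours & 0x0F,                       # piece 6: hour lsbits
        ((rate_bits << 1) | (hours >> 4)) & 0x07,  # piece 7: 0rrh
    ]
    return [(0xF1, (piece << 4) | nib) for piece, nib in enumerate(nibbles)]

for status, data in quarter_frame_messages(1, 2, 3, 4):
    print(f"0x{status:02X} 0x{data:02X}")
```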
This way, the transmission of one complete SMPTE time value takes place over two frames (8 × ¼). That means that when mapping SMPTE to MTC, only every second frame gets a complete time stamp in MIDI Time Code. This somewhat slows the reaction of the receiving device: to lock, it needs to “read” eight quarter-frame messages, which in real time takes from two to four frames, depending on when the scan starts.
Now, a few nuances. In SMPTE (or LTC), the receiver treats the arrival of the first bit of the 80-bit packet as the start of the frame. In MTC, the start of a frame is marked by the arrival of the first and fifth quarter-frame messages in a series, i.e. 0xF1 0x0n and 0xF1 0x4n. But the time can only be read after all eight messages of a series have been received; until that moment, the decoded value is two frames behind. To show the correct time on a real-time display, the receiving device adds a two-frame offset to the decoded value. In between series, the receiving device itself fills in the missing frames.
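The decoding side, including that two-frame offset, can be sketched in Python (the function name is my own; for simplicity this sketch ignores rollover when the offset carries past the last frame of a second):

```python
def decode_quarter_frames(messages):
    """Reassemble eight quarter-frame (status, data) pairs into
    (hours, minutes, seconds, frames), adding the two frames that
    elapsed while the series was being transmitted."""
    nibbles = [0] * 8
    for _, data in messages:
        nibbles[data >> 4] = data & 0x0F   # piece number selects the slot
    frames  = nibbles[0] | ((nibbles[1] & 0x01) << 4)
    seconds = nibbles[2] | ((nibbles[3] & 0x03) << 4)
    minutes = nibbles[4] | ((nibbles[5] & 0x03) << 4)
    hours   = nibbles[6] | ((nibbles[7] & 0x01) << 4)
    return hours, minutes, seconds, frames + 2  # two-frame display offset

# The eight data bytes for 01:02:03:04 at 25 fps (rate bits 01):
msgs = [(0xF1, d) for d in (0x04, 0x10, 0x23, 0x30, 0x42, 0x50, 0x61, 0x72)]
print(decode_quarter_frames(msgs))  # (1, 2, 3, 6)
```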
LTC and MTC are the basic formats for time synchronization. Time Code can also be sent over Art-Net and MSC (MIDI Show Control), but today there are few devices that work with Time Code over these protocols.
The information above should be enough to start exploring the structure and features of SMPTE.
Award-winning lighting designer Roman Vakulyuk has always been interested in electronics, and he knew as a child what he wanted to do when he grew up. In school, he enjoyed programming and wrote several computer programs. Roman fell in love with music in college and moonlighted as a DJ in clubs. One of the clubs had a professional lighting system, and one night when there was no one at the lighting desk, he decided to give it a try. It was an important moment in his life, because he became infatuated with lighting and worked to become a lighting designer. After immersing himself in the profession, his skills advanced and Roman began to contemplate how to create more dramatic shows. He found out that there was such a thing as synchronization technology that could synchronize all show elements, and he began to research it. Over the years his light shows got better and better, and he began to use synchronization on his own projects. “Now I know so much more than the first time I sat behind a lighting console, but I haven’t stopped learning and progressing, because knowledge is the best path forward,” explains Roman. Currently he works with Global Show Trade (GST), a Moscow-based production company, education center and product showroom. Some of his recent projects include the Bilan 35 concert, the Counter-Strike: Global Offensive cyber tournament opening ceremonies, and the 2015 Circle of Light Festival.