Understanding Analog Video Signals
Figure 1. The Hierarchy of Video

Native Primaries

The first line in the hierarchy is R', G', B', where the prime mark (') indicates gamma (γ) correction. The bandwidths of the signals RGB and R'G'B' are equal and determined by the video resolution9. This is as good as it gets: any further signal processing degrades the video quality, which is why graphics stuck with RGB. A viewer may not perceive this degradation if human vision or the display can't resolve it. Broadcast used human perception factors to design the composite signal for TV. HDTV, PAL Plus, and MPEG all later rejected composite and native primaries, and decided to use the next form, called component video, to improve video quality.

Component Video

The third and fourth lines are the two forms of component video: color difference (Y'PbPr/Y'UV/Y'IQ) and Luma-Chroma (Y'-C)10. Sometimes there is confusion about the terms used. Some texts use the terms Luminance and Chrominance, which are from color science. Here we'll use Luma and Chroma, where the Luma term is written with a prime (Y') to indicate the non-linear video form. The color-difference form is produced by the linear addition and scaling of R'G'B' to implement the well-known equations:

Y' = (Kr × R') + (Kg × G') + (Kb × B')
Pb = Kcb × (B' − Y')
Pr = Kcr × (R' − Y')

The coefficients for Luma (Kr, Kg, Kb) are the same for NTSC, PAL, and SECAM, but the coefficients for the difference terms (Kcr and Kcb) vary according to the process. It is important to remember that the equations apply to the active video portion of the signal and not the sync. The two must be separated before this process and recombined afterwards. One of the challenges with multiple video signals is controlling delay. In order to display an image, the video voltages must be correctly aligned in time. Two types of delay prevent this: flat delay caused by the transmission path length, and frequency-dependent delay caused by filters. This applies to both R'G'B' and component video.
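The color-difference encoding can be sketched numerically. This is a minimal illustration, assuming the ITU-R BT.601 luma coefficients (Kr = 0.299, Kg = 0.587, Kb = 0.114) and the conventional ±0.5 normalization of the difference terms; the function name is ours, not from any standard.

```python
# Sketch of Y'PbPr color-difference encoding, assuming the
# ITU-R BT.601 luma coefficients. The Pb/Pr scale factors
# normalize the difference signals to +/-0.5 for R'G'B' in 0..1.
KR, KG, KB = 0.299, 0.587, 0.114

def rgb_to_ypbpr(r, g, b):
    """Encode gamma-corrected R'G'B' (each 0..1) into Y'PbPr."""
    y = KR * r + KG * g + KB * b       # Luma: weighted sum of primaries
    pb = 0.5 * (b - y) / (1.0 - KB)    # Scaled blue color difference
    pr = 0.5 * (r - y) / (1.0 - KR)    # Scaled red color difference
    return y, pb, pr
```

For reference white (1, 1, 1) the difference terms vanish and Y' is 1.0, which is a quick sanity check that the coefficients sum to unity.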
Flat delay is seldom a problem at video frequencies, and any required compensation can be made with coax cable or delay lines. Frequency-dependent delay is another matter. Because the R', G', and B' signals all have the same bandwidth, they incur the same filter delay, but the Chroma portions of the component signals (Pb, Pr, and C) are filtered to reduce the occupied bandwidth. To compensate for the delay associated with this filtering, the Luma signal (Y') must be delayed by the same amount. The Chroma filtering is considered "visually lossless," based on a model of human vision that says the eye doesn't detect small details in color. The analog videotape format Beta11 is an example of a scaled color-difference format, and S-VHS12 is an example of the Y-C form. MPEG uses a digitized form of the color-difference signals, designated YCbCr and shown on the seventh line, where the bandwidth reduction is done by sampling Cb and Cr at half the rate of the Y' channel. This is called 4:2:2 sampling and is based on ITU-R BT.601. The Y-C component form is produced by phase- or frequency-modulating color subcarrier(s) with the color-difference components and then, depending on which process is used, adding them together. The Y' channel is the same as in Y'PbPr, but the Chroma signal is an FM or PM subcarrier that is band-pass filtered, further truncating the color bandwidth. This is an important point in the encoding process: it's the last place where Luma and Chroma information are separate. Once Y' and C are combined, they will never again be totally separated, and that produces the artifacts that give composite its reputation for compromised quality.

Composite Video

The fifth, and center, line is composite video (Cvbs), formed by adding the Luma and Chroma components together with monaural audio. NTSC, PAL, and SECAM are composite video signals. The Cvbs signal is the lowest-quality video in the chart and suffers from cross-color artifacts.
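The 4:2:2 sampling mentioned above can be shown in a few lines. This is a sketch under simplifying assumptions: a real BT.601 encoder band-limits the chroma with a proper low-pass filter before decimation, while here adjacent chroma pairs are merely averaged, and the function name is ours.

```python
# Sketch of 4:2:2 chroma subsampling: Cb and Cr are carried at half
# the horizontal rate of Y'. Averaging adjacent pairs stands in for
# the low-pass filtering a real encoder performs before decimation.
def subsample_422(y, cb, cr):
    """y, cb, cr are equal-length sample lists for one scan line."""
    cb2 = [(cb[i] + cb[i + 1]) / 2.0 for i in range(0, len(cb) - 1, 2)]
    cr2 = [(cr[i] + cr[i + 1]) / 2.0 for i in range(0, len(cr) - 1, 2)]
    return y, cb2, cr2   # luma untouched, chroma at half rate
```

For every four luma samples, only two Cb and two Cr samples survive, which is where the 4:2:2 name comes from.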
Cross-color artifacts are bits and pieces of Chroma and Luma information that remain after we try to separate Cvbs back into R', G', and B' for display. These artifacts became more noticeable as broadcast began to use larger, higher-quality displays. Today, Cvbs is more of a legacy format, and it will probably disappear as single-wire digital forms of component video take its place. One odd thing about NTSC Cvbs is something called "setup." This is a voltage offset between the "black" and "blanking" levels, and it is unique to NTSC. As a result, NTSC is more easily separated from its sync portion, but it has a smaller dynamic range than PAL or SECAM. The video formats are also called color spaces in digital literature, and the encoding/decoding process is called color-space conversion to distinguish it from the analog process. Don't be confused by this: digital video uses the same formats as analog video. The signals produced by the encoding process are shown in Figure 2, along with approximate amplitudes in percent. Exact amplitudes are given in Table 1 for several of the formats, based on a 1V p-p R'G'B' set of native primaries across a 75Ω load.

Figure 2. Analog encoding from R'G'B' to Cvbs

Table 1. Standard Video Voltages
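The trade-off NTSC setup imposes is easy to show with a little arithmetic. This is a hedged sketch assuming the customary 7.5 IRE setup level, with blanking at 0 IRE and reference white at 100 IRE; the helper name is illustrative, not from any standard.

```python
# NTSC "setup": black sits at 7.5 IRE while blanking stays at 0 IRE,
# so the picture occupies 7.5..100 IRE. PAL and SECAM place black at
# blanking (setup = 0) and keep the full 0..100 IRE range.
NTSC_SETUP_IRE = 7.5

def luma_to_ire(y, setup=NTSC_SETUP_IRE):
    """Map normalized luma (0.0 = black, 1.0 = white) to IRE units."""
    return setup + y * (100.0 - setup)
```

With setup, the active range shrinks from 100 IRE to 92.5 IRE, which is the smaller dynamic range mentioned above; in exchange, the gap between blanking and black eases level-based sync separation.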
Linear and Gamma-Corrected Video

Originally, video signals were created in cameras using vacuum-tube sensors. The output voltage (V) of a tube camera isn't linear in relation to the incident light (B); it follows a power law whose exponent is called gamma (γ):

B = K × V^γ

where B is the light flux, in lumens per square meter; K is a constant; and V is the voltage generated, in Volts. Since the CRT is also a vacuum tube, with the inverse non-linearity (1/γ), the two distortions largely cancel at the display. A beneficial side effect of gamma is that it reduces the effect of additive noise. Gamma is specified as 2.22 for NTSC and as 2.8 for PAL and SECAM. Originally, the camera and CRT were thought to be exactly complementary, but they are not. Later it was found that intentionally under-compensating for gamma (γ) gives a subjectively better picture under typical viewing conditions. One thing is certain: you'll need to be able to add, remove, or change gamma (γ) when processing video signals.

Gamma Modification

The addition, removal, or change of gamma (γ) can be done by either analog or digital means. The digital process uses substituted values from a look-up table (LUT) stored in software. It's as accurate as the stored values, and trivial in terms of its design. Obviously, if the signal is digital, this is the preferred method to use. In either case, we need a formula for the voltage in terms of the light flux (B). Broadcast video has two: one used for standard-definition TV (SDTV), and another for HDTV.

For NTSC/PAL per SMPTE 170M and ITU-R BT.709:

E'x = (1.099 × L^0.45) − 0.099, for 1 ≥ L ≥ 0.018
E'x = 4.5 × L, for 0.018 > L ≥ 0

For HDTV per SMPTE 240M:

E'x = (1.1115 × L^0.45) − 0.1115, for 1 ≥ L ≥ 0.0228
E'x = 4.0 × L, for 0.0228 > L ≥ 0

Scanning and Sync

Video signals have two parts: the active video and sync. So far, we have looked only at the active video. The proper name for sync is Image Reconstruction Timing, and it's used to reconstitute the image. The sync portion doesn't interfere with the active video because it's below the black level and can't be seen. Any signal below the black level is said to be blanked. The black and blanking levels are the same in every format except NTSC composite.
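The SDTV formula, and the LUT method of applying it, can be sketched in code. The constants are those of SMPTE 170M / ITU-R BT.709; the function and table names are ours.

```python
# Sketch of the SMPTE 170M / ITU-R BT.709 opto-electronic transfer
# function: a power-law segment for most of the range and a linear
# segment near black, plus the look-up-table (digital) method.
def oetf_sdtv(L):
    """L is normalized light (0..1); returns the gamma-corrected E'."""
    if L >= 0.018:
        return 1.099 * L ** 0.45 - 0.099   # power-law segment
    return 4.5 * L                          # linear segment near black

# The digital method: an 8-bit LUT, accurate to the stored values
LUT = [round(255 * oetf_sdtv(i / 255.0)) for i in range(256)]
```

The linear segment avoids the infinite slope of a pure power law at zero, which would otherwise amplify noise near black.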
Originally, the black, or blanking, level was at 0 Volts, with active video above and sync below, to simplify separating them based on level and timing. If you could spread out the active video and sync intervals on a flat surface, you would get a raster, which looks like Figure 3. The unused portion, T2(H) to T3(H), originally allowed magnetically-scanned CRTs to "fly back" to their starting point for the next line and settle during T0(H) to T1(H). The vertical deflection works in a similar manner. The sync interval is "dead time" as far as the active video is concerned. Consequently, there are two resolutions for a video format: the active-video resolution we see, and the total resolution13 of the raster. This is true for both broadcast and graphics. The image quality is a function of the active-video resolution14 and the bandwidth through which the signal is transmitted. A raster is created by scanning, both horizontally and vertically, starting at the upper-left corner of the display. These "scan lines" are synchronized by the horizontal sync pulse, or H-Sync, so that they all start at the same place on the display. The frame sync, or V-Sync, indicates when the scan is finished and when to start the next. This means the image is sampled at the frame rate, and any motion faster than 1/2 the V-Sync rate15 will produce "aliasing" in the reconstructed image.

Figure 3. Display raster with horizontal and vertical flyback time

In RS-170, the frame was split into odd and even fields, a process called interlaced scanning, to conserve bandwidth. Visually, this has the effect of re-sampling the displayed image faster, avoiding flicker without increasing the frame rate, and hence the bandwidth, in broadcast. The addition of a color subcarrier modified this sequence. In NTSC, the phase of the color subcarrier reverses every field, and in PAL, it indexes 90° per field. This gives rise to the 4- and 8-color-field sequences of the NTSC and PAL composite signals.
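The timing relationships above can be checked with simple arithmetic, using the well-known SDTV raster totals (525 total lines for NTSC at 30/1.001 frames/s, 625 total lines for PAL at 25 frames/s); the function name is ours.

```python
# Illustrative raster arithmetic: the H-Sync rate counts total lines
# (active plus "dead time"), and interlace splits each frame into
# two fields, doubling the vertical refresh without added bandwidth.
def raster_rates(total_lines, frame_hz, interlaced=True):
    """Return (line_rate_hz, field_rate_hz) for a raster."""
    line_hz = total_lines * frame_hz
    field_hz = frame_hz * (2 if interlaced else 1)
    return line_hz, field_hz

ntsc = raster_rates(525, 30 / 1.001)   # ~15734 Hz lines, ~59.94 Hz fields
pal = raster_rates(625, 25)            # 15625 Hz lines, 50 Hz fields
```

Note that the familiar 15734 Hz and 15625 Hz line rates fall out of the total line counts, not the active ones: the flyback "dead time" still consumes scan lines.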
Graphics uses progressive scanning, since the increased bandwidth isn't a problem. A side effect of the vertical sampling is that if you AC-couple a video signal, you must still have good square-wave response at the field rate (broadcast) or frame rate (graphics). If you don't, you'll get brightness variations across the raster. This can be seen with a vertically-split black-and-white screen pattern. Very large capacitors (>330µF) are required to maintain good square-wave response when AC-coupling an output, because of the 75Ω load.

Scan Conversion

The scanning method and rate vary among the different types of video. In order to share a display, the Multi-Sync™ concept was invented. Originally, these displays had a deflection system that could respond to the different rates by switching component values. As long as the display had sufficient resolution for the highest scan rates, this worked fine. It displayed each type of video in its native scanning format, but it can be expensive, since the display must be sized for the highest resolution and speed. The alternative is to scan the display at a constant rate and convert the incoming video to the display rate. This is called scan conversion. It allows the display to operate at a single resolution, making the deflection simpler. Scan conversion is best done in the digital domain, using dual-ported video RAM.

Video Groups and Specifications

NTSC: National Television System Committee. The US form of standard-definition TV.
PAL: Phase Alternating Line. The system of standard-definition TV implemented in Europe and elsewhere.
SECAM: Sequential Couleur avec Memoire. The French form of standard-definition TV.
ATSC: Advanced Television Systems Committee. The US form of high-definition TV (HDTV).
VESA: Video Electronics Standards Association. Proposes and publishes video standards for graphics.
ITU: International Telecommunication Union. Proposes and publishes video standards for broadcast in the EU.
SMPTE: Society of Motion Picture and Television Engineers. Proposes and publishes video standards for broadcast in the US.
JPEG: Joint Photographic Experts Group. Proposes and publishes standards for still images.
MPEG: Moving Picture Experts Group. Proposes and publishes video standards for broadcast.
EIA RS-170 & 170A: The original specs for monochrome and color TV in the US. Replaced by SMPTE 170M.
EIA 770-1: The US spec for Enhanced Component video, similar to ITU-R BT.1197/ETSI 300 294 for PAL Plus.
EIA 770-2: The US spec for Standard-Definition TV (SDTV) baseband component video.
EIA 770-3: The US spec for High-Definition TV (HDTV) baseband video.
ITU-R BT.470: Harmonized spec for SDTV worldwide, including NTSC, PAL, and SECAM.
ITU-R BT.601: Universal sampling spec for SDTV and HDTV broadcast video. Similar to SMPTE 125M.
ITU-R BT.1197/ETSI 300 294: Spec for PAL Plus enhanced TV in Europe.
SMPTE 125M: Similar to ITU-R BT.601.
SMPTE 170M: Replaced EIA RS-170A; color spec for NTSC.
SMPTE 253M: RGB analog video interface spec for SDTV studio applications.
SMPTE 274M: Component spec for 1920 x 1080 HDTV.
SMPTE 296M: Spec for 1280 x 720 RGB and Y'PbPr baseband video. Similar to PAL Plus.

Table 2. Graphic Standards and Active Resolutions
Choosing A Video IC

Tables 3 and 4 show large-signal bandwidth (2V p-p), slew rate, differential gain and phase, and supply voltage for Maxim's most popular video drivers, buffers, and receivers with single-ended and differential outputs. A special subset of the video driver is the video-distribution amplifier (see Table 5). Built to drive multiple loads, these offer higher isolation, selectable outputs, and fixed or settable gain, and are often used in professional equipment. Another subset of the video driver is the video mux-amp (see Table 6). Mux-amps combine a video multiplexer and a video line driver for routing video signals. Analog video filters may be used to eliminate many discrete components and save board space in video-reconstruction applications (see Table 7).

Table 3: Single-Ended Video Line Drivers and Buffers
Table 4: Differential Video Line Drivers and Receivers
Table 5: Distribution Amplifiers
Table 6: Video Mux-Amps
Table 7. Video Reconstruction Filters
Table 8. SCART Audio/Video Switches
Notes:
1. RS-170 was replaced by SMPTE 170M.
2. Cvbs usually means "composite video, with blanking and sound."
3. The native form is that in which the signal was created. Usually it is R'G'B', the gamma-corrected primaries.
4. NTSC is the National Television Systems Committee system of analog encoding.
5. PAL is the Phase Alternating Line system of analog encoding.
6. SECAM is the Sequential Couleur avec Memoire system of analog encoding.
7. Bandwidth versus video resolution.
8. The exact form and process information for terrestrial broadcast can be found in ITU-R BT.470.
9. Bandwidth versus video resolution.
10. The Y' component is often called "Luminance," and confused with the color-science term. We use the term Luma, and designate it with a prime, Y'.
11. Trademark of Sony Corp.
12. Trademark of JVC.
13. Total resolution is also called format resolution.
14. Bandwidth versus video resolution.
15. This is the Nyquist frequency of the image-sampling process.

September 2002