Do I Need 120 Hertz HDMI Cables?
One of the most common questions we hear from our customers these days is some variant on this: "Do I really need a 120Hz HDMI Cable?" In consumer electronics stores across the country, shoppers are being told that their new 120 Hertz displays will not work properly, or will not work optimally, without a cable designed for 120 Hertz. Let's address this question two different ways, beginning with the short answer and following with the long answer and explanation:
Q. Do I need a 120 Hz HDMI cable?
The Short Answer:
A. No. In fact, there's no such thing.
The Long Answer...
All right, here's the long answer. We'll start by talking about this whole "Hertz" thing to make sure we are all speaking the same language.
What the Heck is Hertz?
The Hertz is a unit of frequency, named after Heinrich Hertz, one of the pioneers of radio, who discovered what were once called "Hertzian waves." A Hertzian wave is a wave of electrical energy, and it can propagate invisibly through the air--it is, in other words, what we now call a radio wave. Hertzian waves alternate in direction, with the electric field rapidly oscillating from positive to negative and back again. The unit "Hertz" measures how often a wave cycles through its whole positive/negative swing: one Hertz is one complete cycle per second. This unit is used to measure not only radio waves, but any periodic wave, like the current in your electrical supply lines. In America, the power runs at 60 Hertz, so if you could watch the voltage on your incoming power, you'd see it swing up and down sixty times every second.
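If it helps to see the arithmetic, here is a minimal sketch in Python (the variable names are our own, purely for illustration) of the relationship between a wave's frequency and the duration of one cycle:

```python
# One Hertz is one complete cycle per second; the period of a wave
# is simply the inverse of its frequency.
line_frequency_hz = 60                  # U.S. AC line frequency

period_ms = 1000 / line_frequency_hz    # duration of one full cycle, in ms
print(f"{line_frequency_hz} Hz power completes one full cycle every {period_ms:.1f} ms")
```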
Hertz and the Golden Age of Television
Television originated in a non-solid-state, analog world. Anyone of a certain age in America remembers going down to the store with their parents and using a tube tester to try to figure out which of the several vacuum tubes in the television set had blown, because televisions relied upon a series of vacuum tubes to do all the work of bringing in a signal, extracting the video and audio, and delivering that to the screen and speakers. In this simpler, analog world, with no transistors to make electronics cheaper and more compact, one challenge in receiving a television signal was to figure out how to synchronize the television receiver to the television transmitter. If the television transmitter sent out a certain number of frames of video per second, and the television receiver was not running at quite the same speed--say, the transmitter was running at 30 frames per second and the receiver at 29 and a half--the result was a mess. But a highly stable oscillator circuit fixed tightly to a reference frequency was not a cheap thing to include in every television set.
The solution to this problem was to use, as a frequency reference, the one rock-solid reliable reference frequency which every television owner had in the home: the AC line current. On each full cycle of the AC line, a television transmitter would send out one field of video, and every television set was designed to use that reference frequency and look for video to come in at just that rate. Since sixty frames per second was more than was needed to generate smooth action, the standard was made "interlaced," with one frame of video being composed of two fields, one of the "odd" lines and one of the even. Sixty Hertz current became sixty fields of video, or thirty frames per second.
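To put numbers on the field/frame arithmetic, here is a small illustrative sketch in Python (the names are our own, not any broadcast standard's):

```python
# Interlaced video splits each frame into two fields: one carries the
# odd-numbered scan lines, the other the even-numbered lines.
ac_line_frequency_hz = 60   # one field transmitted per AC cycle
fields_per_frame = 2        # odd-line field + even-line field

frames_per_second = ac_line_frequency_hz / fields_per_frame
print(f"{ac_line_frequency_hz} fields/s = {frames_per_second:.0f} full frames/s")
```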
Because the Hertz is a handy unit of frequency, it came to be applied to other phenomena with regular frequency, such as the rate at which frames or fields are fed to a television monitor. Accordingly, televisions, computer monitors, and the like are often said to have a refresh rate of some number of Hertz--30 Hertz, 60 Hertz, 72 Hertz, and so on.
Refreshing Crystal Light - No, Not the Drink
In that not-so-distant-past analog world of which we spoke above, the 60 Hertz wall current was used to time the movements of magnetic fields within a Cathode Ray Tube -- CRT for short -- which dragged a beam of electrons across our television screens to illuminate phosphors, making a television picture. But today, of course, there are new display technologies. One of these which is particularly relevant to our discussion here is the LCD, or Liquid Crystal Display.
In an LCD screen, instead of scan lines like we see on a CRT, there are individual, separately addressable pixels which are colored and lit by delivering electrical charges to them. LCD is, of course, not the only display technology that works in this fashion; the other principal type on the home theater market is the plasma screen.
Whether or not you've been around on this earth long enough to have replaced the vacuum tubes in your television set, you've certainly been around long enough to remember that when LCD monitors for home theater use began to hit the market, they were plagued with one very serious drawback: noticeable "latency." LCDs could not, for a couple of reasons, respond quickly to image changes, and the result was that in fast-changing areas of the screen, transitions were slowed down; there might be a blur, for example, behind a moving object as the image of the object was being replaced by the image of the background. LCDs have, of course, become much better in this respect than they once were. The latency issue is the reason you're not seeing a lot of non-LCD 120 Hertz displays--a plasma screen could be refreshed 120 times per second, too, but there's no compelling reason to do so.
One of the methods LCD manufacturers have used to defeat latency (and to do other things, too--notably, to aid in rendering 24 frame-per-second film) is to refresh the screen more frequently. The more often the image is overwritten, the less latency. Existing American video sources generally run at 30 or 60 frames per second, and film is usually shot at 24 frames per second. Because 120 is a whole-number multiple of each of these (5 × 24, 4 × 30, 2 × 60), 120 frames per second makes a sensible choice for this faster refresh rate: it allows each frame from any of those sources to be repeated on the screen a fixed number of times. Whether the display is being fed at 24, 30, or 60 frames per second, it can simply multiply the frames, refresh at the same 120 Hertz rate, and reduce latency in the image. And, because the refresh rate of a display is often spoken of in "Hertz," this gave rise to the "120 Hertz" display.
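Here is that multiplication spelled out in a short Python sketch (purely illustrative; the names are our own):

```python
# 120 is a whole-number multiple of every common source frame rate, so a
# 120 Hz panel can show each incoming frame an integer number of times.
panel_refresh_hz = 120
source_rates_fps = [24, 30, 60]   # film, interlaced video, progressive video

for rate in source_rates_fps:
    repeats = panel_refresh_hz // rate
    print(f"{rate} fps source: each frame displayed {repeats} times")
```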
But What About the Source and the Cable?
The fact that the refresh rate of the display is 120 Hertz may be, for the reasons above, a great thing--less latency, smoother pulldown of 24 fps sources. But it does give rise to a misconception about how the system works, and this misconception is being used by some, unfortunately, to market high-priced HDMI cables.
Anybody familiar with HDMI will know that the bandwidth demand placed upon the cable is a function of the bitrate flowing through that cable. The bitrate, in turn, is a function of the resolution, frame rate, and color depth of the picture. The argument here is obvious enough: a 120Hz signal has double the frame rate of a 60Hz signal, and therefore needs double the bandwidth. That seems simple and straightforward enough, and it would be true but for one thing: it rests on a critical, incorrect, unstated assumption.
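To put rough numbers on the bandwidth argument, here is a back-of-the-envelope sketch in Python. It is purely illustrative--the function name is our own, and it ignores HDMI's blanking intervals and encoding overhead, so real-world figures run higher--but it shows why doubling the frame rate on the cable would double the bits:

```python
# Very rough uncompressed video bitrate: pixels per frame
# x bits per pixel x frames per second.  (Real HDMI links also carry
# blanking intervals and encoding overhead, so actual rates run higher.)
def approx_bitrate_gbps(width, height, bits_per_pixel, fps):
    return width * height * bits_per_pixel * fps / 1e9

print(f"1080p/60:  ~{approx_bitrate_gbps(1920, 1080, 24, 60):.1f} Gbps")
print(f"1080p/120: ~{approx_bitrate_gbps(1920, 1080, 24, 120):.1f} Gbps (double)")
```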
The incorrect assumption here is that the new doubled refresh rate is transmitted over the cable. It's not. Your cable needs to handle the frame rate which passes through the cable, but it doesn't care what the frame rate at any other point in the process is. If the cable is carrying a 60 Hz frame rate, and the display doubles that to 120 Hz to refresh the screen twice as often, your cable only "sees" 60, not 120. The bandwidth demand placed on the cable has to do with the signal coming from the source and into the display--what the display may do with that signal internally, after it has passed through the cable, has nothing to do with the load on the cable.

Nobody feeds video at 120 Hz, because it doesn't make any sense to do so--when the original content is not recorded at 120 frames per second, there's no gain to be had in sending each frame multiple times to the display, and it would make the sending and receiving chipsets costlier while making the whole interface less reliable due to the increased bandwidth demand placed on the cable. In fact, most (perhaps all) of the "120 Hertz" displays on the market cannot and will not accept an input signal with a 120 Hertz frame rate. Read that last sentence twice if you're still confused.
Whether your display's internal refresh rate is 120 Hertz or some other rate, the signals coming in are running at frame rates determined by the sources of those signals. This typically means 30 Hertz for interlaced formats like 1080i, 60 Hertz for progressive formats like 720p or 480p, or 24 Hertz for certain players that support 1080p/24. Those signal frame rates, not your display's internal refresh rate, are what your cable must handle. If a salesman is trying to push that monstrously expensive "120Hz HDMI cable" into your hands--probably at a buck or more per Hertz--it's time to keep your wallet in your pocket.
by Kurt Denke
President, Blue Jeans Cable