Welcome to Pixels & Bits, where the staff at Gaming Nexus takes a weekly look at the audio and video products (as well as related gear) that enhance the gaming experience. In this serialized article, we discuss audio and video products, accessories, and opinions on how these work within the confines of gaming. In this week’s article, Sean covers the different refresh rates on televisions and how they impact gaming.
There are so many factors to consider when it comes to purchasing a new HDTV that it’s almost impossible to list them all without spending a lot of time thinking about it. Over the past couple of years, the list has dwindled a bit thanks to the innovations of the industry. While most people argue over which types of televisions deliver the best picture (stay tuned for an article on that), one of the most important aspects a gamer has to take into consideration is refresh rate.
What exactly is a refresh rate?
The refresh rate is the number of times per second that a display redraws the image on screen. In common speak, refresh rate ultimately determines how quickly a pixel on a television can change what it displays in response to motion, color, and action.
What are the most common refresh rates?
In today’s world of televisions, there are five rates that come into play: 60Hz, 120Hz, 240Hz, 480Hz, and 600Hz. To simplify things a bit further, 60Hz and 480Hz sets are becoming few and far between, just due to the way the industry has moved. LCD (Liquid Crystal Display) and LED (Light-Emitting Diode) televisions are primarily in the 120-240Hz range, while Plasma technology is almost entirely at 600Hz in today’s models, though a couple of low-end Plasmas still list 480Hz as their refresh rate.
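Since a refresh rate is just redraws per second, the time each refresh stays on screen works out to 1000 / Hz milliseconds. A quick sketch of that arithmetic (the function name here is ours, purely for illustration):

```python
# Time each refresh stays on screen, in milliseconds: 1000 / Hz.
def frame_interval_ms(hz):
    return 1000.0 / hz

# The five common rates discussed above:
for hz in (60, 120, 240, 480, 600):
    print(f"{hz:>3}Hz -> {frame_interval_ms(hz):.2f} ms per refresh")
# A 60Hz set redraws roughly every 16.67 ms; a 600Hz set every 1.67 ms.
```

In other words, each step up the ladder halves (or better) the time a single frame lingers on screen, which is where the smoother feel comes from.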
Breaking down the actual refresh rates.
To know just what a gamer is getting into when purchasing a television, it’s best to settle on a specific size and type of television before getting into refresh rates, simply because some rates are only available in larger sizes. While the entire subject can seem a bit daunting, the breakdown makes things much easier on those who need the information.
60Hz
This is the lowest level that one will find in today’s industry. When LCD televisions first hit mass production, 60Hz was seen as the staple for everyday use, and the rate actually goes back well before the advent of HDTVs: producers as far back as the 1930s didn’t have much of a choice due to the limitations of tube-style televisions. At this refresh rate, motion on a television moves just as it did before higher resolutions came into play. Truthfully, when 120Hz first launched, many people complained that the images didn’t seem normal because our minds were so used to the way a television looked at 60Hz. Simply put, 60Hz will give a decent picture for gaming, but LCDs at this refresh rate suffer from the dreaded motion blur, where pixels cannot keep up with the amount of data being requested because the on-screen motion is too fast. FPS games cause motion blur the most, and while low-level LCDs are rough with this, the now hard-to-find DLP technology was also quite brutal with it. DLP (Digital Light Processing, essentially projection technology) survives mainly in projector units, and motion blur is hard to escape there without dropping a lot of money on a high-end model.
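To see why fast motion is where blur shows up, consider how far a moving object jumps between two consecutive refreshes. A rough back-of-the-envelope sketch (the pan speed below is a made-up number, purely for illustration):

```python
# Distance an object travels between two refreshes: speed / Hz.
# Bigger per-refresh jumps are where slower panels show blur and judder.
def pixels_per_refresh(speed_px_per_sec, hz):
    return speed_px_per_sec / hz

# A hypothetical FPS camera pan sweeping 2400 pixels per second:
for hz in (60, 120, 240):
    print(f"{hz}Hz: {pixels_per_refresh(2400, hz):.0f} px jump per refresh")
# At 60Hz the image jumps 40 px between refreshes; at 240Hz, only 10 px.
```

The faster the pan, the bigger the jump each frame has to cover, which is exactly why FPS titles punish low refresh rates the hardest.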
120Hz
120Hz hit the consumer market in the mid-2000s as the next step in the evolution of LCD televisions. As previously mentioned, it initially saw a lot of pushback because of the unnatural feel of the images being displayed: when an image scrolls fast enough, the smoothing kicks in and almost changes the way the picture looks. In gaming, 120Hz definitely helps as opposed to 60Hz, but it can still suffer from some motion blur on very quick movements. Most LCDs now ship with this as the default refresh rate, and the smoothing effect can actually be turned off for those who don’t like the look of it. When LEDs hit the market, this refresh rate was the set standard for the technology before it evolved to the next level. Most gamers will be comfortable at this level, especially with the crisp picture of an LED television.
240Hz
I really feel that, if someone truly wants the best picture quality with no issues from LCD and LED technology, this is the refresh rate to get. 240Hz basically takes the guessing game out of the equation. With the ability to drop down to any of the lower refresh rates, 240Hz televisions have incredible flexibility with their picture quality. Gaming sessions will have zero issues with motion blur at this speed, and the set will give one of the most accurate pictures a gamer could want. Essentially, at this refresh rate, pixels are able to refresh themselves quickly enough to keep up with the image: side scrolls, panning, and quick motions see zero loss on the display.
480Hz
This refresh rate is actually older than 120Hz and 240Hz, even though it is the higher number. 480Hz was what Plasma technology boasted for most of the 2000s before getting the kick up to its current standard of 600Hz. Plasmas suffer from essentially zero motion blur because the technology is not limited by liquid crystals; they can run at these higher refresh rates yet maintain an image that doesn’t look unnatural to the human eye. LED TVs are just now coming out at this refresh rate as well, in an attempt to phase out the lower refresh rates, which at this point are effectively obsolete.
600Hz
This is the current standard for just about every Plasma TV on the market. Plasmas still deliver the most accurate black levels and, as expected, suffer from no motion blur at all. The difference between high-end LCD/LED TVs and Plasmas is that Plasmas really only run at this one refresh rate and nothing else; the TV adjusts to the image on the screen and compensates to make sure the picture remains natural. Motion blur, of course, has no effect on a 600Hz set. Even going into larger sizes (most Plasmas can only be found at 42” screen sizes and larger) does not affect this, though making the image too large can distort the general quality of the picture, as we’ve discussed in previous Pixels & Bits articles.
So what does it all mean?
Simply put, the higher the refresh rate, the better off a gamer is going to be when it comes to the quality of play. Granted, some games do not need to worry about refresh rates as much as others. Role-playing titles, for example, are more than likely going to look the same on a 60Hz set as on a 600Hz one. Those who are into action and FPS titles will benefit the most from the higher rates. Image accuracy is a huge deal in these titles, so being able to pan quickly while trying to find an enemy is paramount, making a higher refresh rate that much more important. Depending on budget, one can usually find the refresh rate necessary at a price that works. Plasma technology is actually much cheaper than LED nowadays and can give that quick response without suffering on picture quality, but with LEDs starting to dominate the market with the brightness, sharpness, and clarity of their image, one might be wise to save up the extra money for a next-level LED with the top refresh rate possible. Either way, the options are there at most any price range and size, but you get what you pay for, as the old saying goes. Make sure this is an option you do not skimp on: get the higher refresh rate.
About the author
Sean Cahill has been on staff at Gaming Nexus since 2007. He specializes in console gaming, primarily Xbox 360, as well as PC hardware, A/V, and car audio accessories with ten years of experience. If you have a question or comment for Sean, please refer to the comments section below.