Navigating the wide world of monitor and display technologies only seems to get more difficult as time and technology progress. You've likely seen people touting the benefits of higher refresh rate monitors for a while now, and even Sony and Microsoft have been getting on the high refresh rate bandwagon with their latest consoles.
On top of expecting you to keep track of a litany of different resolutions and HDR technologies, most companies also expect you to know which refresh rate you're after. So let's break down exactly what the difference is between two of the major refresh rates you'll see out there. But first, some terminology.
What is a refresh rate?
A display's refresh rate is simply how many times a display (TV or monitor) refreshes its image per second, measured in hertz (Hz). So a 60 Hz monitor refreshes the image on screen 60 times per second. How often the screen refreshes its own image is separate from how often the source (a game console or PC) sends new frames; the screen always refreshes with the latest frame its source has delivered.
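To put that in more concrete terms, every refresh has a time budget. Here's a quick back-of-the-envelope calculation (a minimal Python sketch, just to show the math) of how long each frame stays on screen at the two refresh rates we're comparing:

```python
# How long each refresh stays on screen, in milliseconds.
def frame_time_ms(refresh_rate_hz: float) -> float:
    return 1000.0 / refresh_rate_hz

for hz in (60, 144):
    print(f"{hz} Hz -> {frame_time_ms(hz):.1f} ms per refresh")

# 60 Hz -> 16.7 ms per refresh
# 144 Hz -> 6.9 ms per refresh
```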
Related: 2021’s 6 Best 4K Gaming Monitors
What are AMD FreeSync and Nvidia G-Sync?
Two major technologies to be aware of when it comes to refresh rate are AMD FreeSync and Nvidia G-Sync. Both are names for technologies from their respective companies that do the same thing: maintain a variable refresh rate. A variable refresh rate simply means that a display can raise (up to its maximum rating in hertz) or lower its refresh rate to match the changing frame rate of whatever content is being fed to it. In other words, when the game you're playing dips to 60 FPS, your 144 Hz monitor can lower its refresh rate to match, as long as it supports a variable refresh rate technology.
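If it helps to see that idea in code, here's a rough Python sketch. The 48-144 Hz range is just an assumed example for a typical 144 Hz FreeSync or G-Sync panel, not an official spec, and real displays handle very low frame rates with extra tricks like repeating frames:

```python
# Simplified illustration of variable refresh rate behavior: the panel
# matches its refresh rate to the incoming frame rate, clamped to the
# range it supports (example values, not a real spec).
def effective_refresh_hz(content_fps: float,
                         vrr_min_hz: float = 48.0,
                         vrr_max_hz: float = 144.0) -> float:
    return max(vrr_min_hz, min(content_fps, vrr_max_hz))

for fps in (160, 144, 90, 60):
    print(f"game running at {fps} FPS -> panel refreshes at "
          f"{effective_refresh_hz(fps):.0f} Hz")
```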
It might sound counterintuitive to lower your refresh rate, but it solves the problem of screen tearing. Screen tearing occurs when your display's refresh rate is out of sync with the frame rate of the content being fed to it. The result is a very distracting line running through the image that often makes it look like two frames split down the middle. You've likely noticed this in a game before without realizing what it was.
Previously this was solved by a technology called V-Sync, which you'll usually find in the settings menu of PC games. The issue with V-Sync is that it introduces a small amount of input lag to maintain visual fidelity, which is a deal breaker for people who enjoy esports or competitive titles. Depending on the implementation, it can also cause your game to run slightly worse as it works to keep your monitor in sync. Adaptive refresh rates solve this issue almost entirely.
So which is better, 60 Hz or 144 Hz?
In general, higher is almost always better when it comes to refresh rates. Since the refresh rate of your monitor defines the upper limit of FPS it can display, it's nice to have that speed limit as high as you can get it. A 60 Hz monitor will only be able to display a maximum of 60 FPS, whereas a 144 Hz monitor will be able to display a maximum of 144 FPS.
But keep in mind that, regardless of your monitor's refresh rate, you're going to have to contend with occasional tearing unless it supports adaptive refresh rates. In practice, a lot of people have trouble spotting higher refresh rates by eye; you're more likely to notice that a game feels smoother rather than any clear visual difference. If you like fast-paced games, seek out a higher refresh rate; if you mostly play slower titles, 60 Hz is probably fine. In both cases, though, a monitor with adaptive sync (FreeSync or G-Sync) is probably a better upgrade than a higher refresh rate on its own.
Why does it matter?
A higher refresh rate can display higher frame rates, which results in a smoother, more responsive gaming experience. If you play slower or turn-based titles, you probably won't notice much difference between 60 FPS and 144 FPS. But those who enjoy very fast-paced titles will undoubtedly be able to feel a difference, even if it's a difference in feel rather than looks.
Refresh rates and frame rates are becoming a larger focal point as computers, graphics cards, and game consoles continue to progress. Sony and Microsoft both support 120 Hz refresh rates on the PlayStation 5 and Xbox Series X, and high refresh rate capable graphics cards have been on the market for years. So even if your current system is older and can't push higher frame rates, it's still worth considering a higher refresh rate monitor or television so that your next device can take advantage of it without forcing another display upgrade. It's often worth spending a few extra dollars now to get a display that you can use for years to come. With console manufacturers pushing so hard for high, stable FPS, it's likely that the rest of the tech industry will keep pushing for the same.
It's also important to note that 60 Hz and 144 Hz are not the only refresh rates you're likely to see out there. Technically a refresh rate can be whatever the manufacturer wants it to be, but the most common ones are 60 Hz, 75 Hz, 144 Hz, 165 Hz, and 240 Hz. Keep in mind that diminishing returns kick in, and unless you're the more discerning type you're not likely to notice much difference above 144 Hz.
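That point about diminishing returns is easier to see in frame times. Using the same quick math from earlier (a sketch of the numbers, not a measurement of what anyone can actually perceive), the jump from 60 Hz to 144 Hz shortens each frame by far more than the jump from 144 Hz to 240 Hz does:

```python
def frame_time_ms(hz: float) -> float:
    return 1000.0 / hz

print(f"60 -> 144 Hz: frames get {frame_time_ms(60) - frame_time_ms(144):.1f} ms shorter")
print(f"144 -> 240 Hz: frames get {frame_time_ms(144) - frame_time_ms(240):.1f} ms shorter")

# 60 -> 144 Hz: frames get 9.7 ms shorter
# 144 -> 240 Hz: frames get 2.8 ms shorter
```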
So in the end, like so many other hardware recommendations, the answer boils down to: it depends. But in general it's always worth going for the hardware within your budget that will remain viable for the longest amount of time. Monitors with higher, variable refresh rates will provide a much longer useful life than their lower, fixed refresh rate siblings, and may also help you avoid another upgrade in a year or two.
For more hardware recommendations check out 2021's Best Gaming Headsets Under $60 | Best Budget Gaming Headphones and 2021's 5 Best Streaming Microphones.