6 reasons not to buy an HDR monitor for a gaming PC
PC gaming is all about optimal performance, customizability, and graphical fidelity. If you can afford high-end hardware for the best possible experience, a high-refresh-rate HDR monitor seems like the obvious choice. Unfortunately, the HDR experience on a PC is vastly different from what you get on consoles.
Due to limited monitor options, below-average optimization, and high cost, HDR gaming on PC won't live up to your expectations. With the right screen and the right game, it can look great, but here are a few reasons why HDR might not be worth your while.
1. Limited screen options
You have a lot of options to consider when shopping for a new monitor. However, once you decide that you need a monitor with HDR, your options narrow considerably. True HDR requires a high contrast ratio (10,000:1 or higher), a peak brightness of at least 1,000 nits, and support for a wide color gamut.
There are plenty of HDR-enabled monitors on the market, but only a select few meet the minimum requirements we've just discussed. If you live in an area where it's hard to buy high-end hardware, you'll have a hard time finding a True HDR monitor that delivers the experience you've come to expect.
Of course, some monitors perform admirably when it comes to HDR gaming, but in some cases you might end up paying more for the monitor than for the entire PC.
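The tier requirements above can be sketched as a simple check. The 1,000-nit and 400-nit cutoffs come from the article (and roughly match VESA's DisplayHDR 1000/400 certifications); the `Monitor` class and `hdr_tier` function are purely illustrative, not a real API.

```python
# Sketch: classify a monitor against the "true HDR" thresholds described above.
from dataclasses import dataclass

@dataclass
class Monitor:
    peak_nits: int       # peak brightness in nits
    contrast_ratio: int  # e.g. 10_000 means 10,000:1
    wide_gamut: bool     # covers a wide color gamut (e.g. most of DCI-P3)

def hdr_tier(m: Monitor) -> str:
    """Rough buying-guide check, not a formal certification test."""
    if m.peak_nits >= 1000 and m.contrast_ratio >= 10_000 and m.wide_gamut:
        return "true HDR"         # meets all three requirements above
    if m.peak_nits >= 400:
        return "entry-level HDR"  # HDR400-class: brighter, but shallow HDR
    return "SDR"

print(hdr_tier(Monitor(1000, 10_000, True)))  # -> true HDR
print(hdr_tier(Monitor(450, 1_000, False)))   # -> entry-level HDR
```

As the check suggests, a monitor can legitimately advertise "HDR" while missing two of the three requirements that make HDR worthwhile.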
2. "Trap" HDR400
If you've bought an HDR monitor and don't like the experience, you've likely fallen into the HDR400 "trap". A convincing true HDR experience requires a peak brightness of at least 1,000 nits, yet many monitors on the market are only HDR400 certified. If the monitor also has a wide color gamut, HDR400 does give you somewhat better color reproduction and brighter highlights.
However, an HDR400 display falls flat compared to a true HDR10 display. This is because the HDR400 certification does not require local dimming, a feature that dims sections of the backlight to improve black depth.
Since HDR400 monitors lack this feature, they rely on cranking up the entire backlight to achieve that HDR image. This makes blacks look slightly gray, and the problem is especially noticeable on IPS panels. In the end, you get few of HDR's real improvements. For some games, a properly calibrated SDR (Standard Dynamic Range) monitor will look better than an HDR400 monitor.
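The effect of local dimming on black depth can be sketched with some illustrative numbers. The contrast and backlight figures below are hypothetical, and the simple division model is a deliberate simplification of how LCD backlight leakage works.

```python
# Sketch: why local dimming matters for black depth (illustrative numbers only).
LCD_CONTRAST = 1000.0  # assumed native IPS-like contrast ratio (panel-dependent)

def black_level(backlight_nits: float) -> float:
    """Darkest pixel an LCD can show at a given backlight level (simplified)."""
    return backlight_nits / LCD_CONTRAST

# Global backlight (HDR400-class): the whole panel stays bright for highlights,
# so "black" pixels still leak light and look gray.
print(black_level(600.0))  # -> 0.6 nits of leakage

# Local dimming: a zone with no highlights can drop its backlight,
# pushing its blacks much closer to zero.
print(black_level(12.0))   # -> 0.012 nits
```

Without per-zone dimming, every bright highlight forces the entire screen's black level up, which is exactly the washed-out look HDR400 owners complain about.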
3. Metadata problem
Even if the screen has a peak brightness of 1,000 nits and supports HDR10, the problems don't stop there. HDR10 uses static metadata, meaning the brightness and color information is set once for the entire piece of content rather than adjusted scene by scene. Getting that single calibration right across a whole game isn't easy for developers.
Monitors that support dynamic metadata formats such as HDR10+ handle this better: brightness and color can be adjusted on a frame-by-frame basis. This is why games with HDR support can look great on certain monitors but bad on others.
4. You'll spend more time fiddling with in-game settings
When HDR works, it works seamlessly and makes your gaming sessions more immersive. When it doesn't, it quickly becomes a source of frustration. You'll likely end up spending a lot of time in the game menu, fiddling with lighting and other graphics settings.
Beyond the games themselves, Windows has its own share of HDR bugs: color-rendering issues with certain monitors, an HDR toggle that doesn't always work, and SDR content that sometimes looks dimmer than it should. Some of these problems have been fixed, but Windows is inconsistent about patching such errors, so your mileage may vary.
5. Most games are not optimized for HDR
One of the many common myths surrounding PC gaming is that you need high-end hardware to fully enjoy it. In reality, most people run low- or mid-range systems that game comfortably at 1080p or 1440p. The premium market is smaller than you'd think, and the HDR gaming audience is smaller still.
Developers don't want to spend a lot of time or budget creating an experience that only a few people can enjoy. As a result, the quality of HDR varies from game to game: one title may look great while another looks worse than SDR. By comparison, the HDR experience is often better on consoles because those games are developed with specific hardware in mind.
6. You need a powerful PC
Obviously you care about the picture if you're thinking of buying an HDR monitor. This means turning all in-game graphics settings to maximum and enabling HDR when available.
Now, while HDR itself doesn't require any extra graphics power, gaming at maximum settings requires high-end hardware.
The problem is that high-end hardware is not accessible to most people. You can buy a cheap or mid-range PC and pair it with an expensive HDR monitor, but that doesn't make much sense. You should spend more money on performance rather than on a feature that only works well with certain games.