HDR is laughably bad for PC gaming in most cases, and we all know it’s true.
That might surprise you, though, if you were only considering how gaming monitors are advertised. After all, on paper, HDR is technically supported by your monitor, games, and graphics card. Heck, even Windows supports HDR relatively free of bugs these days.
So, who’s to blame then? Well, when I dug down deep for the answer, I found three major culprits that explain our current predicament. And even with some light at the end of the tunnel, this multifaceted problem isn’t about to just go away on its own.
The game problem
I need to start by laying the groundwork for why HDR is a problem specifically with PC games. It’s a highly variable experience depending on what display you have and what game you’re playing, which makes this whole HDR mess all the more confusing on PC. The big reason why is static metadata.
There are three main HDR standards: HDR10, HDR10+, and Dolby Vision. The latter two support dynamic metadata, which means they can feed the display tone-mapping information on a per-scene (even per-frame) basis, tailored to what the monitor is capable of. HDR10, on the other hand, only has static metadata: one set of values for the entire game, no matter what’s on screen.
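If you’re curious what “static” actually means in practice, here’s a rough sketch of how a Windows game hands HDR10 metadata to the display through Microsoft’s DXGI API. The helper name and the luminance numbers are illustrative, not from any real game, and the unit conventions follow Microsoft’s D3D12 HDR sample. The takeaway: the game sets these values once, and HDR10 has no way to update them scene by scene.

```cpp
#include <dxgi1_6.h>

// Rough sketch: attaching static HDR10 metadata to a DXGI swap chain.
// One set of numbers describes the entire game -- there is no per-scene
// or per-frame update, which is the core limitation of HDR10.
// Assumes 'swapChain' is an existing IDXGISwapChain4*.
void SetStaticHdr10Metadata(IDXGISwapChain4* swapChain)
{
    DXGI_HDR_METADATA_HDR10 metadata = {};

    // Chromaticity coordinates, normalized to 50,000 per the DXGI docs.
    // These illustrative values are the BT.2020 primaries with a D65 white point.
    metadata.RedPrimary[0]   = 35400; metadata.RedPrimary[1]   = 14600;
    metadata.GreenPrimary[0] = 8500;  metadata.GreenPrimary[1] = 39850;
    metadata.BluePrimary[0]  = 6550;  metadata.BluePrimary[1]  = 2300;
    metadata.WhitePoint[0]   = 15635; metadata.WhitePoint[1]   = 16450;

    // Mastering luminance, in units of 0.0001 nit (the convention used by
    // Microsoft's D3D12 HDR sample). 1,000 nits is a common mastering peak.
    metadata.MaxMasteringLuminance = 1000 * 10000;  // 1,000 nits
    metadata.MinMasteringLuminance = 1;             // 0.0001 nits

    // Content light levels, in whole nits.
    metadata.MaxContentLightLevel      = 1000;  // brightest single pixel in the game
    metadata.MaxFrameAverageLightLevel = 400;   // brightest average frame

    // Set once for the whole stream; this is what "static" means.
    swapChain->SetHDRMetaData(DXGI_HDR_METADATA_TYPE_HDR10,
                              sizeof(metadata), &metadata);
}
```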
Only a select few monitors support Dolby Vision, like Apple’s Pro Display XDR, and none of them are gaming monitors. There are a few HDR10+ monitors, but they’re limited to Samsung’s most expensive displays. The vast majority of monitors are stuck with static metadata. TVs and consoles widely support Dolby Vision, however, which is a big reason why console HDR is so much better than HDR on PC.
As former game developer and product manager for Dolby Vision Gaming Alexander Mejia points out, static metadata creates a big problem for game developers: “There are more and more HDR TVs, monitors, and laptops on the market than ever, but if you grab a couple from your local big-box retailer, your game is going to look drastically different on each one … How do you know that the look you set in your studio will be the same one the player sees?”
On my Samsung Odyssey G7, for instance, Tiny Tina’s Wonderlands looks dark and unnatural with HDR turned on, but Devil May Cry 5 looks naturally vibrant. Look up user experiences with these two games, and you’ll find reports ranging from the best HDR game ever to downright terrible image quality.
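To be fair, developers aren’t completely blind here. On Windows, a game can at least ask what the connected display claims to be capable of, though that’s a far cry from the per-scene adaptation dynamic metadata provides. Here’s a minimal sketch using DXGI; the helper name is mine, and it assumes you already have an IDXGIOutput6 for the monitor in question:

```cpp
#include <dxgi1_6.h>
#include <cstdio>

// Minimal sketch: querying what the current display claims it can do.
// Assumes 'output' is an IDXGIOutput6* for the monitor the game runs on.
void PrintDisplayHdrCaps(IDXGIOutput6* output)
{
    DXGI_OUTPUT_DESC1 desc = {};
    if (SUCCEEDED(output->GetDesc1(&desc)))
    {
        // Luminance values are reported in nits by the display driver.
        std::printf("Peak luminance:       %.0f nits\n", desc.MaxLuminance);
        std::printf("Full-frame luminance: %.0f nits\n", desc.MaxFullFrameLuminance);
        std::printf("Black level:          %.4f nits\n", desc.MinLuminance);

        // An HDR10-capable output advertises the BT.2020 / ST 2084 color space.
        bool hdr10 = (desc.ColorSpace ==
                      DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020);
        std::printf("HDR10 signaling:      %s\n", hdr10 ? "yes" : "no");
    }
}
```

Even then, the game only learns what the display reports about itself, which is exactly why Mejia’s question is so hard to answer.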
It doesn’t help matters that HDR is usually an afterthought for game developers. Mejia writes that developers “still need to deliver a standard dynamic range version of your game — and creating a separate version for HDR means twice as much mastering, testing, and QA. Good luck getting sign-off on that.”
There are numerous examples of developer apathy toward HDR. The recently released Elden Ring, for example, shows terrible flickering in complex scenes with HDR and motion blur turned on. Turn HDR off, and the problem goes away (even with motion blur still turned on). And in Destiny 2, the HDR calibration was broken for four years: HDTVTest found in 2018 that the slider didn’t map brightness correctly, and the issue was only fixed in February 2022 with the release of The Witch Queen expansion.
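The Destiny 2 bug is easier to appreciate with a bit of math. HDR10 encodes brightness with the SMPTE ST 2084 “PQ” curve, which maps luminance in nits to a 0-1 signal value, and a calibration slider is essentially asking where on that curve your display clips. Here’s a quick sketch of the encoding (the function name is mine; the constants come straight from the ST 2084 spec):

```cpp
#include <cmath>
#include <cstdio>

// SMPTE ST 2084 (PQ) inverse EOTF: converts luminance in nits (0 to 10,000)
// into the normalized 0-1 signal value an HDR10 display expects.
// The constants are defined by the ST 2084 specification.
float NitsToPq(float nits)
{
    const float m1 = 2610.0f / 16384.0f;          // 0.1593017578125
    const float m2 = 2523.0f / 4096.0f * 128.0f;  // 78.84375
    const float c1 = 3424.0f / 4096.0f;           // 0.8359375
    const float c2 = 2413.0f / 4096.0f * 32.0f;   // 18.8515625
    const float c3 = 2392.0f / 4096.0f * 32.0f;   // 18.6875

    float y = nits / 10000.0f;  // normalize to the 10,000-nit PQ ceiling
    float p = std::pow(y, m1);
    return std::pow((c1 + c2 * p) / (1.0f + c3 * p), m2);
}

int main()
{
    // A correctly mapped 1,000-nit calibration point lands at roughly 75%
    // of the PQ signal range. A slider that gets this mapping wrong (as
    // Destiny 2's did) either clips highlights or never reaches the
    // display's real peak.
    std::printf("1,000 nits -> PQ %.3f\n", NitsToPq(1000.0f));
    return 0;
}
```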
Games are a source of issues for HDR on PC, but they’re a downstream problem: one that stems from a gaming monitor market that seems frozen in time.
The monitor problem
Even with the numerous Windows bugs that HDR has caused over the past few years, monitors are the main source of HDR problems. Anyone in tune with display tech can list the issues without a second thought, and that’s the point: After years of HDR monitors flooding the market, displays are mostly in the same place they were when HDR first landed on Windows.
The conventional wisdom has been that good HDR requires at least 1,000 nits of peak brightness, which is only partially true. Brighter displays help, but only because they can drive higher levels of contrast, and contrast is simply peak brightness divided by black level. For example, the Samsung Odyssey Neo G9 is capable of twice the peak brightness of the Alienware 34 QD-OLED, but the Alienware display offers much better HDR because its near-zero black level gives it a vastly higher contrast ratio.
There are three things a display needs to achieve good HDR performance:
- High contrast ratio (10,000:1 or higher)
- Dynamic HDR metadata
- Expanded color range (above 100% sRGB)
TVs like the LG C2 OLED are so desirable for console gaming because OLED panels provide massive contrast (1,000,000:1 or higher). Most LED monitors top out at 3,000:1, which is not enough for solid HDR. Instead, monitors use local dimming — independently controlling the light on certain sections of the screen — to increase contrast.
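To see why zone count matters so much, here’s a deliberately simplified toy model of local dimming (entirely hypothetical, and nothing like any particular monitor’s actual firmware): each zone’s backlight can only take one value, so it has to be at least as bright as the brightest pixel it sits behind. Every dark pixel sharing a zone with a highlight gets lifted toward gray, and the fewer the zones, the bigger that washed-out halo.

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Toy model of local dimming (hypothetical, for illustration only).
// Each zone's backlight must be at least as bright as the brightest pixel
// it sits behind, so one highlight lifts every black in its zone toward gray.
std::vector<float> ComputeZoneBacklight(const std::vector<float>& pixelNits,
                                        std::size_t zoneCount)
{
    std::vector<float> zones(zoneCount, 0.0f);
    // Assume at least one pixel per zone for simplicity.
    const std::size_t pixelsPerZone =
        std::max<std::size_t>(1, pixelNits.size() / zoneCount);

    for (std::size_t i = 0; i < pixelNits.size(); ++i)
    {
        std::size_t zone = std::min(i / pixelsPerZone, zoneCount - 1);
        zones[zone] = std::max(zones[zone], pixelNits[i]);
    }
    return zones;
}
```

Feed that model a single bright star on a night sky: with eight zones, an entire eighth of the screen glows gray around it, while a 1,000-plus-zone panel confines the halo to a sliver.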
Even premium (above $800) gaming monitors don’t come with enough zones, though. The LG 27GP950-B only has 16, while the Samsung Odyssey G7 has an embarrassing eight. For a truly high contrast ratio, you need a lot more zones, like the Asus ROG Swift PG32UQX with over 1,000 local dimming zones — a monitor that costs more than building a new computer.
The vast majority of HDR monitors don’t even meet the bare minimum. On Newegg, for example, 502 of the 671 HDR gaming monitors currently available only meet VESA’s DisplayHDR 400 certification, which doesn’t require local dimming, expanded color range, or dynamic metadata.
Spending up for a premium experience isn’t new, but this has been the case for four years now. Instead of premium features becoming mainstream, the market has been flooded with monitors that can advertise HDR without offering any of the features that make HDR tick in the first place. And monitors under $1,000 that check those boxes usually cut corners to do so, with too few local dimming zones and shoddy color coverage.
There are exceptions, such as the Asus ROG Swift PG27UQ, that offer a great HDR gaming experience. But the point remains that the vast majority of monitors available today aren’t too different from the monitors available four years ago, at least in terms of HDR.
Light at the end of the tunnel
The HDR experience on PC has been mostly static for four years, but that’s changing thanks to some fancy new display tech: QD-OLED. As the Alienware 34 QD-OLED shows, this is the panel technology that will truly drive HDR in PC games. And the good news for gamers: you won’t have to spend north of $2,500 to access it.
MSI just announced its first QD-OLED monitor with identical specs to the Alienware one, and I suspect it’ll use the exact same panel. If that’s the case, we should see a wave of 21:9 QD-OLED monitors by the beginning of next year.
We’re seeing more OLED monitors, too, like the recently announced 48-inch LG 48GQ900. They’re TVs marketed as gaming monitors, sure, but display makers are clearly in tune with gamers’ demand for OLED panels. Hopefully, we’ll see some that come in the size of a proper monitor.
There are other display technologies driving better HDR performance, such as mini-LED. But QD-OLED is the seismic shift that will finally, hopefully, make HDR a reality for PC gaming.