
HDR Gaming in 2025: Why It’s Broken and How It Can Still Be Saved

by ytools

Almost a decade ago, HDR was sold to gamers as the next great visual revolution. When Sony prepared its mid-generation refresh of the PlayStation 4 – the console that would eventually launch as the PS4 Pro in November 2016 – the big buzzword was not just 4K, but High Dynamic Range.

HDR was supposed to change everything: brighter highlights, deeper shadows, richer colors, and a genuine sense of looking through a window rather than at a flat panel. Microsoft and AMD were already boasting that early user studies showed players were more impressed by HDR than by higher resolution. Unlike 4K, we were told, HDR would not tank performance. It sounded like a rare free upgrade in a world of trade-offs.

In those early years, the promise felt real. Developers like Naughty Dog and Playground Games openly gushed about HDR being an “enormous benefit” and something you could not unsee once you experienced it. NVIDIA paraded G-SYNC HDR monitors at CES 2017, promising buttery-smooth frames and eye-searing highlights. The HDR Gaming Interest Group (HGiG) brought together platform holders, TV manufacturers, and publishers in 2018 to align on best practices. On paper, the entire ecosystem looked ready: consoles, PCs, TVs, game engines, and middleware all circling around the same goal – making HDR gaming the new default.

Fast-forward to the end of 2025, and that dream feels strangely distant. We live in a world where it is hard to buy a TV that does not advertise HDR support, where mid-range OLED panels are creeping into living rooms, and where many premium displays can blast well beyond 1,000 nits of peak brightness. Yet some of the biggest games of the year still ship with no HDR at all, or with implementations so broken that players immediately turn the feature off. A technology that should have been invisible – just “how games look” – is instead a lottery ticket. Sometimes you win, often you don’t.

Few people understand both the promise and the failure of HDR gaming as well as Filippo Tarpini, a graphics programmer and color-pipeline obsessive who has quietly become one of the most influential figures in this niche. He helped shape HDR in Control and Alan Wake 2 at Remedy, built the Luma HDR mod for games like Starfield, and now runs a small company, Gamma Studios, whose entire mission is to rescue post-processing and HDR pipelines in modern games. Through his work and his modding community, he has seen the best and the worst of HDR – and he is brutally honest about where things stand.

The HDR revolution that never quite arrived

To understand how we ended up here, it is worth remembering why HDR felt so exciting in the first place. Standard Dynamic Range content – what we now casually call SDR – is effectively locked to a reference brightness of 100 nits and a Rec. 709 / sRGB color gamut. That was fine in the CRT era, but it wildly underutilizes what modern panels can do. HDR blows those limits open: highlights can hit 1,000 nits, 2,000 nits, or even more, while color spaces like Rec. 2020 allow far more saturated and nuanced hues than sRGB ever could.

On a good HDR display, sunlight reflecting off metal or water can feel genuinely uncomfortable to look at, in a good way. Neon signs and spell effects gain volume and intensity. Fog, smoke, and dusk scenes have soft gradations instead of chunky banding. When artists and engineers get it right, HDR does not simply make the picture “pop”; it adds depth, realism, and emotional weight to the image. Once you get used to a well-calibrated HDR game on a quality OLED or mini-LED display, it is genuinely hard to go back to flat SDR.

Crucially, all of this comes with little inherent performance cost. HDR is more about how you encode and map the final image than about increasing the number of pixels or ray-traced bounces. That is why so many early evangelists called it more impactful than 4K. A 1080p or 1440p image in well-implemented HDR can be far more impressive than a soft pseudo-4K image stuck in SDR. For a medium obsessed with cinematic spectacle, it seemed like an obvious win.
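To make that concrete, here is a minimal Python sketch of the PQ (SMPTE ST 2084) transfer function that HDR10 signals are built on, using the constants from the standard. The point is that HDR output is, at its core, a different encoding applied to the finished frame rather than additional rendering work – nothing here touches geometry, shading, or resolution.

```python
import numpy as np

# SMPTE ST 2084 (PQ) constants, as published in the standard.
M1 = 2610 / 16384        # ~0.1593
M2 = 2523 / 4096 * 128   # ~78.8438
C1 = 3424 / 4096         # ~0.8359
C2 = 2413 / 4096 * 32    # ~18.8516
C3 = 2392 / 4096 * 32    # ~18.6875

def pq_encode(nits):
    """Map absolute luminance in nits (0..10,000) to a PQ signal in 0..1."""
    y = (np.clip(np.asarray(nits, dtype=np.float64), 0.0, 10000.0) / 10000.0) ** M1
    return ((C1 + C2 * y) / (1.0 + C3 * y)) ** M2

# SDR reference white (100 nits) lands around 0.51 on the curve, and a
# 1,000-nit highlight around 0.75 - the rest is headroom for brighter panels.
print(pq_encode([100.0, 1000.0, 10000.0]))
```

Because the curve is perceptually spaced, a 10-bit signal can span the full 0 to 10,000-nit range without visible banding – which is why the hard part of HDR is mapping and calibration, not GPU cost.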

But there was a hidden catch: HDR sits at the very end of the rendering pipeline, in the world of tonemapping, color grading, and display encoding – a part of graphics that many studios had treated as a black box for decades. That blind spot is exactly where Tarpini chose to focus his career.

2025: when half of the biggest games still ignore HDR

On paper, HDR has never been more accessible. The overwhelming majority of mid-range and high-end TVs boast respectable peak brightness, decent contrast, and some flavor of HDR certification. OLED, in particular, has rapidly become the de facto standard for enthusiasts and increasingly for mainstream living rooms. Yet the software side has not kept pace.

Tarpini’s blunt assessment of 2025’s HDR landscape is sobering. Out of roughly 25 of the year’s biggest, most talked-about releases, he estimates that about half shipped with no HDR at all. Among the titles that do advertise HDR, most have serious issues: crushed blacks that erase shadow detail, raised blacks that make everything look washed out, skewed color hues, clipped highlights, bizarre calibration menus, or UI elements that blow out like a flashbang. In many of these games, you can instantly tell that SDR was the reference; HDR feels like a filter someone turned on at the end.

We have seen headline titles launch without HDR on day one, only to add it hastily in a later patch. Others – including some of 2025’s most anticipated RPGs and action games – still lack support entirely, even though they target premium platforms. Meanwhile, a few high-profile releases roll out with HDR modes that visibly break color grading compared to SDR, as if the art team carefully painted one version of the picture, and a separate team simply stretched it into HDR space with minimal oversight.

This is particularly painful for players who invested in high-end displays: OLED TVs or mini-LED monitors capable of 1,500 to 2,000 nits, with wide-gamut panels that can cover most of Rec. 2020. Those users paid for precisely the kind of hardware that shows HDR at its best, yet real-world experience keeps teaching them that HDR in games is unreliable at best. Among PC players, it is now common to see HDR dismissed as a gimmick or a “broken” feature – often because their first impressions came from a bad implementation on a mediocre monitor.

Ironically, this negative reputation is not because HDR is inherently flawed, but because so many studios treat it as optional decoration rather than a core part of how a game looks and feels.

Why developers keep treating HDR as an afterthought

From a distance, it is easy to ask why developers are so reluctant to embrace HDR in a world where HDR TVs are everywhere. The answer, as usual, comes down to a mix of economics, perception, and pipeline pain. Every game lives under hard constraints: limited budget, limited time, and a long list of features competing for attention. HDR, unlike gameplay systems or obvious marketing bullet points like ray tracing, is rarely seen as make-or-break.

According to Tarpini, one of the biggest structural problems is that games work “fine” without HDR. You can ship a visually impressive, critically acclaimed title that never outputs an HDR signal and still sell millions of copies. That makes HDR easy to postpone. Features that are clearly visible in screenshots and trailers – ray-traced reflections, shiny particle effects, volumetrics – tend to win internal battles for resources. GPU vendors actively court developers to implement those high-profile features, often providing engineering support or co-marketing. Nobody is knocking on the door simply to say, “Hey, let us help you do correct HDR encoding.”

On top of that, HDR has a perception problem among developers themselves. Many still think SDR is a perfectly standardized, predictable baseline and that HDR is the messy, inconsistent newcomer. The reality is the opposite. SDR as implemented in the wild has never been tightly standardized: brightness, gamma curves, and color characteristics vary wildly between displays. A lot of the headaches developers face when they first move to HDR are actually artifacts of long-standing SDR quirks that were never properly understood. When you leave the SDR bubble and start measuring actual HDR output on real panels, those mismatches become glaring.
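One concrete example of those quirks, sketched below under the assumption of idealized displays: the sRGB specification defines a piecewise decoding curve, but a large share of real SDR monitors and TVs decode with a pure 2.2 power curve instead. The two agree closely in midtones and diverge enormously near black – exactly where the “crushed” or “lifted” shadows that plague HDR conversions are born, because content authored against one curve gets displayed through the other.

```python
import numpy as np

def srgb_eotf(v):
    """Piecewise sRGB decode per IEC 61966-2-1: linear toe, then a 2.4 power."""
    v = np.asarray(v, dtype=np.float64)
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def gamma22_eotf(v):
    """What many real displays actually do: a pure 2.2 power curve."""
    return np.asarray(v, dtype=np.float64) ** 2.2

codes = np.array([0.05, 0.10, 0.20, 0.50])
# Near black the curves disagree by large factors (~2.9x at code 0.05),
# while midtones nearly match - shadow detail is where intent gets lost.
print(srgb_eotf(codes) / gamma22_eotf(codes))
```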

Game engines add another layer of confusion. On paper, both Unreal Engine and Unity can enable HDR output with a simple checkbox. In practice, that checkbox sits on top of a complex color pipeline that was often built for SDR first, then extended in ways that are technically valid but artistically brittle. It is not unusual for teams with little internal color-science expertise to flip the switch, see something that looks “off”, and either ship anyway or quietly disable HDR late in development.

Ultimately, everything depends on whether a studio has at least one person who truly cares about post-processing, tonemapping, and display encoding – someone who is willing to spend late nights chasing down subtle shifts in luminance and hue. That kind of specialist is still rare, and when they are missing, HDR is the first thing to suffer.

Meet the HDR fixer: Filippo Tarpini and Gamma Studios

Tarpini’s own path into this obscure corner of rendering is unusually focused. After years in mainstream game development, he realized that the last stage of the pipeline – the part between “the game renders a floating-point image” and “photons hit your eyeballs” – was both critically important and badly misunderstood. Tonemapping, color grading, and encoding were treated as a set of sliders and LUTs, not as a coherent engineering discipline. That gap between how games were authored and how they were actually displayed nagged him enough to become his main obsession.

His toolset for fixing that gap spans both official work and modding. At Remedy, he contributed to the much-praised HDR patch for Control, originally created as a personal project before being folded into the game. Later, he helped shape the HDR pipeline for Alan Wake 2, whose calibration system and visual consistency in HDR have been widely praised by display enthusiasts. These experiences convinced him that with the right approach, HDR can absolutely match and even elevate an artist’s original SDR intent.

Outside official studios, Tarpini’s name became widely known thanks to Luma, an open-source HDR modding initiative that retrofits modern HDR pipelines into games that either shipped with broken HDR or never supported it at all. Luma mods have been used to transform the look of titles like Starfield, Prey, and Hollow Knight: Silksong, often becoming the de facto recommended way to play those games on HDR displays.

That grassroots work eventually led to Gamma Studios, a small company he founded specifically to help teams “improve their post-processing pipelines, implement HDR output, and do modern color grading”. The studio consults with developers of all sizes and is also working with NVIDIA to integrate robust HDR support into the RTX Remix remastering platform. While many of his current projects are covered by NDAs, the pattern is clear: studios ship with shaky or nonexistent HDR, players complain, and somewhere down the line Tarpini is called in to clean up the mess.

Parallel to Gamma, he helps run HDR Den, a community hub that lives across Discord and Reddit. There, a small army of modders and display obsessives dissects games, shares measurements and test results, and collaborates on experimental fixes ranging from simple tone-curve tweaks to sophisticated shader patching systems. In some cases, community research has outpaced big studios: one of the group’s more ambitious achievements is a unified mod that repurposes existing TAA code paths in Unreal Engine 4 games to behave more like DLSS or FSR – a reminder that the frontier of image quality is often pushed by a handful of passionate individuals rather than massive R&D budgets.

When HDR actually works: the rare success stories

For all the frustration, there are games that showcase what good HDR can do. Tarpini frequently cites titles like Red Dead Redemption 2 and the recent Dead Space remake as examples that, while not perfect under a forensic 2025 technical eye, still deliver a coherent and compelling HDR experience. Their strength lies less in immaculate standards compliance and more in a strong artistic vision that carries through into HDR without collapsing.

A handful of Sony first-party games and some entries in the Call of Duty series also fare well, largely because their teams treat HDR as a first-class citizen. They test extensively on real hardware, tune calibration for different scenarios, and resist the temptation to diverge too far from the intended SDR look. The result is not necessarily a “wow” fireworks show – in fact, good HDR often looks understated – but it feels correct, grounded, and consistent.

Beyond those, HDR enthusiasts often point to more niche examples. Some players single out No Man’s Sky as a particularly striking implementation, with its alien skies, nebulae, and glowing flora making full use of saturated colors and brightness range without completely abandoning plausibility. Others highlight moments in games like Cyberpunk 2077, where neon-soaked cityscapes and harsh sunlit streets can genuinely feel transformative on a capable OLED.

What all these successes share is not a single magic setting, but a philosophy: HDR is treated as the reference output, not a post-launch checkbox. Artists sign off on it, engineers respect the intent, and calibration options are designed to guide players toward the correct look rather than dumping a dozen sliders on them and hoping for the best.

PC HDR: fake panels, broken pipelines, and user frustration

If HDR is inconsistent on consoles, it borders on chaotic on PC. This is where many of the most furious user complaints originate, and not without reason. Unlike consoles, where a fixed platform makes it feasible for Sony or Microsoft to enforce certain quality bars, PC is a wild west of panels, drivers, cables, OS quirks, and vendor-specific marketing terms.

For years, the market was flooded with so-called “HDR” monitors that technically met loose certification requirements but were fundamentally incapable of delivering meaningful contrast or brightness. LCD panels without local dimming, with peak brightness under 400 nits and raised black levels, were still sold under HDR badges that meant little in practice. On these displays, enabling HDR could make the image look worse than SDR – flatter, greyer, and more washed out. No amount of in-game slider tweaking could fix basic hardware limitations, yet players understandably blamed “HDR” as a concept.

Windows, meanwhile, has struggled to offer a user-friendly HDR experience. Desktop SDR content shown in HDR mode often looks off: gamma is wrong, colors drift, and some apps behave unpredictably. Auto HDR, Microsoft’s attempt to algorithmically upgrade SDR games to HDR, can sometimes produce impressive results, but it also adds another layer of unpredictability to a pipeline that many users barely understand. When you combine questionable monitors, shaky OS behavior, and games that were never authored with HDR in mind, you get exactly the kind of horror stories that dominate forum threads.

This is why you will find PC players angrily dismissing HDR as a proprietary gimmick, a “10-bit rebrand” pushed by standards bodies and cable manufacturers more interested in licensing fees than in visual fidelity. Some call it a “rich people feature” that only truly shines on expensive OLED TVs that most enthusiasts still do not own. Others frame the debate as a clash between the open, DIY ethos of PC and what they perceive as closed, curated, proprietary ecosystems of console and TV vendors.

Tarpini does not deny that the PC situation is messy. But he argues that the answer is not to abandon HDR, but to set clearer expectations. On genuinely bad “fake HDR” displays, the correct move is not to twist HDR calibration until the image looks halfway acceptable; it is to accept that SDR is the better choice and let users stick with it. HDR should be reserved for hardware that can show meaningful improvement. When developers try to support every possible panel and scenario, they often end up compromising the experience for everyone.

Standards, HGiG, Dolby Vision, HDR10+: help or hindrance?

One of the biggest sources of confusion around HDR is the alphabet soup of formats and labels. HDR10, HLG, Dolby Vision, HDR10+, HDR10+ Adaptive, HDR10+ Advanced – the names alone are enough to make many players throw their hands up. It is tempting to assume that more advanced formats like Dolby Vision automatically guarantee a better gaming experience. In practice, they are rarely used in games at all, especially on PC.

Tarpini sees potential in these advanced formats but views them primarily as attempts to simplify rather than fundamentally reinvent HDR. Dolby Vision and HDR10+ offer dynamic metadata that can theoretically help displays adapt more intelligently to content, but they also add complexity for developers and are limited by compatibility. In a world where many studios barely get static HDR10 right, expecting them to embrace more advanced, proprietary ecosystems is optimistic at best.

The more important, and often misunderstood, piece of the puzzle for gaming is HGiG. Rather than being a fancy new format, HGiG is essentially a passive mode: it tells compatible TVs to stop doing their own tone-mapping and instead reproduce the game’s signal as faithfully as possible. The idea is simple: let the console or PC handle tone-mapping with full knowledge of the game’s artistic intent, and let the display act like a neutral canvas.

Paradoxically, calibrating HDR gaming in this scenario can be very straightforward. A well-designed game only needs a couple of sliders – usually something like overall brightness and UI brightness – plus a way to query the display’s peak brightness from the OS. That is exactly what Alan Wake 2 does, and what many of Tarpini’s mods replicate. Problems start when developers overthink it. Some games expose half a dozen HDR-specific sliders for exposure, shadows, highlights, contrast, and saturation, with no clear indication of what the “intended” look actually is. In many cases, the calibration screen itself is rendered in SDR or uses images that do not accurately reflect how the game will respond to each setting.
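As a rough illustration – not Remedy’s actual implementation – the whole game-side output stage under HGiG can be as small as the sketch below. The display’s peak is assumed to come from the OS (Windows, for example, reports a per-output MaxLuminance through DXGI), paper white is the user’s overall-brightness slider, and a smooth rolloff compresses highlights toward the peak instead of clipping them, since an HGiG display will not rescue out-of-range values.

```python
import numpy as np

def hgig_output(scene_l, peak_nits, paper_white=200.0, knee=0.75):
    """Toy HGiG-style tone mapping: the display does no tone mapping of its
    own, so the game must never send values above peak_nits.

    scene_l:     scene-referred luminance, 1.0 = diffuse white
    peak_nits:   display peak as reported by the OS (assumed available)
    paper_white: user slider mapping diffuse white to a comfortable level
    knee:        fraction of peak below which the image is left untouched
    """
    nits = np.asarray(scene_l, dtype=np.float64) * paper_white
    k = knee * peak_nits                  # start of the highlight rolloff
    span = peak_nits - k                  # headroom between knee and peak
    over = np.maximum(nits - k, 0.0)
    # Identity below the knee; rational rolloff above it that approaches
    # peak_nits asymptotically with a continuous slope at the knee.
    return np.minimum(nits, k) + span * over / (over + span + 1e-9)

# A 10x-over-white highlight on a 1,000-nit panel lands just under peak
# instead of clipping; everything below 750 nits passes through untouched.
print(hgig_output([0.5, 1.0, 10.0], peak_nits=1000.0))
```

A UI-brightness slider fits the same model: HUD white gets pinned to its own fixed nit level rather than riding the scene curve, which is why well-behaved games expose exactly those two controls and little else.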

These hyper-complex menus are often born from the mistaken belief that HDR displays vary so wildly that users must be given total control, when in fact modern HDR TVs are generally more accurate to the source than consumer SDR displays ever were. The result is the worst of both worlds: gamers are overwhelmed, settings get mangled, and online guides sprout up explaining how to “fix” HDR that was broken by design.

Color grading chaos: how SDR pipelines sabotage HDR

Behind every game’s final image is the work of artists: lighting teams, environment artists, VFX specialists, and colorists who shape mood, readability, and style. Almost all of that work still happens in SDR. The tools they use – from DCC applications to grading tools and reference monitors – are overwhelmingly SDR-centric, a legacy of decades of broadcast and film workflows built around Rec. 709.

That would not be a problem if HDR were treated as the new reference and SDR as a derived fallback. Instead, in most studios, HDR is a bolt-on. The art team signs off on the SDR look, the game approaches shipping, and the engine or rendering team is tasked with adding an HDR output path that loosely preserves the original artistic intent. Often, that addition is driven by technical engineers with little day-to-day contact with the artists who crafted the original look. The result is a disconnect: the HDR version looks different enough to bother those artists, yet the difference is never deemed important enough to justify delaying the ship date.

In some cases, art leadership simply vetoes HDR shipping at all, especially in Unreal Engine projects where the HDR path does not cleanly match the approved SDR grading. In others, they reluctantly accept an HDR mode that diverges from their vision because marketing wants an HDR badge on the box, or because platform holders strongly encourage it. Either way, the artistic intent is compromised.

Part of the problem is educational. Reliable, up-to-date information on HDR color pipelines is hard to find, scattered across technical papers, forum posts, proprietary documentation, and reverse-engineering work. Many existing graphics and imaging tools simply assume SDR. Pipelines that try to adapt them to HDR do so in bespoke, fragile ways, where a single small mistake in a transfer function calculation can dramatically alter the final look. Without dedicated color-science expertise, it is easy for teams to convince themselves that “HDR is just like SDR but brighter”, only to discover late in production that their assumptions were wrong.

This is exactly the gap HDR Den and projects like Luma try to fill. By sharing knowledge, reference workflows, and even code, they give both modders and professional developers a place to learn without spending months reinventing the wheel. But as long as the mainstream toolchain remains SDR-first, studios will continue to treat HDR as a risky late-game addition rather than a foundational part of how a game is built.

How the community sees HDR: hype, backlash, and misunderstanding

Spend any time reading comment sections under HDR articles or videos and a pattern quickly emerges. The audience is split into camps that often talk past each other. On one side are players who have seen HDR at its best – usually on a good OLED or a high-end mini-LED TV – and who insist that once you experience it, there is no going back. They describe being legitimately dazzled by shafts of sunlight or blinded by magic effects, and they will happily take good HDR over ray tracing in terms of pure perceived impact.

On the other side are players whose primary encounters with HDR have been on PCs with entry-level monitors and inconsistent OS support. For them, HDR feels like the worst PC feature of the last decade: inconsistent, buggy, and suspiciously tied to proprietary ecosystems. Some argue that HDR is basically just a fancy label for 10-bit color depth, something that should have been standardized cleanly at the hardware level instead of wrapped in marketing and licensing schemes. Others frame the whole HDR push as a way for TV makers, HDMI vendors, or GPU companies to sell “shiny buttons” that do not meaningfully improve the experience for the majority still using 8-bit panels.

Debates can get heated. Enthusiasts sometimes accuse skeptics of never having seen “real HDR” in person, or of writing it off after only encountering bad implementations. Skeptics fire back that the burden should not be on consumers to own expensive displays and to debug broken pipelines just to get a promised upgrade. Somewhere in the middle are users who like HDR on console but hate it on PC, or who find that in many games it simply makes the image look over-saturated and cartoonish rather than more natural.

Underneath the snark and trolling, there is a legitimate point on both sides. HDR can be transformative, but only when the entire chain – hardware, OS, engine, and game – is aligned. Right now, that alignment is rare enough that many players encounter the bad or mediocre versions first. From Tarpini’s perspective, this is the real tragedy: a technology with enormous potential is being judged largely by its worst implementations.

What needs to change: a realistic roadmap for better HDR gaming

Fixing HDR gaming does not require a miracle; it requires discipline, collaboration, and a willingness to treat color and brightness as first-class citizens rather than magical output stages. Tarpini’s wish list for the industry is surprisingly pragmatic.

First, platform holders like Sony and Microsoft could strengthen their certification requirements for HDR. Instead of merely checking that a game outputs an HDR signal, they could verify basic correctness: does the game respect the display’s reported peak brightness? Is the tone-mapping curve sensible? Is the calibration menu clear, minimal, and aligned with HGiG best practices? Are UI and gameplay elements readable across a range of scenes? Making these checks part of standard QA would force studios to think about HDR earlier and more thoroughly.

Second, the industry needs a clearer, more widely adopted set of guidelines for how to author HDR content in the first place. That includes agreeing on reference color spaces – for example, moving away from Rec. 709 toward wide-gamut spaces like Rec. 2020 for albedo textures and light colors – and providing straightforward reference implementations of tone-mapping and encoding. A formal consortium focused specifically on HDR in games, rather than just displays, could help here.
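The gamut half of that guideline is already well specified: ITU-R BT.2087 publishes the linear-light matrix for re-expressing Rec. 709 colors inside Rec. 2020, sketched here in Python. Skipping this step – interpreting 709 values as if they were 2020 values – is precisely what produces the oversaturated, “stretched” look players complain about, because the 2020 primaries sit much further out.

```python
import numpy as np

# Linear-light Rec. 709 -> Rec. 2020 conversion (both D65), per ITU-R BT.2087.
# Each row sums to 1.0, so neutral gray and white are preserved exactly.
REC709_TO_REC2020 = np.array([
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
])

def to_rec2020(rgb709_linear):
    """Re-express linear Rec. 709 colors in Rec. 2020 coordinates."""
    return np.asarray(rgb709_linear, dtype=np.float64) @ REC709_TO_REC2020.T

# Pure 709 red becomes a less extreme coordinate in the wider space; feeding
# (1, 0, 0) straight through instead would render 2020's far more saturated red.
print(to_rec2020([1.0, 0.0, 0.0]))  # -> [0.6274, 0.0691, 0.0164]
```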

Third, OS-level brightness calibration could dramatically reduce complexity. If consoles and PCs exposed a standardized, system-wide “paper white” setting and peak brightness measurement, games would no longer need bespoke seven-page calibration wizards. They could simply respect the OS values and offer a small nudge for personal preference. This is one area where Alan Wake 2 and some modern mods already point the way forward.

Fourth, critics and reviewers could start treating HDR quality as a core part of graphics coverage, rather than an afterthought. Just as ray tracing implementations are scrutinized, HDR should be called out when it is broken, misleading, or missing from games that clearly target high-end visuals. That kind of public pressure, especially on big releases, would make it harder for publishers to treat HDR as free marketing brownie points.

Finally, studios need to be honest about low-end hardware. On displays that barely meet HDR specs or that have obvious flaws, the correct answer may simply be to default to SDR and explain why. Let HDR be a premium experience on capable panels instead of trying to stretch it across everything. Over time, as better hardware becomes more common, that premium will naturally turn into the norm.

The next five years: will SDR start to look unplayable?

Despite all the criticism, Tarpini is not pessimistic about HDR’s long-term future. In fact, he is convinced that once the industry gets over its current growing pains, SDR will begin to feel as archaic as 480p resolution does today. As HDR-native workflows mature, as more games are authored directly for wide color gamuts and high peak brightness, and as TVs continue to improve, the difference between a well-implemented HDR game and an SDR title will grow too big to ignore.

In that future, SDR versions of games may feel strangely flat and lifeless – a necessary fallback for compatibility, but not the way anyone wants to play. Every game that ships today with broken or missing HDR is, in a sense, locking itself into an inferior presentation for the long term. Patches and mods can help, but they are rarely universal. This is what makes the current trend so frustrating for people like Tarpini: the industry is leaving a huge amount of visual quality on the table, at a time when hardware is finally ready to show what HDR can do.

The good news is that the pieces for a better HDR future already exist. Passionate modders are proving what is possible. A handful of studios are quietly setting strong examples. Communities like HDR Den are consolidating knowledge that used to live in scattered PDFs and private Slack channels. Standards bodies and platform holders are at least aware of the stakes.

What is missing is a collective decision to treat HDR not as a flashy sticker or a checkbox to be toggled at the last minute, but as an integral part of how games are made and judged. If that shift happens, the next wave of titles will make today’s half-baked implementations look as dated as early, jagged 3D graphics do now. If it does not, HDR risks becoming the gaming equivalent of 3D TV: a technology with real potential that never quite escaped its own missteps.

For now, HDR gaming sits at a crossroads. Players argue in comment sections, some blinded by bad experiences, others evangelizing the few great examples they have seen. Developers juggle deadlines and decide whether fixing HDR is worth the headache this sprint. Somewhere in between, people like Filippo Tarpini and his collaborators keep tinkering with shaders, color curves, and patch tools late into the night, trying to drag the medium’s most misunderstood feature into the future it deserves.
