SDR is way too dark. I just compared it between a Pixel 7 in HDR and an LG 4K IPS, and the portion of video labeled as HDR on the SDR screen looks way closer to HDR on the phone than to their "SDR" simulation. They crushed brightness too hard (as comparisons usually do) to make the difference bigger than it is.
I'm sure they will. They recently released the auto HDR feature for videos. So maybe they were waiting to make sure that works fine in the wild before they push out another feature. Or, maybe it's just one part of a suite of features for games they want to officially release at the same time.
I think it was one of the best looking ps3 games, just like the first killzone was one of the best looking ps2 games. Too bad the game itself was not that great. I feel like half life 2 set all my expectations for fps games very high and i've not been able to enjoy any shooters after it. We don't go to ravenholm...
HDR developer here. It's actually the opposite of what you suggested about contrast around the 4 minute mark. SDR games that already have strong contrast and high exposure are the ones least compatible with any Auto-HDR method such as this one, while the ones with low contrast and little to no burned-out highlights are the ones that will benefit the most from any such Auto-HDR treatment. In those cases, there actually is data there that Auto-HDR can work with, while in the high-contrast scenario, the data below the shadows and above the highlights is irretrievably lost, and scaling the remainder into the HDR brightness range results in a blown-out picture with crushed shadows. The footage you're showing from Lost Planet is actually painful to look at on the properly powerful $3000 QD-OLED HDR monitor that I use for HDR production. I'd actually much rather play that particular game in SDR. It's also worth mentioning that most games are internally HDR, but this has no bearing on the efficiency of Auto-HDR methods, since externally all those games are still ultimately outputting SDR and discarding all that internal HDR data.
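The clipping argument above can be sketched in a few lines. This is a toy model (both the encode curve and the expansion function are hypothetical illustrations, not Nvidia's actual algorithm), but it shows why detail lost to SDR clipping cannot be recovered by any post-process Auto-HDR:

```python
def sdr_encode(scene_linear):
    """Toy SDR pipeline: clip scene luminance at 1.0, then gamma-encode."""
    return min(scene_linear, 1.0) ** (1 / 2.2)

def auto_hdr_expand(sdr_value, paper_white=200.0, peak=1000.0):
    """Hypothetical inverse tone map: linearize the SDR value and push
    the top of the range toward the display's peak brightness (nits)."""
    linear = sdr_value ** 2.2
    return paper_white * linear * (peak / paper_white) ** linear

# Two very different highlight intensities both clip to the same SDR code...
bright = sdr_encode(1.5)
brighter = sdr_encode(3.0)
assert bright == brighter == 1.0

# ...so any SDR-to-HDR expansion maps them to the same output: the
# highlight detail was irretrievably lost before Auto-HDR ever ran.
assert auto_hdr_expand(bright) == auto_hdr_expand(brighter)
```

A low-contrast game, by contrast, keeps those values below 1.0 in the SDR image, so the expansion still has distinct data to spread across the HDR range.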
@@garethsmith6611 If you'd like proof of me being an HDR developer, just google my name Mario Kotlar and you'll find that I work for Croteam, where I handled the HDR implementation for the Talos Principle 2. If you search Talos Principle 2 steam forums for HDR, you'll see a dev under a nickname Eagleshadow answering threads about HDR, that's me. If you google "mario kotlar eagleshadow" you'll find the demo reel I used to apply for work that I uploaded on my youtube channel named "Eagleshadow", and a short video recorded in Croteam offices, uploaded to Croteam's youtube channel titled: "Serious Sam VR: The Last Hope - 'Eagleshadow' avoiding projectiles", with video description stating "Our level lead for SSVR, Mario Kotlar aka 'Eagleshadow'..." That should prove that I am indeed a professional game developer, and that I handled HDR implementation in our latest title.
@@RicochetForce Not surprising, unfortunately. I actually like Mankind Divided but man did that game end abruptly. And then it got panned critically which was pretty much the nail in the coffin.
@@SpartanArmy117 "And then it got panned critically" - This is just a straight up lie. It's sat on 84 on metacritic, which is higher than a lot of well received games, and its stock has only risen over the years. As an example, its hub often comes up when talking about best designed worlds.
@@Hazzehh Yup. Blur and Split/Second gave me hours of fun when I was a kid and they still hold up awesomely despite the latter's PC port being depressingly barebones.
Alex, not sure if this is an issue with YouTube HDR encoding, or if you're unaware of how HDR mastering works; HDR is not really about blindingly bright highlights (nor is it really about true blacks either) - it is about being able to **perceive details** in both dark and bright areas of an image *at the same time* - because that's what our eyes (and cameras) do! Although the tech looks cool and I'd definitely prefer to play many of these games this way, some of the examples you chose are simply horrible representations of HDR content, such as the blinding snow in Lost Planet or the white sky outside in Half-Life or in Blur; you cannot see any detail at all in those parts of the images. It's just pure white. A bunch of literally white pixels. What a good HDR image would actually look like in those scenes would be having the snow texture completely visible and discernible - with all its details and noise patterns - but with some 4000 nit sparkles, for example. Or, in the Half-Life example with the stained glass windows, the details, textures, everything both in the interior _and_ in the sky should be visible at the same time. An extremely bright white sky simply makes no sense (both physically and even intuitively) and, at least to me, the image simply looks bad. Ps.: I know that's how the game renders it (because, in fact, that's the devs ~~abusing bloom because it was hot~~ trying to simulate HDR perception in an SDR container… though the tech could probably be programmed with relative ease to use the frame pre-bloom, e.g. use the buffer before the post-process pass and use a heuristic to detect the bloom shader). I'm just pointing out that those examples do not look "good" as far as HDR goes.
Exactly right. A nicely tuned OLED screen can really show what HDR is capable of... most people never have their panels tuned; but with time and trial you can get most OLED displays to show amazing HDR content, assuming the content was implemented correctly in the source. A lot of games just do a bad job of it.
What you're talking about is going to be entirely possible by the end of the decade. Some machine learning tech, brighter OLEDs, brighter QD-OLEDs, and improved mini-LEDs are going to make these older and more recent games look incredible. You can see a hint of it by finely tuning ReShade with the brightest QD-OLEDs/OLEDs in a blacked-out room. Nearing the end of the decade it's going to be possible in a well-lit room, and the machine learning tech and tools will be out of this world.
Game devs can't even implement HDR properly. HDR has been out for almost a decade now, or am I misinformed? If they haven't mastered it by now then they aren't ever going to@@endlessparadigm332
I agree with you overall, like you are right, but it does bring up an interesting question about artistic intent and value. Sure, HDR is capable of showing detail and depth with an extreme range; that's the literal point of it in the first place. But let's get theoretical: say something like Lost Planet did have official, real HDR at launch (just pretend), would they really have wanted to reintroduce visibility and perception for the sake of a detailed image? Or is the blinding bloom in all the highlights, and the oppressive darkness in the low range, all meant to convey a feeling of overwhelming disarray, and so should be kept even in an HDR presentation? Point being, games are a creative medium and raw technical ability and precision isn't always going to complement the actual experience the game is trying to convey. Sometimes things are meant to look, for lack of a better term, a bit fucked up.
Just turn the "layered DXGI" option on globally. It improves all opengl and vulkan games by letting them run in hardware composed mode. It's similar to fullscreen optimizations for DirectX titles.
@@MaaZeus No display depth is needed. ReShade's RTX-style ray tracing effect needs the depth buffer to determine where the rays are going to bounce. It's not as good as real ray tracing, as it can't predict anything outside of the displayed area properly.
Yea, no. You’re seeing bloom which was always overbearing in older games. Using HDR just brightens the bloom. Simple as that. It would still look blown out in SDR as you can see in the side by side comparisons lol.
@@SolidBoss7 I just tried the modded driver and it completely blows out all games. I think there's a good reason it's disabled at the moment. I'm going to wait for the official implementation in hopes of the terrible color crush being addressed.
@@tapsofosiris3110 you mean you tried the mod? The feature is already in the latest Nvidia driver, just hidden. I also went immediately and downloaded the mod from nexus and set it up with Deus Ex, just like Alex did here, and it looks phenomenal . Nothing is blown out. Sounds like you don’t have windows HDR setup properly or you’re on a bad HDR display. Looks great on my LG OLED just like it looked great in this video. You also probably forgot to disable AutoHDR in windows so you may be getting both at the same time. I simply think you shouldn’t be spreading misinformation when there’s literally proof right in front of your eyes in this video that not every game is “blown out” as you say. I don’t believe you’ve even tried it yourself.
@@SolidBoss7 I'm glad this looks good on your display. I have a Samsung Neo G8 and it looks like absolute blown out garbage. Native HDR games like Cyberpunk and Shadow of the Tomb Raider look fantastic because you can adjust the HDR luminance per-game. There is no comparably granular way to adjust any settings with this mod. That's why I said that this needs to be tuned per-game (maybe even per-display) in my original comment. Not everyone will have a display that will work out of the box with this mod's blanket settings. It is what it is.
Provided native HDR is implemented correctly (unfortunately not always the case), it will always be superior, because with real HDR, even if darks get darker and brights brighter, you should not lose any details (in fact you should see more vs SDR) unless they go beyond the capabilities of your screen. And if your screen has good Dynamic Tone Mapping that is worth using (Panasonic's and Sony's implementations, for example), then not even that, as those details get scaled down and fit to your screen's limitations. As good as this RTX HDR looks, it cannot dig out any details that are not there. If something is blown out or crushed, it will still be blown out and crushed; the contrast is just more dramatic and lifelike than in SDR.
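That "scaled down and fit" step can be sketched as a soft-knee rolloff. This is a simplified illustration (the knee position and curve here are made up; real sets use their own proprietary curves), but it shows how dynamic tone mapping compresses out-of-range highlights instead of hard-clipping them:

```python
def display_tone_map(nits, display_peak=600.0, knee=0.75):
    """Pass mid-tones through untouched; above the knee, roll highlights
    off so arbitrarily bright input compresses asymptotically toward the
    panel's peak instead of clipping to it."""
    knee_nits = knee * display_peak
    if nits <= knee_nits:
        return nits
    headroom = display_peak - knee_nits
    t = (nits - knee_nits) / headroom
    return knee_nits + headroom * t / (1.0 + t)  # Reinhard-style rolloff

# A 1000-nit highlight on a 600-nit panel is compressed, not clipped,
# so gradation above the knee survives:
assert display_tone_map(300.0) == 300.0
assert 450.0 < display_tone_map(1000.0) < 600.0
# Even brighter input still maps to a distinct (larger) output value:
assert display_tone_map(1000.0) < display_tone_map(4000.0) < 600.0
```

The key property is the last assertion: two highlights of different mastered brightness stay distinguishable on screen, which is exactly the detail a hard clip would destroy.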
I'm curious about that too, just to see how it compares to poorly-implemented HDR. I'm currently playing Alan Wake 2 on PC on an LG C2, and while I'm not about to say the HDR is bad, the torch light seems to cause blow outs at times. Keen to see how RTX HDR would react to torch lighting in comparison.
@@MaaZeus No doubt, I think what he's referring to, or at least I would be, is seeing the differences. I'd assume it'd mostly boil down to better tone mapping across the wider color gamut... but it would be interesting to see just how close it is.
The main issue of UI elements being too bright isn't solved here. Unfortunate. And this video might have an issue, because like a bunch of others, all I see are blown highlights.
That is the one aspect I would have hoped the machine learning to have helped with. And yeah, blowouts abound over here on a 600-nit panel, but I would expect a proper release to allow customization of the target peak. Similarly, Control's appearance looked like it was mostly attributable to a gamma difference, but again, that _should_ become one of the core adjustable parameters.
That's much more complex, since it would require the tool to somehow guess which buffers to work on. I believe this should be possible with Special-K, but I haven't found out how yet.
It's not an issue with the video it's just representing blooming from that era of games in HDR. Intense bloom was super super common in that era of gaming.
The HDR calibration in this video seems to be really off btw. I'm using a professional colour calibrated monitor. Highlights in lost planet are extremely clipped
It does sometimes look cool, but it doesn't seem like it does a good job of converting games to HDR, and you can see that most clearly in the Lost Coast (8:43). HDR is supposed to increase dynamic range, not brightness. The increase in brightness is meant to preserve detail in brighter parts of the image that would be blown out in SDR. Making these parts just as blown out and lacking in detail, but with much higher brightness, is not increased dynamic range, just like stretching a standard definition picture to fill an entire 4K screen doesn't make it 4K, even if YouTube or your TV says it's a 4K signal.
A non-native HDR implementation can't possibly create detail in the highlights that were originally blown out. It can nonetheless increase the displayed dynamic range, as was clearly demonstrated in Control and Doom 3 at 15:20. Darker areas either stay just as dark, or get darker where it's warranted (would lend itself well in Cyberpunk), whereas highlights get brighter than they'd ever get in SDR.
the rtx hdr mod on nexusmods now has some optional files to tweak the peak brightness, saturation, contrast etc. I bet that would help. Also I'd like to see this compared to SpecialK HDR conversion.
@@Maxoverpower The AI could add a gradient up to full brightness to make it look less blown out (just as it does at the other end to reduce banding in shadows), but the added steps on the brighter end would be more artificial.
@@Maxoverpower but then it's nothing more than the HDR retrofit tools that already exist (like Special K, which does loads of other useful things) or particularly sophisticated ReShade scripts that achieve the same results and are far more customisable. Sure, user side it's way easier to "set and forget" (I'd argue that Special K is the same), but technology wise, if that's the result, then it's not really "extraordinary" in the current iteration... and does not really use AI in any meaningful way.
4:05 No, we need to talk about this error. HDR is all about expanding the DYNAMIC RANGE of a game/display/camera. A proper HDR image will have more highlight detail AND shadow detail simultaneously, will have a lower black floor and higher white ceiling simultaneously, as well as access to wider color gamuts. Any game that blows out highlights, crushes shadows, or does anything other than allow your EYES to control the perceived dynamic range is a bad example of HDR rendering (rather than a good one). In this case we're talking about an SDR to HDR conversion so that highlight detail was never going to be in the final image, but it does not change the fact that what you said here is wrong. That bloom is a sign of a compressed dynamic range, not an expanded dynamic range. This isn't to say that a good HDR experience should never allow blooming like this. Were you actually walking through that cave your eyes would need to adjust to the bright light source in real life, as they do this the shadows will be crushed. As you move away from the light and back into shadow, your eyes will bring the black floor down again, allowing you to see more clearly in the dark. Both the game camera and the display you are watching on should NEVER lose that detail, just your eyes. And a bonus edit: A little bit of camera shenanigans is okay so long as it's not oppressive. There is nothing worse than walking out of a dark cave into a scene with an APL of like 100 nits and not being able to see anything. Because that is a completely normal dynamic range for our eyes. These tricks can continue to work on displays under 1000 nits of brightness but as we cross the 2000 nit mark these tricks become less useful and more annoying. 7:59 Hehe, there's the issue. Really nice sky over there, it'd be pretty cool if I could see its details instead of a big white void. 8:42 Watch the sky bloom out at the top of the stairs, that's BAD. The complete opposite of HDR. 
Look, I love you Alex, you're easily my favorite DF presenter by a MILE, but the first 9 minutes of the video are so littered with misinformation that I'm genuinely in pain. 11:15 Okay, things are turning around in this section. You're making all the right observations: a lower black floor, maintained shadow detail as it fades to that deeper black, and less color banding. The only thing you're missing is kinda unfair of me to point out, but it's helpful for anyone reading this to know. Windows uses the sRGB gamma curve for SDR content in an HDR container, while most games are rendered in gamma 2.2; this conversion and THEN upmapping causes Auto HDR and tonemapped SDR to exhibit an elevated black floor and color banding. This gamma mismatch can be corrected with an advanced ICC profile.
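The gamma mismatch described above is easy to demonstrate numerically. A quick sketch (the 0.02 sample value is just an example): a dark pixel mastered for a pure 2.2 power curve comes out noticeably brighter when decoded with the piecewise sRGB curve instead, which is exactly the raised black floor:

```python
def srgb_decode(v):
    """Piecewise sRGB EOTF (what Windows assumes for SDR-in-HDR)."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def gamma22_decode(v):
    """Pure power-law 2.2 (what most games are actually mastered for)."""
    return v ** 2.2

# Near black, sRGB's linear toe decodes much brighter than gamma 2.2,
# so shadows mastered for 2.2 get lifted: the raised black floor.
dark = 0.02
assert srgb_decode(dark) > 5 * gamma22_decode(dark)

# In the mid-tones the two curves nearly agree, which is why the
# mismatch only really shows in dark scenes:
mid = 0.5
assert abs(srgb_decode(mid) - gamma22_decode(mid)) < 0.01
```

An ICC profile that corrects this is effectively composing `gamma22_decode` with the inverse of `srgb_decode`, so the signal Windows decodes lands back on the curve the game was mastered for.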
there's a surprising amount of "miss" in this video. I can't believe the examples he picked to show how "good" it looks; some of it was gaudy reshade/ENB levels of terrible
@@MKR3238 I know, it's kind of insane how the games he shows off as real lookers are the worst examples of this technology, while a game he's not so impressed with, Warhammer, looks absolutely fantastic. It's shocking how much HDR is still so misunderstood 10 years later. You don't need HDR to get a bright, contrasty image. Everything you see in Lost Planet could be achieved with an SDR signal (just max out the brightness and saturation on your TV). HDR is about maintaining detail with that high peak brightness and wide colour gamut. It simply allows you to make use of the full capabilities of modern displays without banding or the loss of subtle details. Most HDR movies are on average darker than their SDR versions (in terms of APL, not peaks). SDR forces you to push everything into the midtones to achieve detail, whilst it's HDR's ability to allow for exceptional detail in dark scenes, with sparing use of ultrabright specular highlights, that really sings.
@@MsMarco6 I'm not so sure about the last part. Regular Rec.709 SDR movies are mastered at 100 nits, I think? HDR movies are often around 200 nits APL, at least going by the metadata in the movies I have, though I guess this can be lower depending on the setting, e.g. movies set predominantly at night. Could be totally wrong here though
Limitations of the video probably, but in many cases it is because of the game too. Many older games really went to town with bloom effects, if you remember. If they are blown out in SDR, then they are also blown out in this simulated HDR, just more dramatic looking. It cannot dig out details that aren't there, because it does not have access to the internal functions of the game's rendering; it all happens in post.
@@MaaZeus no, OP is right. I tried it in a couple of games, and while it handles darker areas better, it completely ruins any bright areas and highlights, removing any details and turning them into pure white.
@@tnykuuh Ah in that case it does not work as intended yet because this doesn't happen with the RTX HDR Video which this is based on. Unless this is a screen issue? Do those details come back if you turn on Dynamic Tone Mapping?
The most impressive thing is how good Killzone 2 still looks. Can we get a PS5/PC release? It still feels a little wonky on RPCS3, unless you have some settings suggestions.
This video made me learn my phone's screen is HDR, and I now get the hype. Some of these games look utterly amazing. I really can't wait to eventually get a high quality HDR monitor to experience this.
Finally an HDR video from DF! Can we please get more of those? For example, I really appreciated the "best HDR games" video. Why is it not possible to upload most of the content in HDR? This would be awesome!
It already used HDR; I guess Killzone 3 did too. Also Uncharted 2, 3 and The Last of Us. And I'm pretty sure some 3rd parties from the 7th gen were already trying to implement it on consoles.
@@mbsfaridi that's right, many games already used HDR in their rendering. I couldn't find any proof for the Killzone games, but I know it's documented for Naughty Dog games since Uncharted 2 (they also talked about it at GDC 2010). I ended up searching a bit (at least among 1st parties) and Gran Turismo 5 and 6 already used HDR.
Just like with the video part of the feature, I don't really see the point of using this. All it does is "stretch the histogram", so to speak, so that it's not limited to the SDR range, but instead stretches to the entire HDR brightness range. It doesn't add anything visually. The reason why it seems to work better and why it seems to reduce the banding in shadows in Control compared to Auto HDR is that it doesn't seem to stretch the output linearly like Auto HDR seems to be doing, but applies some kind of contrast curve so the brightening is more focused on the midtones and highlights. You could achieve pretty much the same effect just setting your monitor brightness to max and applying a curve via something like ReShade. It's an OK enhancement feature to add to your drivers I guess, but certainly nothing groundbreaking, and, more importantly, it changes the original intent of the presentation by making the image look more contrasty than it was supposed to. (And don't get me started on all the AI/machine learning nVidia talk, aka using buzzwords to make your stuff seem way more complex than it actually is).
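The difference between a linear stretch and a midtone/highlight-weighted curve, as described above, can be sketched like this (both mappings are made-up illustrations, not the actual Auto HDR or RTX HDR curves):

```python
def linear_stretch(x, gain=4.0):
    """Naive Auto-HDR-style expansion: multiply everything equally,
    which drags the shadows up along with the highlights."""
    return x * gain

def weighted_stretch(x, gain=4.0):
    """Contrast-curve expansion: the boost grows with the input level,
    so shadows stay near their SDR level while midtones and highlights
    receive most of the extra brightness."""
    return x * (1.0 + (gain - 1.0) * x ** 2)

# Both curves reach the same peak...
assert linear_stretch(1.0) == weighted_stretch(1.0) == 4.0
# ...but shadows are barely lifted by the weighted curve, which is
# consistent with the reduced shadow banding seen in Control:
assert weighted_stretch(0.05) < 0.06 < linear_stretch(0.05)
```

Under this reading, the "improvement" over Auto HDR really is just a different shape of expansion curve, which supports the point that the same look could be approximated with a ReShade curve on a maxed-out display.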
Thank you so much for this video! I did not know about this feature and mod at all. Getting the games to render in a wider colour space is probably my favourite part of it!
Reshade can do true HDR injection like this but with far more options, black floor fix features for correct contrast and black levels, finetuning of both paper white brightness and peak brightness, better/more HDR colors, all with less gpu overhead.
Aye, it is superior but it cannot be used in competitive multiplayer games and the game has to render in HDR internally for it to function. Not a problem in modern games as most of them do just that already but it was not yet a thing in more classic games that were made before Half Life 2.
Yes, but the point is that if Nvidia decides to adopt and implement this, it means we can have better auto HDR at the driver level without the need to download a third party program.
@@alumlovescake no, you're thinking of fake AutoHDR. ReShade HDR, as OP mentions, uses addons (a new feature) to add in (heh) extra layers and introduce proper AutoHDR. The analysis tools he mentioned are available right now in the ReShade installer via the "Lilium HDR analysis" shader packet, and it's an absolute game changer. It analyses native HDR implementations so you can fix problems like raised black levels, finally making PC HDR as good as or better than console.
Nice, I wanna use this on RPCS3 since Auto HDR does not work on RPCS3. Holy shit, I just tested it with Sly 3 in 4K 60FPS HDR on my LG OLED, this software is great!!
@@lizardking3268 Hi, at first it doesn't recognise the emulator's .exe and says I need to add the game to Nvidia Control Panel, so I add the game there. After that it finds the .exe and it works.
Gotta say, after getting an FO48U monitor and finally calibrating the HDR properly, it has completely transformed my games, and I have been going back through my old library. The Assassin's Creed games, as bad as they are, the graphics and visuals are absolutely insane and worth a revisit.
Now this is a case where I really hope AMD figures something out (or Valve). Having decent HDR conversions on a Steam Deck OLED would be a great addition.
... they don't really need to. Alex doesn't really cover it... but you can already force AutoHDR in similar ways, using some of the same tools.
Improvement for future vids: For the comparison of the dark areas in Control, it would be clearer if you could reduce how bright/white the labels are at the bottom of the screen. As it's HDR, I found myself covering the labels with my hand held up so that my eyes could adjust to the dark and see the details/banding better. Otherwise, this is a fantastic video.
i will never understand people who like overblown highlights and thus missing details in games, in photography this is something you want to avoid at all costs.
Yes, it is. I have no idea what's going on here, but this is the exact opposite of what HDR should do for the most part. I have a QD-OLED monitor, I've tried RTX HDR and I love it for the games I play. It looks better than Auto HDR because of the correct 2.2 gamma (no raised blacks) and debanding. However, the examples Alex shows here are really bad ones; the oversaturation and overexposed, blown-out highlights with no visible detail are just horrible. I had to check if there was something wrong on my end, but no. I think Alex needs to educate himself on this a bit. I don't believe this is a good video.
Very good video! But I do not like the conversion in Lost Planet. The large uniform areas of maximum brightness hurt my eyes. For me HDR is at its best, when you can see detail, which is not possible in SDR, or in highlights with high contrast.
Yeah, I don't know why Alex likes this. It completely ruins any bright and white color. It removes all detail in the highlights and just turns them into pure white.
It's precisely because of their cutting edge software/hardware combo that their prices are what they are. Especially when AMD is still very behind on the software features. I for one think Nvidia is absolutely worth it, although I appreciate the pricing backlash they face as lower prices are obviously better for everyone.
That is because when bloom became a thing, many old games went way overboard with the effect, causing them to clip and hide details. If there are no details, then RTX HDR cannot show them, because it is just a post-processing effect. But RTX HDR should not cause any clipping itself; it just makes those clips more dramatic looking, for good and bad.
Just so people who don't own RTX cards know there is also a mod for Windows Auto HDR called LEDOGE AUTO HDR that does the same thing as this RTX mod. I haven't tried it out on DX9-DX10 games yet but I've used it on emulators and have been playing Tears of the Kingdom in HDR with it. I've got an RTX card so I might try out the RTX version on some games but so far the windows one has been great.
I bet the majority of people don't even have a genuine HDR-capable display. Just lowly HDR100, HDR400, HDR600 etc... The state of HDR product labelling is abysmal.
I just got a CRT for the first time since I was a kid and tested it the other day, and I was shocked by the dynamic range. The brightness levels and contrast are insane on CRT’s. IMO, giving older games HDR mods is kinda just restoring an aspect of how they originally looked.
Sort of. There was more dynamic range on screen on CRTs but the color gamut of the content was still SDR. The reason there was more visible dynamic range on CRTs was because they had deeper blacks and their highlights bloomed out. In terms of color gradation modern HDR displays are much more capable.
6:15 it's crazy what happened with Blur, considering playing that in splitscreen is basically the best experience you can have with a buddy sitting on a couch. It's a brilliant game.
Sometimes I feel like a lot of the features Nvidia silently drops in the patch notes are more useful than entire marketed features that AMD releases.
AMD is stuck in a predicament: do hardware-accelerated features that are better than what they currently have and lose the support of the open source crowd, OR continue doing hardware-agnostic solutions that will never be as good as hardware-accelerated ones but work on pretty much everything. We know Nvidia just does everything HWA/proprietary and people see them as "evil" for that, but they are at the top of their game so they don't really care. AMD can't really afford that.
Sony and Microsoft should add an overlay with image options for games, like a kind of ReShade that allows the user to correct the image the way they like it, because some games now lack even simple brightness and contrast options.
Nice, can't wait for it to release as a full-fledged feature! Would be nice to see comparisons with SpecialK as well. Will have to test this for a few games on my channel.
@@SilverBld there’s no constant change in brightness with DTM. Sounds like you didn’t disable AI Brightness and power saving settings. If you don’t disable those two then you will have what you’re describing. DTM alone does not adjust brightness on the fly
@SolidBoss7 Everything is disabled, but sometimes you can notice the brightness slowly getting higher in bright scenes. What are your settings? I'm using the C1, maybe I'm doing something wrong on my end.
I'm surprised you didn't try Mirror's Edge, that to me seems the perfect candidate to try with HDR. A few months ago I managed to get it working in Special K, and it legitimately looks gorgeous when the bright and aggressive nature of the world gets to really feel bright.
An update for anyone curious! Nvidia has officially released RTX HDR with all of the features Alex was hoping for in this video. The saturation boost is separate from the grey brightness, paper white settings. Works great. Beautiful new feature.
I would have loved to see how this fares to the Kaldaien mod for HDR you covered some time ago. It has evolved tremendously, but compatibility can be hit or miss and you are limited by the API, so this seems to be a godsend.
I'm pretty much convinced that the video creator doesn't understand what HDR is and what it's about. Only thing he seems to care about is bright highlights that drown out detail (which is pretty much HDR done wrong).
Great video! Two questions: Isn't this functionally very similar to what Microsoft did with old Xbox games running on the Xbox Series consoles, dubbed "Auto-HDR", as you've mentioned? And if it is so similar, what keeps AMD from capitalizing on this technology and for once having an advantage over Nvidia? Better said, why did AMD not already start something like this in 2020? Second question, in regards to the Switch successor: frustratingly, the Nintendo Switch only outputs SDR (although the Tegra chip is capable of HDR, but that's another story) and thus, on some OLED TVs, SDR looks significantly worse, esp. on LG OLEDs. If Nintendo were to implement backwards compatibility on the Switch 2 (presumably HDR capable), could we in theory see the entire Switch library "patched up" to having an HDR output? Maybe even NSO classic library games? The chip coming from Nvidia should probably make this at least somewhat realistic. What do you think?
This one specifically runs on tensor cores of Nvidia GPUs which is separate from the graphic cores. AMD does everything in their compute cores, so it's up to AMD if they want to design dedicated cores for ML/AI workflows in their future products but it will drive the costs of their cards.
@@kevinbrodackiThank you! Indeed I wasn't aware. Reading again about the whole topic I see that I've been slightly out of the loop on HDR. For a lot of more casual gamers I think this is still somewhat of a niche because of the many factors that need to be assessed to get a "proper" hdr experience and I guess many just don't bother and therefore don't make it a huge topic in the gaming community. I hope that the second part of my question is still relevant though.
@@paul2609 RX 6000/7000 GPUs have dedicated cores for ML. They're simply not in a separate cluster, but they're there, and they do not compete with shaders to execute models any more than the tensor cores in Nvidia's cards do. There's more: things like upscaling and auto-HDR don't even need AI at all to be good, and aren't that intensive to compute, so the shortcomings of AMD's solution would only show for much more complex things. But still, the 7900 XTX has 192 AI accelerators... so it's not a matter of card capability, it is a matter of software. AMD planned to run with Microsoft and DirectML API adoption, and what they got in return is just another stall, as with everything Microsoft/Windows since the Vista era...
HDR is the single most game changing feature for me in modern games. Like, it adds SO MUCH to the experience... it's way above ray tracing or anything like that, for my taste of course. I'll definitely give this a try, thanks for the vid ;)
Agreed, I'm a huge RTX enthusiast, but i'll take HDR over Raytracing any day. Generally I don't even buy modern games if they don't support it anymore lol. Cause it's really fucking annoying to play a dim game.
I mean, it's a nice gimmick for retro gaming or emulation, but for the best games it works on, the ones that use HDR internally, you really want to use SpecialK. It costs less performance and looks far more natural than RTX HDR (at least in game engines that properly do HDR internally). Sure, it's a bit harder to use, but once you have it figured out, it pretty much just works. In the end, exposing the internal HDR will almost always look better than trying to recreate it from the SDR output (and through the remastering it can even reduce banding in games where it works).
Special K isn't an SDR-to-HDR conversion; it's real HDR output that intercepts the framebuffer. It doesn't work on everything, only on games that render in HDR internally, and even then not on DX9/10 games.
@@AngryApple RTX Video HDR doesn't support everything either; it can only do DX9-12. Special K HDR can do DX11-12 and OpenGL, and it can be used for DX9 games (and possibly lower) with tools like dgVoodoo and DXVK.
@@bigworm150 When the game doesn't render in HDR you won't get any real HDR output. SpecialK uses the actual internal HDR data; AutoHDR and RTX HDR do not, they're SDR upsampling. SpecialK HDR looks amazing when the game supports it; Spyro Reignited was a blast in HDR with it.
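The pipeline difference being described here can be sketched in a few lines. This is a hedged illustration only: the function names are hypothetical, the Reinhard-style curve and nit values are assumptions, and none of this is SpecialK's or Nvidia's actual code.

```python
def tonemap_reinhard(hdr_nits, sdr_white=100.0):
    """Game's internal tonemap: compress unbounded nits into [0, 1] SDR."""
    x = hdr_nits / sdr_white
    return x / (1.0 + x)

def specialk_style(internal_hdr_nits):
    """SpecialK-style: grab the float framebuffer BEFORE the tonemap,
    so highlight detail above SDR white survives untouched."""
    return internal_hdr_nits

def auto_hdr_style(internal_hdr_nits, sdr_white=100.0):
    """AutoHDR/RTX-HDR-style: only sees the final 8-bit SDR output,
    then tries to invert the tonemap back into nits."""
    sdr = tonemap_reinhard(internal_hdr_nits, sdr_white)
    sdr = round(sdr * 255) / 255     # SDR output is quantized to 8 bits
    sdr = min(sdr, 254 / 255)        # avoid dividing by zero at pure white
    return (sdr / (1.0 - sdr)) * sdr_white

highlight = 1500.0                   # a bright internal highlight, in nits
print(specialk_style(highlight))     # 1500.0 -- exact
print(auto_hdr_style(highlight))     # ~1493.75 -- quantization already lost detail
```

Under these assumptions the SpecialK path is lossless by construction, while the inversion path can only approximate what the 8-bit SDR image still contains; feed it a clipped highlight (try 100000.0 nits) and the reconstruction collapses far below the true value.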
SpecialK is obviously superior, but it also has the downside of being a hack that injects itself into the game. That practically prevents it from being used in any competitive multiplayer game. RTX HDR, on the other hand, should be OK since it's a driver feature, no different from forcing FXAA on or something.
It's a testament to how sloppy HDR standards are that most people watching this see a blown-out mess. Why yes, I'd love to invest in a feature that randomly makes things look like crap unless I hand tweak every single program on my PC, sign me up!
So I tried it and was blown away in games like Star Wars Battlefront 2, which looks incredible. But I noticed you have to tweak settings such as contrast and exposure to get the right look for each game. I have dynamic exposure on 10 and it gives me deep blacks and bright highlights. It looks fantastic, with that sense of depth you get from HDR compared to SDR. Make sure you turn up the grey point brightness. It's amazing.
I can count the good HDR implementations on one hand, and this is none of them. "Allow me to take these games - that look absolutely fantastic on a well calibrated sRGB display with well controlled and highly adjustable brightness - and turn them into overly-saturated, overly-bright, eyeball-searing, display-burning silhouettes of their former selves."
All I see is horrible highlight clipping. Lost Planet was awful, same with Blur and Half-Life. In fact, every game shown here looks really bad; HDR isn't about blinding you to death. Control has an awesome HDR mod which fixes most issues. Also, for many games you can use SpecialK with HDR injection, and with proper settings it's better than AutoHDR, and I assume better than RTX HDR too.
Awesome tech. Currently playing Mafia 3 with it on Very High and it's awe-inspiring. I also played Condemned: Criminal Origins, and it felt as atmospheric as when I first played it.
Shouldn't we (or the tech itself) be able to bypass/disable the built-in eye adaptation, since many games internally work with HDR principles? Having fake HDR layered on top of SDR eye adaptation counteracts the entire point of real HDR output. Hell, why can't we just directly tunnel the internal HDR calculations/lighting to an HDR display, with eye adaptation disabled, no "machine learning" or "AI" needed?
The HDR shown here generally looks poor, especially when compared against native HDR where the game's visuals were designed around and for the tech. Native HDR at its highest potential actually does the opposite of what's shown here: it generally tones down the graphics, using the extra colour space while still allowing comparatively higher peaks and troughs. This video and mod are at best a demonstration of tonemap conversion without internal rendering compensation/redesign, and consequently the result just looks blown out and lacking in detail, as poor and limited as native SDR. I upgraded to Windows 11 purely for AutoHDR (and HDR calibration) and that sucks too, being another unoptimised tonemapper. The only actual benefit I see from this mod/feature is that it allows running in HDR mode on your TV/display, which could well unlock its peak, best-performing display mode, but it will still be inferior to the point of being almost useless without native artistic support.
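The "tonemap conversion without compensation" complaint can be made concrete with a tiny sketch. Purely illustrative, under stated assumptions: SDR values already linearized to [0, 1], a Reinhard-style curve, and made-up nit targets; this is not any vendor's actual algorithm.

```python
def naive_expand(sdr, peak_nits=1000.0):
    """Scale SDR straight to the display peak: mid-greys land far too bright."""
    return sdr * peak_nits

def inverse_reinhard(sdr, sdr_white_nits=100.0):
    """Invert a Reinhard tonemap y = x/(1+x). Values near 1.0 explode,
    and everything the SDR image clipped at white maps to one blown value."""
    x = min(sdr, 0.99)  # 1.0 would divide by zero
    return (x / (1.0 - x)) * sdr_white_nits

for v in (0.25, 0.5, 0.9, 0.995, 1.0):
    print(v, naive_expand(v), inverse_reinhard(v))
```

Everything from the clamp upward collapses onto the same output here, which is exactly the "blown out and lacking detail" behaviour described: detail above SDR white was discarded at render time, and no expansion curve can bring it back.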
I always feel compelled to play old games on the Steam Deck, for the novelty. I just hope this Nvidia tech eventually becomes as ubiquitous as upscaling did.
Hello Alex, did you know that in AutoHDR you can actually adjust the intensity of the white highlights via an option in the Game Bar? This fixed it for me in many games.