In Poland, a new 7900 XTX costs half as much as a used 4090, and new 4090s can cost three times as much. It's hard to believe. It's sad that AMD is giving up on high-performance GPUs in the next generation.
Bro, really? The RX 8800 XT will be amazing, with RTX 4080 performance while being nearly 2x cheaper. It's enough for a one-gen break from the high end. RX 9000 will have high end. Also, RX 8000 GPUs will get FSR4.
@@MacTavish9619 What do you mean 4080 performance, bro? The 4080 is on par with the 7900 XTX across most games. Why not say it will be on par with AMD's current best card instead?
@@someperson1829 I guess it's possible that it will be pretty much equal to the 4080 on all fronts, and that includes ray tracing and path tracing, where the 4080 usually if not always absolutely crushes the 7900 XTX. Highly unlikely, but possible.
@@arenzricodexd4409 Yeah, yeah. They've been saying that for about 5 years now xD Also, the 8800 XT will have far better RT support than RDNA 3 and 2, so stop yapping.
@@fallenlegion1828 He's referring to the deeper color contrast compared to the 4090. It is slight, but there is a difference. AMD has always had slightly better image quality, not exactly sure how, but I have noticed it over the years. Might have to do with their pipeline layout, though I'm just speculating at this point.
Frame gen is worth using in single player games for lower temps. Just need to cap the fps to 4K 120Hz. No reason to cook your silicon and make the Air-conditioner come on but that's just my opinion as an avid AFMF2 enjoyer.
@@quintrapnell3605 I've just been using DLSS Quality at 4K with 120Hz vsync on the OLED TV. The GPU pulls around 260 watts, the CPU 60; total system power is around 350ish.
I'm playing with an XFX 7900 XTX, 7950X3D, 64GB DDR5-6000 CL30, etc., and the experience is outstanding on my OLED with some surround/Atmos headphones or the surround sound. It's well over 60fps at native 4K completely maxed out on my OLED TV, and it sits around the max of 144Hz on my 32-inch 165Hz center monitor. Both ways of playing are just phenomenal, as there are no interpolations, extrapolations, interpretations, etc., going on. Every pixel is actually rendered for each unique frame. I love it. Highly recommend this hardware combination for a very enjoyable gaming experience.
I don't know, but unless you're getting CPU bottlenecked, or unless the frame-rate is super high even with no upscaling, you should pretty much always turn upscaling on when running at 4k.
Hard to say, they both shimmer when moving the camera. FSR is a bit blurrier, which probably works to its advantage in this particular case. You can see in the conclusion around 10:50, if you pause and look at the artwork in the office, that DLSS has a bit higher detail, but FSR does pretty well too.
@@OmnianMIU I had a hard time trying to see any differences. I'm pretty sure there are some, as DLSS is better, but even on a 4K 55-inch screen I only noticed the water shimmering. Usually shimmering is the easiest difference to spot in comparisons; the rest of the details are harder, at least for me.
@@Chasm9 Does it? I've seen another benchmarker livestreaming this game on a 4090 and nothing happened. His name is Santiago Santiago. He even tests it at 8K.
Hi my friend!! Very complete video, a good one, thanks, now we know everything about each option!! I have a 7900 XTX, so I'm sure I will enjoy the game!! Is it a CPU-heavy game, or will it work well with my 5800X3D? Have a great weekend, and always glad for you!
@Ivan80054 If you don't care about DLSS and ray tracing performance, despite ray tracing starting to be implemented in every new game as a standard, just go with the XTX. Otherwise, wait a few months for the 8800 XT to future-proof for the RDNA4 ray tracing standard in console games.
@@ehenningsen That's what I'm waiting for, to see the new series of AMD cards. My biggest hope is the 8800 XT, or whatever AMD will call it. FSR 3.1 is not bad at all, but I want to see the new RDNA 4 capabilities.
Hey bro, wanted to know your opinion on how FSR fares vs DLSS in this game? Also in general is FSR close enough to DLSS in most games? I am choosing between Nitro 7900XTX and TUF 4080 Super and the XTX seems a bit faster in general but not sure about giving up DLSS. Also how is the performance with the new AMD preview drivers for Ragnarok?
Thanks for this comparison @TerraWare. Can you keep the placement of the names (in the title), thumbnails, and on-screen footage consistent in the future? Right now you have the 4090 on the left in the title and thumbnail, but its footage is on the right. This can be confusing.
Yeah that bothered me too, I created the video first as always. Had the 4090 on the right initially in the thumbnail but didn't like how the RTX and XTX in the names lined up so I swapped them. Will have to plan it out a bit better next time. I value symmetry if I can when making thumbnails.
Uuuuum what?? How did this fly over my head lol, had no idea Ragnarok was coming to pc XD. I'm a huge fan of the older games and I liked the storytelling approach on the new one tbh, hope Ragnarok is fun as well cause I'm getting this puppy. Fsr does look amazing here btw, CDPR take notes smh. Thanks for showing us man, looking good!!
It's pretty good, although I felt the story could've been better in the sequel. The gameplay does expand, as do the many areas you visit, and maybe you'll like the story more than I did.
@@TerraWare Yea I heard that the story is kinda mediocre at best which really sucks... Just finished installing it and if it's as bad as they say I'll just refund it after a couple hours. Steam ftw lol. I'll be honest though I wasn't THAT impressed by the first one either in terms of storytelling. Had some really annoying moments (looking at you Atreus) but I just found the new take refreshing. I don't expect much from games such as these, just a chill time with great visuals and acceptable storytelling and plenty of hot chocolate on the side to enjoy them with. You know, a cinematic experience. If I want deep lore and combat there are games which are way more proficient at offering both at the same time but you know, gotta have myself some cinematic chill from time to time :P Hopefully I won't have to refund it
Bought my RX 6800 last weekend, downloaded Ragnarok yesterday and started playing this morning until now..... definitely nooooooo complaints.. wot a game
Forget frame generation and DLSS, just put it on TAA and Quality, which is a resolution scale of 80%. I struggle to find any visual difference on my 32-inch OLED.
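For reference, an 80% resolution scale at 4K works out to an internal render resolution like this (a quick back-of-the-envelope sketch, not figures from the video):

```python
# Internal render resolution at an 80% resolution scale on a 4K display.
# The scale applies per axis, so pixel count drops to 0.8^2 = 64% of native.
scale = 0.80
native_w, native_h = 3840, 2160
render_w, render_h = int(native_w * scale), int(native_h * scale)
print(render_w, render_h)  # 3072 1728
```

So the GPU is shading roughly two-thirds of the pixels while TAA resolves the image back to 4K, which is why the difference is hard to spot on a typical screen.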
440 watts for a GPU? No thanks. The electricity bill from that much power consumption would be too much. I'm better off with my 3070-based laptop: it gives very good quality gameplay at 1440p with about 150 watts of total laptop consumption. That's acceptable. 600-700 watts of system power consumption is just too much.
The difference in the amount of VRAM used between Nvidia and AMD in this game is insane. The 4090 barely wants to use its VRAM while AMD goes full throttle, not that the 4090 doesn't have enough.
I would be more interested to see a 4080 or 4080S vs the 7900 XTX. Everyone knows AMD didn't release the XTX to compete with the 4090. Ironically, a 7950 XTX shows up listed in the Godlike preset; perhaps that was meant to be the refresh to take the crown, like the 6950 XT did last generation?
DLSS frame gen works fine on my end. Not sure if it's a widespread issue as you hinted, then. Also, have you had a chance to try the HDR in this game?? It's absolutely fantastic!! Supports HDR system level calibration, has NO black level raise, and supports wide color gamut BT2020. It's otherworldly
DLSS FG not working is pretty widespread. I made a short about it and lots of comments say the same; my buddy over at Mostly Positive Reviews couldn't get it to work in his review on a 4070 Super either. I did test HDR and mentioned it in my previous video. I'm not equipped to test HDR with proper tools, but I said it looked fantastic, so I'm not surprised.
FYI: the game does not use RT for its water; it uses planar reflections, duplicating the geometry below the water plane with diffuse shading. Compared to RT reflections of the same calibre, it's about 8x faster.
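For anyone curious how planar reflections work, the mirrored copy of the scene is just the original geometry transformed by a reflection matrix across the water plane. A minimal NumPy sketch, assuming a y-up convention and a horizontal water plane (function names are mine, not the game's):

```python
import numpy as np

def reflection_matrix_y(plane_height: float) -> np.ndarray:
    """4x4 matrix that mirrors geometry across the plane y = plane_height.

    A planar-reflection pass renders the scene a second time with this
    matrix applied, producing the mirrored copy seen "below" the water.
    """
    m = np.identity(4)
    m[1, 1] = -1.0                 # flip the vertical axis
    m[1, 3] = 2.0 * plane_height   # shift so points on the plane stay fixed
    return m

# A vertex 3 units above a water plane at y = 1 lands 3 units below it.
v = np.array([5.0, 4.0, -2.0, 1.0])   # homogeneous position
mirrored = reflection_matrix_y(1.0) @ v
print(mirrored[:3])                    # [ 5. -2. -2.]
```

The second render pass is ordinary rasterization, which is why it's so much cheaper than tracing a ray per pixel into the scene.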
Btw, there's something odd with FSR FG on Nvidia GPUs here... I noticed he gained only 30fps on the 4090 and wanted to test it myself with my 4080. 4K DLAA medium preset: FG OFF 110fps, FG ON 140fps. 4K DLAA ultra preset: FG OFF 95fps, FG ON 125fps. It IS always +30fps😂😂
@joselejos You're right, I've been playing around with DLSS too and it is 30 FPS. I tested this game on my 6800 XT, CPU-bound with a Ryzen 3700X at 70 fps; turning on FSR FG takes it as high as 140.
DLSS frame gen worked fine for me until I got to the first realm you travel to; then one area simply turned it off... Also, Reflex and HDR keep turning off on every restart of the game...
I've seen some comments saying DLSS FG works for them, but the overwhelming majority claim it doesn't, including the Mostly Positive Reviews video on the game. Haven't had any issues with HDR personally. It looks fantastic with HDR.
Really sucks that AMD isn’t making high end cards anymore. Not saying it would compete with the 5090 but it would’ve been the best price to performance card like the 7900xtx is.
Their best will still probably beat the 4090 without problems in raster. And if priced right, with massively improved RT and PT performance, it could be quite popular among well-informed enthusiasts and semi-enthusiasts with relatively limited budgets. I doubt 4090s will drop in price significantly when the 5000 series is finally unleashed.
Maybe I should've, but based on what I have seen, and you can see it in this video if you pause, DLSS reconstructs the fine detail a bit better, but FSR looks great too. Better than in other games, imo.
The 4090 consumes almost 75 watts less for 25% more performance. No wonder AMD does not want to compete with the 5090. It would need 200 watts more power, making it a 700 to 800 watt card, and it would still lose in RT, CUDA, upscaling quality, etc. I also feel like the CPU is a bit of a bottleneck when DLSS is enabled.
@@kiranplays2688 You don't have to buy the 4090 to crush the 7900 XTX; just buy the 4080S for the same price, and it's better in almost all aspects except VRAM.
So you mention power efficiency and the performance difference but not the price difference, which is almost 50% between AMD and Ngreedia. Kind of a disservice towards us gamers. Also, even AMD themselves said it many times: the 7900 XTX competes with the 4080, but who cares, right?
@grtitann7425 I said all this in my video, the price and everything. It is a 4080 Super competitor, and it performs very similarly to a 4080 Super in this game at a similar price, with the 4080 being more power efficient and having better upscaling, RT, and all that.
It's pretty close to the 4080 Super. That's not really unusual. On average the XTX is a bit faster in raster, but not by much. I'd consider the 4080 Super and XTX to be equivalent GPUs in rasterization, personally.
Please do not overclock your RTX 4090. If your RTX 4090 is an OC version with a 2565 MHz stock core clock (meaning +0 MHz), it will typically boost to 2760 or 2775 MHz on its own. My non-OC RTX 4090 with a 2520 MHz stock core clock (+0 MHz) typically boosts to 2715 or 2730 MHz. 😉
It depends on the card and temps. My 4090 will boost to 2955mhz and during the winter it will boost to 3050mhz. This is w/ power limits at 90% no undervolt or OC.
It doesn't make much of a difference anyway; OC or no OC, it's around a 5 to 10% advantage at best. The 7900 XTX can see some nice gains, up to 15% in some cases I have found.
Nvidia and AMD work differently when it comes to clocks. You need to measure power draw and utilization to estimate whether the GPU is reaching full utilization. It's weird, but it's why the 4090 can sometimes sit below 300W while still boosting at maximum clocks.
I look at the watts of both cards and start to cry when I see the AMD 7900 XTX number... it's around 60-100 watts more. That is so bad. I would buy a 4070 Ti Super or a 4080 instead of the 4090. With that poor power management, never AMD.
AMD’s motto seems to be 'the past is the future.' Anyone buying AMD GPUs really needs to sharpen their critical thinking skills. With their low-quality GPUs, AMD will never get my money, no matter how much exposure I have to their products.
@@hornantuutti5157 I'm not referring to poor quality in terms of weak construction or performance, but in recent years, two key features have been introduced that are truly game-changing: ray tracing and AI rendering. Ray tracing is almost as revolutionary as the shift from 2D games to 3D polygons, and AI rendering is what makes it all possible. However, AMD's marketing often pushes the narrative that traditional rasterization techniques are still the best, all while they quietly work to close the gap on these new technologies. Worse, they do this behind the scenes, using paid YouTubers to spread misinformation, which doesn't seem fair. Intel, for example, doesn't pay 'reviewers' to do this kind of thing. In my opinion, it's time to properly test AMD's RX 7000 series against NVIDIA's cards, especially since they now have dedicated hardware for ray tracing and AI. There's no reason to avoid direct comparisons of how both companies handle these newer technologies. This video does make a fair comparison, though.
Holy crap, that power efficiency on the 7900 XTX is crazy! 460W @ 88 FPS vs 400W @ 108 FPS, which is 5.2 W/fps vs 3.7 W/fps. The 4090 is a whopping 40% more power efficient!!
@@TerraWare both cards are overclocked, yes? Then the power efficiency comparison I made stands. Overclocked, the 4090 is 40% more power efficient than the 7900 XTX.
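The arithmetic behind that 40% figure, as a quick sketch using only the wattage and FPS numbers quoted in this thread:

```python
# Per-frame energy cost from the figures quoted above:
# 7900 XTX: 460 W @ 88 FPS, RTX 4090: 400 W @ 108 FPS.
def watts_per_frame(watts: float, fps: float) -> float:
    return watts / fps

xtx = watts_per_frame(460, 88)    # ~5.23 W per frame of throughput
rtx = watts_per_frame(400, 108)   # ~3.70 W per frame of throughput

# "40% more efficient" means the XTX spends ~41% more power per frame.
print(f"{xtx:.2f} vs {rtx:.2f} W/fps, ratio {xtx / rtx:.2f}")
```

Note this only holds for these specific (overclocked) readings in this one game; stock cards at capped frame rates would give different ratios.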
My 7900 XTX was constantly crashing due to "driver timeout," but once I uninstalled the AMD Adrenalin software I haven't had one crash. I can't believe AMD's own software was causing this... I regret not getting Nvidia.
Imagine spending 2x more for a 4090 to get 20% more fps xD Also, the XTX in this video is weird af. Even a 7900 XTX Sapphire Nitro+ won't use more than 408-420W, and his is pulling up to 480W. HOW? That seems impossible.
@@MacTavish9619 This is what people ignore. I would literally have to spend $1000 more on a 4090 for an extra 20%, and that is not good value for money. My XFX Magnetic Air 7900 XTX uses about 390 watts.