💻 *PC Specs - Buy on Amazon / Newegg*
Zotac RTX 4070 SUPER Trinity Black - geni.us/vBZBD7
Zotac RTX 4070 Ti SUPER Trinity Black - geni.us/9idrc
Zotac RTX 4080 SUPER Trinity Black - geni.us/U8to
Intel Core i9-14900K - geni.us/qYpgQeA
ASUS TUF Gaming Z690-Plus - geni.us/Q7g3
G.Skill Trident Z5 2x16GB, 6000 CL36-36-36-96 - geni.us/dznN
Streacom BC1 Open Benchtable - geni.us/9PgRsf8
Custom water cooled (CPU) - MO-RA 360, EK-Quantum Velocity Full Nickel
Setup: bit.ly/3H117uA

*GPU-Z Validation:*
4080 SUPER - www.techpowerup.com/gpuz/details/2ws9g
4070 Ti SUPER - www.techpowerup.com/gpuz/details/3gz6k
4070 SUPER - www.techpowerup.com/gpuz/details/bwv8m

*CPU-Z Validation:*
4080 SUPER - valid.x86.fr/4qn6s2
4070 Ti SUPER - valid.x86.fr/93bxbi
4070 SUPER - valid.x86.fr/aisvpc

🎮 *Games*
00:00 The Last of Us Part 1
00:44 Starfield
01:24 Avatar: Frontiers of Pandora
02:02 Cyberpunk 2077
02:45 Cyberpunk 2077 | Ray Tracing + DLSS Quality
03:28 Remnant 2
04:09 Alan Wake 2 | Ray Tracing + DLSS Quality
04:50 Ratchet & Clank Rift Apart | Ray Tracing
05:27 RoboCop: Rogue City
06:12 Forza Motorsport | Ray Tracing
06:45 Call of Duty Warzone 3
07:30 A Plague Tale Requiem | Ray Tracing
08:21 Star Wars Jedi Survivor | Ray Tracing
09:02 Immortals of Aveum

Intel CPU: PL1 & PL2 disabled, stock
GPU: rBAR enabled, stock
OS: Windows 11

🎥 Recorded with dual PC Setup (No FPS loss, but screen tearing..)
AVerMedia Live Gamer 4K, GC573 - amzn.to/39ODnJn

📊 Overlay - MSI Afterburner, Rivatuner, HWiNFO
Frametime graph: 0 - 50ms
My goodness! You need an RTX 4080 just for 1440p these days! Look at Starfield, Avatar, Immortals of Aveum, and Alan Wake 2. Just to keep the 1% Low above 60 in these games at 1440p, you need a $1,000 graphics card.
Not really. DLSS and frame gen help out tremendously. With frame gen and DLSS on Quality I get about 100 FPS everywhere in CP2077 with a 4070 Ti. Could probably get away with a 4060, tbh.
Keep in mind most of these benchmarks are running these games at ultra settings, so it's not as clear-cut as you would think. Yes, the 4080 is getting the most possible FPS in all of these examples. But factor in tweaks to the quality settings that really do not hurt the gameplay experience overall. And I'm referring to settings where you literally WILL. NOT. NOTICE. THE. DIFFERENCE. IN. VISUALS. Then you have to compare the price points to the performance. Are 24-40 extra frames really worth an additional $500? Even if the difference were, say, 60+ frames between the 4070S and the 4080S, I would still say probably not.
Naw, when I had my TUF 3080, with optimization and DLSS it performed very well with decent FPS in all those titles at 1440p. I've ended up sniping a TUF 4080 Super now.
Actually there are 14 games tested:
4070 Super: 981 total frames / 14 games = 70 average FPS, for $650 = $9.28 per frame
4070 Ti Super: 1,118 total frames / 14 games = 80 average FPS, for $800 = $10.00 per frame
4080 Super: 1,321 total frames / 14 games = 94 average FPS, for $1,000 = $10.64 per frame
The 4070 Super is the best value in dollars per frame.
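That dollars-per-frame arithmetic is easy to reproduce (numbers taken from the comment; note $650/70 rounds to $9.29 rather than the truncated $9.28):

```python
# Cost per average frame for each card, using the frame totals and prices
# quoted in the comment above (14 games tested).
cards = {
    "4070 Super":    {"total_frames": 981,  "price": 650},
    "4070 Ti Super": {"total_frames": 1118, "price": 800},
    "4080 Super":    {"total_frames": 1321, "price": 1000},
}
GAMES = 14

for name, c in cards.items():
    avg_fps = round(c["total_frames"] / GAMES)        # 70, 80, 94
    cost_per_frame = c["price"] / avg_fps
    print(f"{name}: {avg_fps} avg FPS -> ${cost_per_frame:.2f} per frame")
```

Lower is better, so the 4070 Super does come out ahead on this metric.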
In my country the 4070S is half the price of the 4080S. So I'm asking: is ~30% more FPS worth 2x the money? The answer is no)))) It's not. The 4070S is kinda the best right now.
@@otravoyadnoe Frame gen is awesome. I use it in all games and it feels so fast to play, not slower. I use frame gen in Cyberpunk, The Finals, Starfield, everywhere. I don't need to test it, I have a 4090 and I'm happy with FG. Free FPS!
@MakeSh00t There are two big problems I have with upscaling and frame gen. One is that they use these software technologies as the main selling point of their GPUs to justify cutting corners on the hardware. The second is that developers sometimes use them as a crutch to avoid optimising their games properly.

Instead, they should dedicate more die space to CUDA cores and RT cores for bigger performance increases in rasterization and ray tracing. Gamers would certainly appreciate and prefer that over lackluster, downgraded hardware propped up by frame gen and upscaling, and the extra raw performance would let developers push rasterized and ray-traced graphics to the next level. Obviously, with more cores and ROPs, more bandwidth is needed to effectively leverage those cores. GDDR7 should be used for the high-end and enthusiast-class RTX 50 Series GPUs; that should massively increase performance at high resolutions such as 4K and should also help with RT performance. All of those hardware upgrades to increase raw performance would be way better than shitty frame gen and upscaling.
Everyone seems to be impressed by a gimmick that interpolates an artificial frame between two existing real frames (the previous frame and the upcoming one), which adds latency. And sometimes artifacts.
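To illustrate the latency point: an interpolated frame can only be shown once the *next* real frame has been rendered, so in a simplified model the pipeline holds the image back by roughly one real-frame interval. A rough sketch (ignoring render and interpolation overhead, and driver-level mitigations like Reflex):

```python
# Simplified model: frame interpolation must buffer one real frame,
# so the displayed image lags by about one real-frame interval.
def added_latency_ms(base_fps: float) -> float:
    """Approximate extra latency from holding back one real frame."""
    return 1000.0 / base_fps

for fps in (30, 60, 120):
    print(f"{fps} FPS base -> ~{added_latency_ms(fps):.1f} ms added latency")
```

This is why frame gen feels worse at low base frame rates: the lower the real FPS, the longer each buffered interval, and the bigger the latency penalty.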
Each tier is about 16.7% faster than the one below it on average. But the price difference between the 4070 Super and 4070 Ti Super is $200/€200 at MSRP ($599 vs. $799). That is about 33% more expensive for a 16.7% performance increase. Between the 4070 Ti Super and 4080 Super there's also a $200/€200 price difference ($799 vs. $999). That is 25% more expensive for a 16.7-20% performance increase. The best value picks are the 4070 Super and the 4080 Super.
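A quick sketch of that price-vs-performance comparison, assuming US MSRPs of $599/$799/$999 and the ~16.7% average performance step per tier quoted above:

```python
# Percent price increase vs. percent performance increase between tiers.
# MSRPs and the ~16.7% performance step are assumptions from the discussion.
msrp = {"4070 Super": 599, "4070 Ti Super": 799, "4080 Super": 999}
PERF_STEP = 16.7  # % faster than the tier below, on average

pairs = [("4070 Super", "4070 Ti Super"), ("4070 Ti Super", "4080 Super")]
for low, high in pairs:
    price_step = (msrp[high] - msrp[low]) / msrp[low] * 100
    print(f"{low} -> {high}: +{price_step:.1f}% price for ~+{PERF_STEP}% perf")
```

The same $200 gap is a ~33% price jump from the 4070 Super but only a 25% jump from the 4070 Ti Super, which is why the middle card looks like the worst deal of the three.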
@@xPhantomxify I don't even think the 4080 Super fits into "best value" at all; it's an extra $500 for a negligible 25-40 additional frames if you're lucky. The 4070 S is the only card that's really good value.
This is amazing! Seeing the performance differences between each GPU is incredibly eye-opening. I've been going back and forth between your videos and had a question for you (if that's alright). I noticed you're using an Intel Core i9-14900K with a fantastic motherboard, and a lot of benchmark videos I've seen run similar high-end Intel setups. Given your expertise, what's your personal take on Intel vs. AMD CPUs for gaming?
If this video and others have convinced me of anything, it's that I do not need a 4080 Super, no matter how much I try to convince myself that I do, and no matter how much anyone says it's the best thing outside of the 4090. On average the FPS difference between the 4070 S and the 4080 S is 25-40 frames? I'll save $500 and just get the 4070 Super LOL. Thanks for the help with that decision!
I know how you feel. I keep doing mental gymnastics trying to justify the 4080 Super's price, but honestly I can't. The 4070 Super is the only one that makes sense, and it's a hell of a lot cheaper.
I'm new to PC building, so correct me if I'm wrong. But it looks like he did something to his CPU so it gets utilized less, which would mean the frame numbers are skewed if the CPU is bottlenecking the GPUs.
If the rumors are true that next year's 5070 will be on par with the 4080 in performance, and if it keeps the $599 price point, it's gonna be one of the best price-to-performance cards of the new generation. The only problem would probably be that it still has 12GB of VRAM.
Thank you, this was very informative. But including some DLSS FPS data would have been immensely helpful to many as well. I mean, who wouldn't enable that in-game if they can?
Remember when newly released graphics cards got you 100 more FPS in games? Like you'd go from 100 FPS minimum to 200 FPS average on a new release. But now they release the same crap over and over for 5-10 more FPS. WTF is this, mate? Stop buying graphics cards altogether and let them suffer; they could release way stronger GPUs far more often, and cheaper.
@@SpeejHere What are you talking about? You can see the CPU usage in the video. In most of them, it was 30% or lower across the board, and in the ones that were over 30%, it never went above 50%.
@@Stardomplay It's possible, but GPU usage is maxed out on every game at every resolution. It's true that the CPU can bottleneck a system by not being able to supply the relevant data to the GPU while still not constantly being at 100% utilization (though rare from everything I've ever read), but if that's what was happening, then the GPU wouldn't be at 100% usage. It would be waiting on the CPU and always be below 100% utilization.
@@Balthazzarr Rare? What? Who did you hear that from (I doubt you heard that from anyone)? Most games Will not utilize more than 12 threads and 4 to 6 cores which is why you often NEVER see utilization at 100 percent on 8 or even 6 core chips. You seem very green to understanding how PC hardware works.