
Will RTX 5090 Be Too Fast For Any Current CPU? 

DF Clips
68K subscribers
53K views

► Watch the FULL Video: • DF Direct Weekly #159:...
► Support us on Patreon! bit.ly/3jEGjvx
► Digital Foundry RU-vid: / digitalfoundry
► Digital Foundry Merch: store.digitalfoundry.net
► Digital Foundry at Eurogamer: eurogamer.net/digitalfoundry
► Follow on X/Twitter: / digitalfoundry

Published: 22 Apr 2024

Comments: 675
@UnimportantAcc 1 month ago
Just replace the CPU with a GPU silly 🥰🥰
@tsorakin 1 month ago
Like duh 😂
@vitordelima 1 month ago
This is kind of possible but it takes too long to explain. IBM Cell was one failed attempt at it.
@jmssun 1 month ago
The more you buy, the more you save ~
@daniil3815 1 month ago
exactly. you have money for 5090, but not for CPU upgrade lol
@photonboy999 1 month ago
@@daniil3815 I think you missed the joke. Anyway, I remember Intel trying to go the other way with Larrabee trying to use stripped-down x86 cores to create a GPU. It was interesting, but predictably was very inefficient for GPU tasks. So I'm curious WHY they put the money into it. I'm not going to just assume there was no purpose whatsoever based on my limited computer experience.
@professorJorge11 1 month ago
I need a 5090 for My 1080p monitor, like a fish needs a bicycle
@darkdraconis 1 month ago
Hey hey hey hey now! What kind of "fish-racist" are you? Does a fish not have the right to evolve into a bike riding creature? Incredible you anti fishists!
@professorJorge11 1 month ago
@@darkdraconis it's a song bruh
@darkdraconis 1 month ago
@@professorJorge11 it's a joke brah
@CeceliPS3 1 month ago
Let me teach you, professor. There's this thing called DLDSR. You can render games at 1440p and 1620p, maintain high-af FPS, and still get much better, crisper graphics on your 1080p monitor. Sure, a 4090 user (or even a 5090 user) could do with a 1440p monitor, but your analogy is entirely wrong in this case, as there is a use for those GPUs with a 1080p monitor.
@professorJorge11 1 month ago
@@CeceliPS3 I have a Radeon 7600. There's no DLSS, it's FSR2
@leo_stanek 1 month ago
What a world we live in where we are concerned that our GPUs have gotten so good we bottleneck the CPU at 4K high refresh rate. I remember when getting 1080p60 was the dream for high end hardware.
@stephenmeinhold5452 28 days ago
It still is for me on a 2080 Ti, although with DLSS I can go up to 1440p.
@Veganarchy-Zetetic 24 days ago
@@stephenmeinhold5452 Yeah, Alan Wake 2 can barely run at 1080p on my 4090 with ray tracing on lol.
@UTFapollomarine7409 23 days ago
my 3900x lacks in 4k in some areas believe it or not
@LordKosmux 18 days ago
@@Veganarchy-Zetetic You know that this is due to developers' laziness to optimize the game, right?
@Veganarchy-Zetetic 17 days ago
@@LordKosmux I would say it has a lot to do with Ray Tracing.
@byronfranek2706 1 month ago
32" 4K/240hz OLED displays would be an obvious target for the 5090.
@xpodx 1 month ago
8k 120hz/144hz
@GatsuRage 1 month ago
Even a 5090 wouldn't be able to push 4K 240 fps... unless you're only playing CS and LoL lmao, so I seriously see no point in looking at those displays yet. 1440p still makes way more sense for high refresh rates.
@mttrashcan-bg1ro 1 month ago
That's all well and good, but I've just gotten a 4k 240hz monitor, I have a 4090 and 5900X, and even games from 2015 are CPU bottlenecked, 240hz should never be a target. Making 8k gaming doable seems to be what it'll be targeting, we don't need better GPUs for 4k right now.
1 month ago
@@GatsuRage Well the more FPS the better. 4K looks way better than 1440P, and OLED is better than any display in the market.
@xpodx 1 month ago
@GatsuRage I get 180-220 in CoD Vanguard with my 4090 at 4K max, no DLSS. But yeah, the 5090 won't be able to do Cyberpunk 2077 maxed at 4K 240, and other similar games. But easier games, for sure.
@wickfut8917 1 month ago
VR needs more power. Always. My 4090 isn't good enough for high resolution headsets in graphic intense games. The new headsets on the horizon with 3800x3800 resolution per eye will easily chew through the power of the next few generations of GPUs.
@clockworklegionaire2135 1 month ago
Real
@mattzun6779 1 month ago
Who in their right mind would develop a game that NEEDS something faster than a 4090 to be good? If VR needs that much power, VR games need to go for several thousand dollars each to make a profit with current tech. One can hope that next-gen consoles and NVIDIA 6000-series midrange cards get to that level. Hopefully, there are tricks like frame generation, and higher resolution where you are looking, that help.
@rahulahl 1 month ago
@@mattzun6779 Not official games. But the UEVR mod allows you to play non-VR UE games in VR mode. Imagine running the latest UE5 games at about 6K resolution, aiming for a stable 90 FPS. My 3080 couldn't even run simple games like Talos Principle 2 at a playable quality or frame rate. Best I got was a blurry mess equivalent to 720p at low settings at about 80-ish FPS. This is why I am waiting for the 5080/90, so I can finally play those UE games in VR.
@xpodx 1 month ago
Yeah, and tons of games can do higher render scaling, and the 4090 is not strong enough for 4K max with 8K render scale at 144 Hz+.
@nossy232323 1 month ago
@@mattzun6779 I personally would hope games will be scalable enough to use all the power from the lower end up to the ultra high end.
@EmblemParade 1 month ago
As a 4K/120 gamer I can promise you that we're still GPU limited with the 4090. I often have to compromise on AAA games by enabling DLSS 3 or lower settings, and sometimes just hit 60 FPS. At the same time, I do think upscaling is changing our requirements and expectations, so I hope the silicon can be optimized around that. We don't necessarily need more pixel shader performance if we assume upscaling. The die space is better spent on other features.
@StarkR3ality 1 month ago
I can think of one title you would be fully GPU limited on with that card: Cyberpunk 2077's path tracing mode.. and that game is an outlier, and also extremely CPU heavy. I'm CPU bottlenecked in that title in certain areas on a 4070S, so I'd get your card looked at, because something's wrong there.
@lorsch. 1 month ago
And to max out high end VR headsets these days a 6090 is probably not enough...
@ghostofreality1222 1 month ago
@EmblemParade - What CPU and RAM are you running? CPU and RAM have a lot to do with it as well. But I also agree with @starkr3ality - a 4090 at 4K 120 should be running fine in all AAA games, with a couple of exceptions like Cyberpunk or Microsoft Flight Sim. A 4090 should be maxing out all AAA games at 4K x 120 Hz. You have got to be hitting a CPU bottleneck, and that is why you're having to lower graphics settings to get your desired results. Again, this is mostly assumption at this point, as I have no idea what CPU or RAM you're running, but this is what makes sense in my mind with what you stated in your post.
@EmblemParade 1 month ago
@@ghostofreality1222 Look at benchmarks from HardwareUnboxed, GamersNexus, and others, and see that you are very far from the mark. I have a 5800X3D and high-end DDR4 RAM. I'm not saying that I'm not having a great time with this setup, but forget about ultra settings AAA at 4K without the help of DLSS. The 4090 is great, but 4K is a lot of pixels.
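The arithmetic behind "4K is a lot of pixels" is easy to check; a quick sketch (resolutions are the standard 16:9 figures):

```python
# Per-frame pixel counts: 4K pushes 2.25x the pixels of 1440p and 4x 1080p,
# which is why a GPU that is comfortable at 1440p can fall well short at 4K.
def pixels(width, height):
    return width * height

p1080 = pixels(1920, 1080)   # 2,073,600
p1440 = pixels(2560, 1440)   # 3,686,400
p2160 = pixels(3840, 2160)   # 8,294,400

print(p2160 / p1440)  # 2.25
print(p2160 / p1080)  # 4.0
```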
@blackcaesar8387 1 month ago
@@StarkR3ality Cyberpunk is no longer an outlier... Alan Wake 2 made sure of that. I am guessing Hellblade 2 will further confirm it.
@markusmitchell8585 1 month ago
Diminishing returns at this point. Games are unoptimized; that's the only reason these ridiculously stupid specs are needed.
@saliva1305 1 month ago
so true
@justhomas83 27 days ago
Correct. I have an RTX 4080; I am not upgrading for another 3 years. It's getting pointless.
@saliva1305 27 days ago
@@justhomas83 I plan to get a 50-series since I'm on a 3070 Ti, but maybe AMD is an option too.. sad we have to buy top-tier GPUs to play games that should run on a 3080 with no frame gen
@justhomas83 27 days ago
@@saliva1305 I feel you. I just can't do it anymore. I have more bills coming in now and my daughter is graduating college. I understand, though; you're right, the top tier is the only path now. We are chasing rabbits in Alice in Wonderland at this point 😔😔😔 damnit
@Koozwad 26 days ago
Yes, exactly, and the fact that RT/PT exists - it's there to make people $pend, $pend and $pend some more. Amazing graphics are possible without RT/PT - just look at RDR2, from what, 2018(?). I've been saying for years now that devs should be using RT/PT as a TOOL to see how scenes should be lit, and then recreate them by NON-RT means, which would give players a ton of performance and be much friendlier to their wallets. Plus, hand-crafting it can look nicer.
@heyguyslolGAMING 1 month ago
I'll only consider the 5090 if it puts my house at risk of burning down. If it can't do that then its not powerful enough.
@ThePlainswalker13 1 month ago
Nvidia Power Plug Engineer: "Hold my half-caf soy milk grande caramel macchiato."
@EdNarculus 1 month ago
I'm an overheating enthusiast myself and would like to see products that carry risk of spontaneous human combustion.
@murray821 1 month ago
Easy, just put steel wool on it while playing
@chillnspace777 1 month ago
Just get a 14900KS, then you're good to go
@dieglhix 1 month ago
It will be more efficient than a 4090, which can be power-capped at 70% and still run at 98% performance.. meaning a 5090 will be able to run at its full potential below 300 W. Stock power is too much already.
@mchits9297 1 month ago
[It's 2030. I go to buy an RTX 9090 with all of my savings.] Me: Hey, do you have the latest GPU, maybe the RTX 9090? Dealer: (goes inside and comes back with an 8-foot server rack) Here's your GPU, sir. Just $100k.
@akam9919 28 days ago
rah. it'll be a tiny quantum board... but you need to buy a giant freezer-sized cooler... and not like the fridge you have at your house... like a giant walk-in freezer. You will also have to pay $500K to turn it on, wait 2 days for the thing to get cold enough, and then spend $2.3M to run Crysis, $2.345M for Doom, and $76B for Fortnite... no, the game will not look more realistic.
@mchits9297 27 days ago
@@akam9919 I might create physical sets for all those games with that kinda money 🤑💰
@mchits9297 27 days ago
@@akam9919 Virtual reality ❌ Reality ✔️
@GamingXPOfficial 23 days ago
This was somehow very hard to read/understand, but I got it in the end.
@LordKosmux 18 days ago
What if they get smaller instead? A GPU the size of your smartphone. And the price of a house.
@hoverbike 1 month ago
the 4090 is still hugely gpu limited in games like MSFS2020 in VR, and i expect 2024 to be utterly devastating in VR - and i'm all for it. We get closer and closer to Star Trek computer simulations every year.
@2drealms196 1 month ago
Visually, yeah, a single flagship video card is getting closer and closer when it comes to rasterization. But the holodeck's virtual NPCs have GPT-5-or-higher-level AI, the physics simulations are orders of magnitude more realistic and computationally demanding, true path tracing without the need for any denoising, ultra-realistic animations. Latency is also an issue with these LLM responses. So you'd need an entire futuristic datacenter's worth of power, with futuristic networking that provides exponentially quicker responses, so even a single flagship video card from 2045 wouldn't be enough.
@numlock9653 1 month ago
Actually, the CPU is definitely the bottleneck when using a 4090 in MSFS VR, in my experience. I have a 7800X3D and a 4090 using a Vive Pro 2 on highest settings, and no matter what graphical setting I change it makes little difference to frame rate, but if I lower traffic I get a huge boost, which is all CPU-based calculation. Very poor CPU optimization, unfortunately.
@hoverbike 1 month ago
@@numlock9653 The Vive Pro 2's res is very low
@hoverbike 1 month ago
@@numlock9653 huh, well you must've either clogged up that CPU with poor BIOS settings, or your Vive Pro 2 just has subpar resolution.
@Myosos 29 days ago
​@@numlock9653 get a better VR headset
@JamesSmith-sw3nk 1 month ago
There is always a bottleneck in a PC, depending on the resolution and game settings. Doesn't matter if it's a $400 PC or a $4000 PC.
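The "always a bottleneck" point reduces to a toy model: per-frame time is set by the slower of the CPU and GPU stages, so one of them is always the limit. A minimal sketch (the millisecond figures are made up for illustration):

```python
# Toy frame-time model: the frame ships when the slower stage finishes,
# so effective FPS = 1000 / max(cpu_ms, gpu_ms).
def fps(cpu_frame_ms, gpu_frame_ms):
    return 1000.0 / max(cpu_frame_ms, gpu_frame_ms)

# At 1080p the GPU is fast and the CPU's sim/draw-call work dominates.
print(fps(cpu_frame_ms=6.0, gpu_frame_ms=3.5))   # ~167 FPS, CPU-limited
# At 4K the CPU work is unchanged but pixel work quadruples: GPU-limited.
print(fps(cpu_frame_ms=6.0, gpu_frame_ms=14.0))  # ~71 FPS, GPU-limited
```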
@ZackSNetwork 1 month ago
Not exactly. If your hardware pairs together fine, the bottleneck can be the software and not the hardware.
@EspHack 1 month ago
That's why I aim for a monitor bottleneck
@user-rt4ct6tq3r 1 month ago
Such a dumb argument. There is a difference between something BARELY hindering the performance of another part and a HUGE hindrance.
@eugkra33 1 month ago
Alex said that especially with RT it'll be CPU bound, because of the BVH workload. But what the next generation could offer is moving BVH maintenance to the GPU, alleviating a whole bunch of CPU work.
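The BVH maintenance mentioned here is, at its core, a bounds refit: after geometry animates, every ancestor node's AABB must be recomputed bottom-up each frame. A minimal CPU-side sketch in Python (the `Node` layout is hypothetical; real engines do this in C++ or, as the comment suggests, in GPU compute):

```python
# Sketch of per-frame BVH refit work. Leaves hold the bounds of moved
# triangles; refit() recomputes every internal node's AABB from its
# children -- O(n) work per frame that today often lands on the CPU.
from dataclasses import dataclass

@dataclass
class Node:
    left: "Node" = None
    right: "Node" = None
    leaf_bounds: tuple = None          # set on leaves only
    aabb: tuple = ((0, 0, 0), (0, 0, 0))

def union(a, b):
    # Smallest AABB enclosing both input AABBs (element-wise min/max).
    (amin, amax), (bmin, bmax) = a, b
    return (tuple(map(min, amin, bmin)), tuple(map(max, amax, bmax)))

def refit(node):
    """Bottom-up pass recomputing every internal AABB from its children."""
    if node.leaf_bounds is not None:
        node.aabb = node.leaf_bounds
    else:
        node.aabb = union(refit(node.left), refit(node.right))
    return node.aabb
```

After animation updates the `leaf_bounds`, one `refit(root)` call restores a valid tree for ray traversal; offloading exactly this pass is the kind of CPU work the comment says a GPU could absorb.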
@Hi-levels 1 month ago
NPUs will also come to the rescue, either on GPUs or CPUs
@yesyes-om1po 1 month ago
@@Hi-levels I don't think that has anything to do with RT's BVH workload; the only thing an NPU could do better is AI denoising, but Nvidia already has dedicated hardware for that on RT cards, and I'm pretty sure an NPU as a separate piece of hardware would introduce too much latency for realtime denoising.
@BaieDesBaies 1 month ago
RT is so GPU intensive that I don't see how it could CPU-limit games. If I activate RT in games, CPU load tends to drop because the GPU is struggling. I have an i5 and a 3080.
@tomthomas3499 1 month ago
50%, 70%, or even double the power of the 4090 - as long as it's not melting its connector, it's fine by me
@ZackSNetwork 1 month ago
I expect 60% faster rasterization performance and 2.5x better ray tracing, while pushing the same power.
@iansteelmatheson 1 month ago
@@ZackSNetwork exactly. efficiency is really underrated
@nicane-9966 1 month ago
gonna pack 2 of those filthy-ass connectors man lol
@nicane-9966 1 month ago
@@iansteelmatheson It's not; it's just that those very-high-tier cards are made to draw the maximum amount of power possible. For efficiency you have the 80s and below.
@aberkae 1 month ago
@@iansteelmatheson It's the 4N node - how much efficiency can they squeeze out of a similar node 🤔
@holyknighthodrick5223 1 month ago
Parallelism is badly needed in many game engines, and modern consumer CPUs really need more cores. Single core performance uplift is too small to keep up anymore. More game engines need to adopt the strategy of running tasks in parallel, instead of just using a render thread, lighting thread, logic thread etc. Easier said than done, but it is the only real way forward.
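The task-parallel strategy described above can be sketched with a thread pool: split per-entity work into independent jobs fanned out across cores, instead of pinning one thread per subsystem. Python threads only illustrate the structure (a real engine would use a C++ job system), and `update_entity` is a made-up stand-in for simulation work:

```python
# Job-system sketch: a frame's entity updates submitted as independent
# tasks, so the work scales with core count rather than with the number
# of hand-picked subsystem threads.
from concurrent.futures import ThreadPoolExecutor

def update_entity(entity_id):
    # Stand-in for per-entity simulation work (AI, animation, physics).
    return entity_id * 2

def run_frame(entities, workers=8):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # map() preserves entity order while executing tasks in parallel.
        return list(pool.map(update_entity, entities))

print(run_frame(range(4)))  # [0, 2, 4, 6]
```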
@chrisguillenart 1 month ago
Ampere didn't have price cuts of any kind; it maintained the price hikes of Turing.
@Torso6131 1 month ago
I mean, 4K 120 has to be on the table for most games. Even 4K 90 - throw on some DLAA and call it good. Aside from totally broken PC ports, I feel like we're still GPU limited 99% of the time, especially if you have something like a 7800X3D.
@StarkR3ality 1 month ago
It depends on the res, like you've said, but in a lot of the games Alex mentioned and others, I'm bottlenecked on a 5800X3D with a 4070 Super at 1440p in a lot of modern titles, which is crazy, right? For me, the only use case for 90-class and even 80-class GPUs is if you're going to be using 4K and ray tracing in every title available. When you're getting GPUs like the 4090 that are doubling performance gen-on-gen, and then you get what, a 20% increase in perf going from a 5800X3D to the 7800X3D? CPUs cannot keep up, and it's really starting to show.
@WhoIsLost 1 month ago
@@StarkR3ality Something must be wrong with your PC if you're getting bottlenecked with that hardware. The 5800X3D is only 12% slower than the 7800X3D at 1440p.
@StarkR3ality 1 month ago
@@WhoIsLost Defo not, pal, I can assure you. I still get good performance, and I'm not talking about every title, only the recent big ones: Baldur's Gate 3, Cyberpunk, Witcher 3 next-gen. My point is, if I'm CPU bottlenecked at times in some titles, what's a 4090 gonna be, which I think offers 2.5x the performance? Not to even mention the 5090.
@JBrinx18 1 month ago
@@StarkR3ality A 4090 is only ~60% stronger than a 4070 Super. But yes, CPU performance is an issue... I think 4K will be a problem, and there's just not a market for 8K... The only avenue that's available might be VR
@Plasmacat91 1 month ago
@@WhoIsLost Negative. I have a 5800x3D and 6900XT and am CPU limited most of the time at 1440p.
@gnoclaude7945 1 month ago
I'll ride my 4070 Super until next-gen consoles drop. That's when the jump to AI NPUs for gaming and other new features will push me to upgrade. Honestly, it's overkill for 1440p in the games I play at the moment.
@LeoDavidson 1 month ago
Until it does triple 4K at 240 Hz without DLSS or frame generation, there's always room for more. :) Whether there's enough of a market for that outside of the sim-racing and flight-sim niches, I don't know, but I'd buy one.
@clockworklegionaire2135 1 month ago
That's not happening ever, with new features like RT and PT coming out
@fcukugimmeausername 1 month ago
Battlemage will do this.
@clockworklegionaire2135 1 month ago
@@fcukugimmeausername You are clearly out of your mind
@xpodx 1 month ago
8K 144 Hz will come out soon; we'll definitely need more power. Never enough.
@Sota_eth 1 month ago
64k even
@Monsux 1 month ago
I will just use DLDSR + DLAA on a 4K 120 Hz TV/monitor. The CPU won't be the limiting factor, and I'm always getting a graphical upgrade. Add path tracing with maxed-out settings, and the GPU (even an RTX 5090 Ti Super) would scream for help. I just love DLDSR and how versatile it is for all types of games… Doesn't matter if I'm playing new or older titles.
@Boss_Fight_Wiki_muki_twitch 1 month ago
Considering the 5090 will be $1700 at least, it needs to be twice as fast as the 4090 to be worth it.
@SuperSavageSpirit 1 month ago
Rumor is it's 70%.
@Chuck15 1 month ago
twice? 🤣🤣🤣🤣
@squirrelsinjacket1804 1 month ago
A 70% raw performance improvement, along with a better version of frame gen to boost frame rates even more than the 40-series version, would be worth it.
@daniil3815 1 month ago
that's a weird argument.
@dpptd30 1 month ago
I don't think so; it will likely be above $2000 due to them using the same node as Ada and Hopper. Their datacenter B100 is already twice as large as the H100 in order to get a decent performance uplift on the same node, and twice as large on the same node should mean twice as expensive, especially when they still aren't using chiplets.
@garethperks7032 1 month ago
Thankfully, AMD have opened a door for us with X3D. CPUs finally start becoming useful for gaming with ultra-high-speed memory (e.g. on-die cache).
@blast_processing6577 16 days ago
At this point, graphics cards are a side hustle for Nvidia's primary business: AI infrastructure.
@ijustsawthat 1 month ago
Not if it burns your whole house down. Can't they design a better connector instead?
@user-rt4ct6tq3r 1 month ago
Can you have more strength than a 12yr old virgin and properly seat the connector?
@ijustsawthat 1 month ago
@@user-rt4ct6tq3r If you think it's force related, you need to watch/read more on the topic. Gamers Nexus did a full investigation of these connectors and clearly demonstrated how they are flawed by design.
@rambo9199 1 month ago
@@user-rt4ct6tq3r Has this ever been a problem in the past for you? Are you speaking from experience?
@user-rt4ct6tq3r 1 month ago
@@rambo9199 deflect more virgin boy.. wanna see a video of my 4090 ive had since launch working perfectly fine? I bet you dont 😂
@Oliver-sn4be 12 days ago
@@user-rt4ct6tq3r What if you move your PC and it comes out a bit, hmm? It's not like you always open the case to check; there is always a chance of it. And most of all, even if you push it all the way in, it can still burn 🔥 There is never a 100% guarantee that it won't 😂
@British_Dragon-Simulations 1 month ago
My RTX 4090 GPU usage percentage is all over the place in 4K and in the Pimax Crystal (2x4K) with an i9-12900k. It never stays at 99% or even 98%. It usually varies from 100% to 80% constantly in 4K. My RTX 3080 usually stayed at 99% to 98% in 4K. In GPU-Z I’m now limited by Voltage and the limiting factor line is always blue instead of green. Even when I overclock my P-Cores to 52 and E-Cores to 40. My next PC upgrade may need to be the 14900ks.
@Goblue734 1 month ago
I have an RTX 4090. The only way you'd see me with a 5090 is if we can get 4K visuals with RT and at least 60 FPS of rasterized performance, no frame gen or DLSS.
@SafMan89 1 month ago
You're the 0.1% who upgrade top end GPUs each generation
@ZackSNetwork 1 month ago
What games do you play? I can do that already with my 4090
@JeremyFriebel 1 month ago
@@ZackSNetwork Same, but with a 4080
@kevinerbs2778 1 month ago
Never going to happen in the next 3 years. RT takes about 100x more computational power than rasterization does, and most RT relies on rasterization as a base. Expect Blackwell to be only 20%-30% faster than an RTX 4090 at most; unless Blackwell comes out with a massive 224 ROPs or more, it isn't going to be that fast.
@aberkae 1 month ago
DLSS set to DLAA is where it's at though, imo. Better than TAA at native resolution.
@chrissoucy1997 1 month ago
GPUs are getting faster at a pace that CPUs can't quite keep up with. I have an RTX 3090 paired with a Ryzen 7 5800X3D, and in games with heavy ray tracing like Spider-Man Remastered, my 3090 is CPU limited even at 1440p, and to some degree at 4K. I am itching for a CPU upgrade right now way more than a GPU upgrade; my 3090 is still fine. I am looking to upgrade to a 9800X3D when it comes out.
@vitordelima 1 month ago
Stupid methods of rendering that move too much data around, need to rebuild complex data structures all the time, ...
@StreetPreacherr 1 month ago
It sounds like the game engines aren't designed 'properly'... Can't the (RTX) GPU handle most of the processing necessary for high-quality ray tracing? I didn't realize that even with RTX, ray tracing was most often restricted by CPU performance! Isn't the GPU supposed to be doing all the additional ray tracing processing?
@vitordelima 1 month ago
@@StreetPreacherr There are a lot of intermediate steps that need a lot of CPU; for raytracing, the spatial subdivision seems to be one of them, for example. Uncompressed assets, assets with excessive detail, poor code parallelism... are other common causes of bottlenecks.
@ZackSNetwork 1 month ago
That's because Ampere GPUs handled data weirdly. The 4070 Super is way faster than a 3090 at 1080p, and faster at 1440p as well.
@mackobogdaniec2699 1 month ago
Spider-Man is a very specific game; it is heavily CPU-limited (but at high fps, not like DD2), but it is an exception. It is hard to find visible CPU bottlenecks in most games at 4K with a 4090 and a top CPU (or even a new mid-range one, or something like the 5800X3D). As for RT, I think it differs heavily from game to game. It's very CPU demanding in S-M:R or especially Hogwarts Legacy, but not at all in CP2077.
@powerpower-rg7bk 1 month ago
The things I'd be hoping for on an RTX 5090 would be two 8-pin power connectors and a return to more sane power consumption. More/faster VRAM would be nice too, as I feel that has been the RTX 4090's bottleneck, especially at 4K. Other grab-bag features would be the return of NVLink/SLI support to scale up via multi-GPU, and integrating some Thunderbolt 5 controllers. It'd be nice to be able to plug a USB-C monitor directly into the GPU without external cabling and get full USB support on the display and other peripherals connected to it. Similarly with Thunderbolt 5, it'd be clever to include a mode where you could use the GPU externally with a laptop without the need for a Thunderbolt bridge board in an external chassis. Literally just the card, a power supply, and a power switch to turn it on. The PCIe slot connector would go unused.
@KontikiChrisSK8FE 20 days ago
I would honestly not mind if it were too fast, or even more power efficient with the new architecture, because I would mainly use a card like that for rendering purposes.
@CrashBashL 20 days ago
That's why we need an ARM/RISC architecture as soon as possible.
@dystopia-usa 1 month ago
Once you hit certain quality-experience performance thresholds in gaming, it doesn't matter and becomes overkill for the sake of giggles. It only matters to professional benchmarkers and internet braggarts.
@jaredangell5017 1 month ago
The 9800x3d will be able to handle it. Nothing else will though.
@MarcReisSyllogism 20 days ago
VR performance would be one thing: the likes of the Pimax 8K and 12K HMDs can sure burn some GPU time (and CPU). Of course AI as well - maybe an AI API for gaming use - plus the mooted PhysX return and ever more RTX (even in VR).
@phizc 1 month ago
Cyberpunk 2077 with path tracing in VR using VorpX/Luke Ross with a render resolution of ~10000x5000 at 90+ Hz would probably break the 5090 Ti Egregious Super too. Maybe when 8090 TiTi Super Duper comes around.
@Vincornelis 1 month ago
The CPU side is interesting, because CPU limitations seem to be universal to all modern CPUs. Pretty much every major release will either run perfectly fine on the now-iconic 3600, or, if it's struggling on that, it's struggling on everything. Having a faster CPU with more threads seems not to make much of a difference at this point. Most modern CPUs don't look in any inherent danger of being underpowered for the job. It's just game developers struggling to get to grips with multithreading, and that affects all modern CPUs pretty much equally badly.
@bricaaron3978 1 month ago
*"It's just game developers struggling to get to grips with the multithreading..."* No, it's just that ever since the Great Consolization of 2008, all AAA games have been designed and coded for console HW. Current consoles have only 6 cores available to games. But further, those cores are considerably less powerful than even the cores of an 11-year-old 4770K.
@orlandoluckey5978 20 days ago
@@bricaaron3978 cap, ps5 and new xbox both have cpu's equivalent to a ryzen 7 3700x. it features 8 cores and 16 threads. idk where you got your info but its wrong my guy
@orlandoluckey5978 20 days ago
@@bricaaron3978 running at 3.8 GHz
@bricaaron3978 19 days ago
@@orlandoluckey5978 *"cap, ps5 and new xbox both have cpu's equivalent to a ryzen 7 3700x. it features 8 cores and 16 threads. idk where you got your info but its wrong my guy"* I repeat: Both the PS5 and the XBox Series X have only 6 cores available to games, just like the PS4 and XBox One. Each of those cores has a significantly lower FLOPS than a 4770K from 2013.
@Zapharus 1 month ago
"Hi guys exclamation point" LOL Dafuq! That was hilarious.
@odiseezall 23 days ago
It's all about VR... the 50x0 series will be the first that's really capable of full-immersion, high-quality, hi-res VR.
@Skrenja 13 days ago
Yep. PCVR is where these overkill cards will shine.
@kaslanaworld4746 29 days ago
Well, the 9800X3D should be releasing soon, once the 5090 launches
@johndavis29209 1 month ago
Rich is a gem.
@xXmobiusXx1 1 month ago
The problem is x86: no matter how low-level your API is, you still have to run everything through the CPU due to how the PCI bus works. Basically we would need a bypass of some sort, akin to what AGP did.
@mkreku 1 month ago
I use an RTX 3090 right now, and I would actually be interested in upgrading if they started making mini versions of GPUs again. The 40-series (and AMD's 7900 series) are all gigantic, and since I build ITX rigs, they're just not interesting to me. But imagine if they used the smaller process nodes to keep performance the same and instead used the node advantage to shrink GPUs. One can dream.
@Katastra_ 22 days ago
Part of the reason why I went with the NR200 for my first ITX build. I still remember seeing someone with a 4090 in theirs lol; makes my 3080 FTW3 look tiny
@krz9000 6 days ago
There is no problem keeping a GPU busy with path tracing. More samples... more bounces... until we reach unbiased territory
@DefinitelyNotPedro 1 month ago
Honestly, maybe a slight increase in performance over a 4090 but with much more efficiency would be awesome. Imagine 4090-level performance at only 200 watts, for example; it would be crazy!
@ZackSNetwork 1 month ago
Dude, then just get a 5080. The 5090 will have 60% more rasterization performance and 2.5x better ray tracing than the 4090, while pushing the same amount of power. It's called "performance per watt", not "low watt output".
@DefinitelyNotPedro 1 month ago
@@ZackSNetwork I'm not going to get either; I was commenting on what I think would make a good product.
@lharsay 1 month ago
That might happen in 2 or 3 generations, not one. The 4060 Ti just reached the 2080 Ti's performance under 200 W, but the 2080 Ti was a 350 W card at most, not 450 W.
@DefinitelyNotPedro 1 month ago
@@lharsay Maybe! I'm just speculating here; the 4060 has around the same performance as the 3060 but uses 50% less power
@user-rt4ct6tq3r 1 month ago
Imagine undervolting.. derp. My 4090 runs at 3 GHz with an undervolt, and in most games I'm at 300 watts or less.. peaks are 350-ish.
@smurfjegeren9739 1 month ago
I just hope I'll be able to afford a 5000-series card. And be able to fit it into my tiny case
@altaresification 1 month ago
I wonder if an ASIC on the GPU would be able to compile shaders on the fly, provided a new API is exposed for that.
@jackpowell9276 29 days ago
I mean, with more GPU power comes more scope to add VFX, textures, higher resolutions, frames, VR, etc.
@vertigoz 1 month ago
What I want is a 240 Hz 4K CRT
@nephilimslayer 27 days ago
AAA titles at 4K with RT on will be the sweet spot for the 5090
@doityourself3293 11 days ago
The optical GPU is just around the corner. So is the optical CPU, 10K times faster than what we have now.
@diggler64
@diggler64 1 month ago
I play DCS and MSFS 2020 with a 12900K and 4090 in VR, and I can get the fps I want; then it's all a matter of turning knobs for how much visual quality you want, and visual quality makes you GPU limited. So... for me a 5090 would probably help.
@viktorianas
@viktorianas 3 days ago
CPUs are light years ahead of GPUs at 4K resolution and above (VR).
@user-qv2wd2jc6m
@user-qv2wd2jc6m 9 days ago
Nice conversation, but did they really answer the question? I think the user was asking how the most powerful consumer graphics card money can buy can really enhance gaming experiences when we also need equally powerful CPU performance to make that happen, which often causes choking on the GPU.
I can name a plethora of things video games are lacking today that REALLY need improvements compared to what we have had for the past decade or two: real-time physics (this will always need improvements), real-time water physics (this is EXTREMELY held back by limitations in CPU calculations), 3D volumetric special effects (I believe this is both GPU and CPU focused), as well as robotic NPCs that suck you out of immersion (this is heavily CPU bound as well). Point being, we can have these powerful GPUs, but if video game experiences don't improve, we are stuck with the same mediocre gameplay with shiny polished graphics at 556 FPS.
Also, VR, to me at least, IS the absolute FUTURE OF GAMING: being able to actually transport yourself into these incredible worlds requires much more powerful hardware, both GPU and CPU. My hope is that solutions are found for these concerns so developers can have more fun creating games and not be constantly limited by hardware.
PS5 was really exciting when it released because of the way they saw SSDs bottlenecking a lot of freedom for developers and found a solution that even pushed into the PC SSD market, since most PC SSDs at that time weren't hitting the speeds PS5 was because of I/O throughput issues. My hope is Cerny and his amazing team come up with bottleneck solutions for improving CPU performance, allowing more efficient multithreaded performance in games, that will stretch out into the industry and become standard, you know?
And I imagine AI will also assist in numerous ways with improving performance and user experience, and with helping developers make games more polished and advanced. SO exciting to think about...
@capslock247gaming9
@capslock247gaming9 1 month ago
I have a Samsung Odyssey G8 4K 240Hz. I'm running an i9 14k and a 3090 Ti. I'm hitting 140 fps on ultra settings in story games, and in FPS games like CoD and Apex on low settings I peak at 240Hz. Once you get used to 4K, good luck going back to a lower resolution.
@R-yb6xt
@R-yb6xt 1 month ago
High-end VR-oriented features? Not melting?
@kathleendelcourt8136
@kathleendelcourt8136 1 month ago
People getting GPU limited in 95% of their games: ...
The same people getting CPU limited in the remaining 5%, of which only 1% actually results in a sub-100fps framerate: OH NO, I'M CPU BOTTLENECKED!!
@GabrielPassarelliG
@GabrielPassarelliG 1 month ago
There are ways to spend the extra GPU power, like on monitors with higher resolution and refresh rate. And if you care a lot about a specific game where a CPU bottleneck is a thing, then invest more in the CPU and less in the GPU. Not hard, given high-tier GPUs cost multiples of high-tier CPUs.
@johnnymosam7331
@johnnymosam7331 1 month ago
Yeah, pretty much. The CPU rarely bottlenecks unless you're playing an RTS.
@FantasticKruH
@FantasticKruH 25 days ago
Not to mention that the 4090 and 5090 are 4K cards; even older CPUs rarely bottleneck the 4090 at 4K.
@mushroom4051
@mushroom4051 1 month ago
Lazy optimization makes hardware upgrades a must. Look at MGSV: the Fox Engine can run on old CPUs.
@johndzwon1966
@johndzwon1966 1 month ago
CPUs only bottleneck at low resolutions/high refresh rates. It just means that when it's time to upgrade the CPU, I won't have to worry about upgrading the GPU (future proofing).
@SuprUsrStan
@SuprUsrStan 1 month ago
Just get a G9 57" monitor or any other 4K+ monitor. You'll instantly be GPU bound again.
@ainzoalgownz1570
@ainzoalgownz1570 10 days ago
The 7800X3D is rarely used above 50-60% in games with the 4090 (at 4K). I don't think the 7800X3D will bottleneck it when it comes to gaming at all. It will still be an amazing CPU for it.
@Cblan1224
@Cblan1224 5 days ago
Can't wait for games to be even less optimized and just throw a 5070 Ti into the recommended specs.
@vexun11
@vexun11 23 days ago
Will the i9-14900K be able to work well with a 5090, even at 35000 watts?
@ymi_yugy3133
@ymi_yugy3133 1 month ago
Improvements are probably going to be modest, both in software and in hardware. More interpolated frames is not bad, but it's in the marginal-gains category. I see a couple of fronts where they could make progress, though most of this probably won't come with Blackwell. Predicted generated frames: instead of interpolating between real frames, the next frame is simply predicted, which gets rid of the latency issue. A more integrated neural rendering approach. Lots of deep-learning-powered NPCs in games.
@OldMobility
@OldMobility 1 month ago
No it's not. Gaming in 4K with ray tracing, even with DLAA or DLSS, is the most beautiful graphics I've ever seen, and that's just with my 3090. With a 5090 I could turn off DLSS, go native, and still get spectacular FPS.
@tommyrotton9468
@tommyrotton9468 1 month ago
Maybe not, if the GPU of the 5090 is used to take AI calculations off the CPU, like it did with PhysX. It could well max out the refresh rate at 1080p and 1440p, but 4K is getting to 144Hz.
@FMBriggs
@FMBriggs 1 month ago
I love questions like this because they assume on some level that large companies (Nvidia, AMD, Microsoft, Intel, etc.) wouldn't be thinking about bottlenecks or actively working on new ways to utilize cutting-edge hardware.
@boedilllard5952
@boedilllard5952 24 days ago
I'd love to see less power draw: not gonna happen. A lower price: not gonna happen. Thinner: not gonna happen. So that leaves about 4 times the ray tracing performance. Probably not gonna happen either.
@0wl999
@0wl999 16 days ago
"4K screens coming online now..." rofl, as if they haven't already been out for a year or two.
@Accuracy158
@Accuracy158 3 days ago
My 4090 is already CPU limited in basically all games (and very CPU limited in many of my favorite games), but I play at 1440p.
@HielUFF
@HielUFF 1 month ago
I have no idea why they are talking about 1080p. It is a GPU designed and intended for 4K or more, and at those resolutions it is very clear that it will not have problems with current-generation processors, even mid-range ones like a 7600 or an i5. Hi from Venezuela!!
@deathstick7715
@deathstick7715 1 month ago
What are they going to do after the 9090 comes out? Will it be the 10k90?
@Alp577
@Alp577 1 month ago
Probably go back to small numbers, like AMD did with the HD 7970 > R9 290 > R9 390, etc.
@alexis1156
@alexis1156 1 month ago
I think it's too hard to tell. At 4K probably not; at lower res, maybe. But the next gen of CPUs is also looking great.
@GabrielPassarelliG
@GabrielPassarelliG 1 month ago
Who'd buy a 5090 to play at less than 4K?
@alexis1156
@alexis1156 1 month ago
@@GabrielPassarelliG Probably no one, I would say.
@FantasticKruH
@FantasticKruH 25 days ago
I really doubt it at 4K. The 4090 is pretty much the bottleneck at 4K even if you have an older CPU.
@InternetOfGames
@InternetOfGames 16 days ago
I'm getting ready to add a new CPU to my 4090 before adding a 6090 to my next CPU.
@1stcrueledict
@1stcrueledict 28 days ago
The best current pairing for a 5090 will be the 7800X3D. It's only like 300 bucks, and the AM5 platform is gonna be around for a few more years, so we'll get a more capable one at some point in the future. No use wasting the extra 300 on a 7950X, or wasting 700 on having to replace an Intel board and CPU.
@yourhandlehere1
@yourhandlehere1 10 days ago
Frame gen is like margarine gaming. I like real butter.
@Fiwek23452
@Fiwek23452 1 month ago
The current 4080 and 4090 are already bottlenecked by current CPUs; hyper-threading and E-cores are total BS.
@numlock9653
@numlock9653 1 month ago
The problem remains a lack of multithreaded optimization in games. Modern CPUs just can't fix the fundamental nature of games being programmed to rely heavily on a single thread. It would be in Nvidia's best interest to explore the use of AI to potentially solve this issue. Otherwise there is little reason to scale much beyond a 4090 without a miracle in single-threaded CPU performance.
@nhmgmmmmhjm
@nhmgmmmmhjm 1 month ago
Games used to be GPU bound; over the past few years they've become both CPU and GPU bound. Impressive regression in game development.
@BlueRice
@BlueRice 1 month ago
Any performance gain from the CPU and GPU is always a gain. Now the age-old question: how much are people willing to pay for that little (or large) gain?
@mrmrgaming
@mrmrgaming 1 month ago
If they rush a 2024 launch, I will wonder more about 4090 owners needing to upgrade. Normally there is some big, melts-your-PC (no pun) game that calls for the upgrade, but the only one I can see that might gain from a 5090 is Stalker 2. I would have thought mid-2025 would have been better.
@armandocardona6975
@armandocardona6975 1 month ago
1440p 32:9 ultrawide at 240Hz is the goal.
@FreakyAndrew428
@FreakyAndrew428 1 month ago
I wish my games would be CPU limited when I get an RTX 5090 to use on my Neo G9 S57 at 7680x2160 @ 240Hz 😅 So here you have a real use case.
@mahouaniki4043
@mahouaniki4043 1 month ago
$2500 for +15% max over the 4090.
@francoisleveille409
@francoisleveille409 1 month ago
A GeForce RTX 3060 is right at home with an old i7-4790, especially when playing games in 4K, so a 5090 would be right at home with either a 13th/14th-generation i9 or a Ryzen 9 7950X/7950X3D.
@vonbleak101
@vonbleak101 21 days ago
I play at 1080p and have an i9-13900K and a 3080. I have no issues with any modern game and probably won't for another couple of years at least, lol. The 5090 would be insane for me, haha.
@lil----lil
@lil----lil 1 month ago
You know what's gonna happen? Nvidia will be like: so you guys can't keep up with us, huh? We're gonna design our own CPU! That's _exactly_ what happened to Intel. Apple was like: get your $hit together or we're going our own way. Now we have M1/2/3 and soon M4. Watch Intel/AMD stocks TUMBLE when Nvidia announces its own CPU!!! AMD/Intel had better get their $hit together.
@bartjandejong9412
@bartjandejong9412 5 days ago
No. I use VR in ACC, and with my 4090 I still can't run 100% resolution. I need the 5090.
@yumri4
@yumri4 1 month ago
The idea of expecting game developers to write DX12 code so the CPU hands off GPU tasks better has an issue: right now DX12 is a wrapper, and to code for what they are talking about, the wrapper would have to change, and that change would mean limited CPU support. Even CPUs in the same SKU might not be supported if they do it wrong and remove too much of the wrapper just to let the GPU issue work to itself instead of going GPU to CPU and back to the GPU. There are so many ways it can be messed up; there is a reason the API was made.
Yes, skipping it is quicker. Coding directly for the CPU, which is part of what DX12 allows you to do, is a thing, but I think it was made for the data center, not the consumer. Like .NET, games contain both data center parts and consumer parts. In the data center you know the CPU model in every server, since you can go look it up and know what to put into the code. For games there are literally millions of possible CPUs that might be in the system, so saying the game will only work on this generation of CPUs, not the next and not the previous, will kill the game once that generation of CPU is no longer sold. Good idea for the server market, bad for the consumer market.
The part where you go through the DX12 wrapper is slower than direct access to the hardware. How much slower? 1 to 100 cycles, which isn't much when you consider the CPU is usually doing nothing, waiting on another piece of data to be sent to it. This path is used because the code doesn't have to target a specific CPU, just any x86 CPU. Yes, there is a slowdown, but with all APIs you have a slowdown.
The RTX 5090 will most likely be bottlenecked by the coding, not the CPU. In a pure GPU load, the RTX 5090 will most likely be bottlenecked by PCIe Gen 4 x16 speed, not the CPU. Better coding is needed, but code that is easily readable by humans is what is taught in schools, and the compiler used doesn't always produce the objectively best binary output.
@tryharder1053
@tryharder1053 29 days ago
Black Myth: Wukong and my 8K mini-LED TV enter the chat.
@saesang352
@saesang352 1 month ago
Honestly, just stop it with this CPU nonsense. Unless you are competitive gaming at 120-240Hz, a 3700X will be fine for 60fps for another 4-5 years.
@chy.0190
@chy.0190 1 month ago
It's not nonsense. Saying a 3700X would be fine for another 5 years is crazy when it bottlenecks the latest high-end GPUs even at 4K.
@XAMPOL
@XAMPOL 22 days ago
Not everyone plays at 1080p/1440p... and only 60 FPS??? You're kidding me!
@RafaelSilva-yv3oh
@RafaelSilva-yv3oh 1 month ago
Can it run Cyberpunk at 480Hz 1440p? Because that's gonna be my next monitor.
@SpOoNzL
@SpOoNzL 1 month ago
The 5090 needs a built-in smoke detector above the 12VHPWR cable.
@tamish3551
@tamish3551 1 month ago
Nvidia is bundling it with a popcorn attachment, so if your GPU burns, at least you get a snack.
@eliadbu
@eliadbu 1 month ago
I thought about some elegant solution, like a 12VHPWR port with an integrated temperature sensor, or some way to sense resistance in the connector. Or just make sure the new 12V-2x6 connectors work as they're supposed to.
@superthrustjon
@superthrustjon 26 days ago
No joke, I just cashed in about 1,000,000 Marriott points for over $3,000 in Best Buy gift cards 😂 getting ready for the 5090.
@PaulRoneClarke
@PaulRoneClarke 19 days ago
As someone with no interest in resolutions above 1440p who can personally perceive no benefit from frame rates above about 80, none of this bothers me... at all.
@kilroy987
@kilroy987 22 days ago
Oh, I'm sure I can make a 5090 choke with PCVR.
@charizard6969
@charizard6969 1 month ago
This is a non-issue at 4K when Ryzen 9000 comes out; it is set to improve gaming by 17.3% across the board.
@hartyeah
@hartyeah 22 days ago
I'm gaming at 7680x2160 on a Samsung G9 57" with a 4090 and a 7800X3D. I sure hope I can get more fps with the 5090.
@neti_neti_
@neti_neti_ 15 days ago
It is unlikely that even the upcoming Intel Arrow Lake 15900K or AMD Ryzen 9 will be able to keep up with the RTX 5090. Intel, in particular, will need to introduce its Novalake (16 P-core, xx E-core) with DDR6 RAM to support the RTX 5090 ASAP.
@bamazeen
@bamazeen 21 days ago
Way to bring the energy, boys.
@jocerv43
@jocerv43 1 month ago
I need it for my Minecraft shaders.
@brkbtjunkie
@brkbtjunkie 1 month ago
Depends on your target framerate, obviously. Also, not everyone likes the DLSS reconstruction techniques, specifically in motion. A 5080 is probably going to be my upgrade from a 3080.
@RiderZer0
@RiderZer0 1 month ago
I'd be most interested in how much quicker the 5090 renders in Blender than the 4090. My 3090 still takes forever on highly detailed projects.