
Is the fastest GPU ALWAYS the best? RTX 4090 Review 

Linus Tech Tips
16M subscribers
2.7M views

🌏Get Exclusive NordSecurity deals here ➼ nordsecurity.com/linus All products are risk-free with Nord's 30-day money-back guarantee!✌
Get 50% off on your annual Zoho CRM subscription at: lmg.gg/ZohoCRM
Nvidia’s GeForce RTX 4090 is a GPU that’s been rumoured to be a power-hungry beast since it was first leaked. Is it really that bad? And can its gaming performance possibly justify its massive price tag?
Discuss on the forum: linustechtips.com/topic/14604...
Purchases made through some store links may provide some compensation to Linus Media Group.
► GET MERCH: lttstore.com
► SUPPORT US ON FLOATPLANE: www.floatplane.com/ltt
► AFFILIATES, SPONSORS & REFERRALS: lmg.gg/sponsors
► PODCAST GEAR: lmg.gg/podcastgear
FOLLOW US
---------------------------------------------------
Twitter: / linustech
Facebook: / linustech
Instagram: / linustech
TikTok: / linustech
Twitch: / linustech
MUSIC CREDIT
---------------------------------------------------
Intro: Laszlo - Supernova
Video Link: • [Electro] - Laszlo - S...
iTunes Download Link: itunes.apple.com/us/album/sup...
Artist Link: / laszlomusic
Outro: Approaching Nirvana - Sugar High
Video Link: • Sugar High - Approachi...
Listen on Spotify: spoti.fi/UxWkUw
Artist Link: / approachingnirvana
Intro animation by MBarek Abdelwassaa / mbarek_abdel
Monitor And Keyboard by vadimmihalkevich / CC BY 4.0 geni.us/PgGWp
Mechanical RGB Keyboard by BigBrotherECE / CC BY 4.0 geni.us/mj6pHk4
Mouse Gamer free Model By Oscar Creativo / CC BY 4.0 geni.us/Ps3XfE
CHAPTERS
---------------------------------------------------
0:00 Intro
0:57 RTX 4090 - More than a mere refresh?
2:51 Test Setup and why we didn't run 22H2
3:15 4K Gaming Results
4:56 1440p Gaming Results
5:26 Ray Tracing & DLSS Gaming Results
6:36 Where's DLSS 3.0?
7:17 Productivity Results
8:15 AV1
9:27 Power consumption
10:30 Thermals & clock stability
11:31 Case air temperature
12:16 Some quirks... DisplayPort and PCI Express
14:37 Conclusion

Science

Published: 25 Jun 2024

Comments: 6K
@LinusTechTips · 1 year ago
CORRECTION: We're working on updating the video, but in the meantime, our numbers for Cyberpunk 2077 were with FidelityFX Upscaling enabled. We specifically *didn't* have this enabled, but stability issues with the bench seem to have messed with the settings. We've re-run all of the numbers for each card (1% low / 5% low / avg FPS):
*No RT, no DLSS: RTX 4090: 54 / 69 / 81 · RTX 3090 Ti: 43 / 46 / 56 · RTX 3090: 35 / 43 / 50 · RX 6950 XT: 30 / 39 / 46
*RT, no DLSS: RTX 4090: 36 / 39 / 44 · RTX 3090 Ti: 17 / 22 / 26 · RTX 3090: 16 / 19 / 23 · RX 6950 XT: 10 / 11 / 13
*RT + DLSS: RTX 4090: 94 / 97 / 108 · RTX 3090 Ti: 58 / 60 / 67 · RTX 3090: 52 / 53 / 61 · RX 6950 XT: N/A
-AY
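For anyone who wants relative uplift rather than raw FPS, the corrected Cyberpunk 2077 averages above work out roughly like this (a quick sketch; the dictionary keys are informal shorthand, not official product names):

```python
# Average FPS from the corrected Cyberpunk 2077 numbers posted in the pinned comment.
avg_fps = {
    "no_rt_no_dlss": {"4090": 81, "3090ti": 56, "3090": 50, "6950xt": 46},
    "rt_no_dlss":    {"4090": 44, "3090ti": 26, "3090": 23, "6950xt": 13},
    "rt_dlss":       {"4090": 108, "3090ti": 67, "3090": 61},  # RX 6950 XT: N/A
}

for scenario, cards in avg_fps.items():
    uplift = cards["4090"] / cards["3090ti"]
    print(f"{scenario}: 4090 is {uplift:.2f}x the 3090 Ti")
```

So on these averages the 4090 lands around 1.45x the 3090 Ti in pure raster and roughly 1.6-1.7x with ray tracing on.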
@BayareaXotics · 1 year ago
Heyyy daddy Anthony
@zimnium1 · 1 year ago
Hey, 5 minutes early to the comment :) And hey Anthony👋
@ProjectSmithTech · 1 year ago
I knew it, Cyberpunk was way too high. Thank you for the correction.
@Keemochi420 · 1 year ago
Guess I'm staying with my RTX 3080 then; I'd wait for the 50 series
@moochoopr9551 · 1 year ago
Needs to be pinned
@danielmaurel3195 · 1 year ago
It's impressive how a $600 GPU is a bargain now... we need serious competition in the GPU market
@HerroYuy246 · 1 year ago
INTEL
@Laesx · 1 year ago
I got a 3080 Ti FE for that price and it truly was a bargain
@JoeWayne84 · 1 year ago
This comment was 2 minutes after the video was released
@baseplate_462 · 1 year ago
People who work on a budget, or anyone who isn't in an industry or job that might require the latest and greatest, should not be foaming at the mouth to buy a new GPU. 2060s are available for sub-$300 on Amazon and 3060s regularly at $400. I sit here doing game development, Blender work, playing my favorite games, streaming my favorite content, all on a $270 2060. I'm generally baffled at the people who look at GPUs that SHOULD cost $1k+ and go "it's not fair I can't buy that" when 2060s and 3060s exist. It's like looking at your 2018 Ford Focus and going "I can't believe they are selling Lambos at that price, now I'm never gonna be able to get one." AS IF YOU NEED A LAMBO TO GET TO YOUR JOB AT BURGER KING. Your Ford Focus is reliable, and when it breaks down, unlike a Lambo, it isn't the end of the world.
@samson_the_great · 1 year ago
$600 is what a responsible adult should be making in 3 days. If you can't afford a GPU you shouldn't even be on RU-vid right now.
@Neoxon619 · 1 year ago
As powerful as this is, I honestly can't justify the price. Not to mention the cooling & space requirements for the 4090. And the fact that the 4000 series doesn't have DisplayPort 2.0 is legitimately baffling.
@Zapdos0145 · 1 year ago
Wait, are you serious, it DOESN'T have DP 2.0?
@2ndcitysaint52 · 1 year ago
Huh?? It's way easier to justify this price vs the 3090 or 3090 Ti. I'm sorry, but if you are someone who always buys the bleeding-edge GPUs, this is the first time it doesn't feel terrible to upgrade. Yeah, the DP 2.0 thing is shitty, but the uplifts at 4K are pretty nice here.
@fortwentiblazeit4177 · 1 year ago
Isn't this video posted like 8 minutes ago? How are you able to finish the whole 16-minute video in 4 minutes :o
@Neoxon619 · 1 year ago
@@Zapdos0145 Nope, it's still 1.4.
@xXRealXx · 1 year ago
@@fortwentiblazeit4177 by watching at 2x speed or higher
@kabiro2151 · 1 year ago
I have never wanted Intel to succeed so bad.
@thatdogeguy9108 · 1 year ago
Or AMD
@euko6876 · 1 year ago
Lmao same. It doesn't feel right 😄
@dennismwangangi · 1 year ago
Out of context: I tried displaying on 2-3 screens using an Nvidia GPU, and it brought stability issues that could result in BSODs. Then I bought USB display adapters (Intel-driven) and they were stable; no issues arose.
@euko6876 · 1 year ago
@@dennismwangangi Doesn't sound like you ultimately fixed or figured out what was wrong. Just used an alternative method to use the GPU... but okay 👍🏽
@f2pgpb993 · 1 year ago
@@thatdogeguy9108 AMD is sadly on the same pricey path as Nvidia. They wouldn't be if their old policy from the 4870 era were still on course; I remember at the time they had a policy of making PC video cards cheaper.
@Yomismo28 · 1 year ago
I remember buying a 1060 for around 300€ and my parents mocking me for spending so much on it. The look on my mother's face when I told her about these GPU prices was priceless
@Zach014G · 1 year ago
I bet she took back that statement now LOL
@AdrenResi · 1 year ago
Almost the same story here
@vuri3798 · 1 year ago
The first GPU I bought was 11 years ago, a GTX 560 for 200 bucks. BF3 and every other game ran at ultra with over 60fps (120Hz was just starting to be a thing back then)... Now you have to spend double for a xx60 card so you can play current demanding games at medium/high settings. What a time to be alive.
@Argedis · 1 year ago
@@vuri3798 I bought my 1660 Super for $220. The days of the 60-series cards being the best 'budget' card are gone
@scaredycat8685 · 1 year ago
Boomers still think 18-year-olds can buy a house and live on their own straight out of high school... lol good luck
@yt_clazify · 1 year ago
It feels like the 3080 and 3090 came out yesterday
@KanecicqSTUDIOS · 1 year ago
I just got a 3060 Ti in May, cuz 3080 prices were still madness at the time
@garrettyates647 · 1 year ago
I think a very valid test would be an "office" or "bedroom" sized room with something like this. We measure ambient and output, but more airflow = moving more hot air = more hot air into the small room. If that room is not near a thermostat, you're gonna have a small sauna in your home office, or a freezer in your living room while trying to keep the office cool.
@thenoblehacker9111 · 1 year ago
Way too expensive, yet still extremely impressive and powerful
@PumpyGT · 1 year ago
Yeah it really is
@Iskandr314 · 1 year ago
@@PumpyGT test is not true...
@symphoricquoz3763 · 1 year ago
@@Iskandr314 Are you able to describe *why* the test isn't "true"? Because that seems like a strong claim.
@Butzemann123 · 1 year ago
@@Iskandr314 So a test for ONE GAME isn't 100% correct. So what?
@mr.darknight416 · 1 year ago
It's not impressive when the price is almost 2k. It would be impressive if it were at a reasonable price. They could make more powerful GPUs before too, but they were limited by price; people wouldn't even consider buying a $1k GPU, let alone $2k. Remember, in the 1080 era $800 was too much. Anyone can make the fastest GPU if they have an unlimited price and sell a GPU for 5k or 3k.
@GERMAN2FUCK2DOWN · 1 year ago
Being honest, the performance is impressive and it would be a dream to game with that card. But the price tag and power consumption in today's market just make me want to travel back in time
@picoplanetdev · 1 year ago
Couldn't agree more. I got a good deal on a 3060 Ti and I don't think you could justify much more than that.
@arnox4554 · 1 year ago
And the size... What the fuck?? Even the size of the vanilla 3090 was absolutely absurd.
@Darthquackius · 1 year ago
I can't even get an ATX 3.0 power supply till December! How are we going to run these things!!
@SSoul0 · 1 year ago
@@Darthquackius Off your standard PSU...
@whiteXIchigo · 1 year ago
The power consumption is not even the problem; what LTT didn't check is what the FPS/power-consumption curve looks like. It seems you can easily save well over 100 W while losing only a few percent in performance, 2-5%
@thwind · 1 year ago
Surprising differences in some results between reviewers. For example, Jayz2cents had Cyberpunk 4K (ultra preset, RT/DLSS off) averaging only 76fps compared to LTT's 136fps. I wonder what could make such a huge difference?
@elio564 · 1 year ago
I wonder the same thing. Vote this up lads so LTT sees this
@soaringspoon · 1 year ago
They fucked it up, DLSS was on.
@tiestofalljays · 1 year ago
@@soaringspoon LTT Labs getting off to a good start I see.
@ETin6666 · 1 year ago
Jayz's is the correct one. Hardware Unboxed got 83fps in 4K high, DLSS off.
@nasmeskartz9149 · 1 year ago
@@rustler08 Don't think so, since they got charts for DLSS both off and on.
@IntergalacticViking · 1 year ago
Would love to see some machine learning benchmarks in the productivity section!
@lilPOPjim · 1 year ago
It would have been interesting to see power draw at the same FPS, to see how efficient the card is generating the same media
@whasian1487 · 1 year ago
The enlarged GPU images in youtube thumbnails are usually clickbait, but the 4090 may actually need to be downsized for thumbnails.
@mathsam7103 · 1 year ago
For years we've been so focused on the pinnacle of gaming PCs that Nvidia's insane pricing is forcing us to look back to practicality. Maybe Intel has a point.
@MrPaxio · 1 year ago
More competition came out; competition showed Nvidia that you can charge similar prices for a crappier product, so their prices adjusted to make sense in the market space. Not very surprising, it's what the people wanted, apparently
@khalilahd. · 1 year ago
lol true 😅
@nathanjokeley4102 · 1 year ago
The people that buy these kinds of cards are such a tiny market.
@chsi5420 · 1 year ago
The real money is in selling budget cards, which is why Intel hopped into the pool with their Arc cards. Nvidia is moving further in an enthusiast/professional PC direction. It's like a Toyota vs a Lamborghini: one works well for most, but the other is most desirable.
@info0 · 1 year ago
@@nathanjokeley4102 RTX 4090 cards are aimed at hardcore, high-end gaming enthusiasts who demand the best there is. They don't care about prices. I belonged to that group for a long time, but times change, priorities in life change, so I dropped out of the race.
@bunsenn5064 · 1 year ago
I remember all the hype that came with the 30-series cards, especially the 3090, getting released. There was none of that with the 40 series. Back when the 20 and 30 series came out, my friends and I would talk about them all the time. No one said anything when the 4090 came out. It was so far out of reach that we just didn't care.
@Miyano_Shiho4869 · 1 year ago
Ikr. It's now some kind of fantasy that we nod to and say "cool" and go on with our day, knowing we can never get our hands near it
@stevieC11Hanworth · 1 year ago
Just wait 5 years and get one
@admistyt · 1 year ago
@@stevieC11Hanworth in 5 years something newer will be out
@dronred8817 · 1 year ago
You too do not casually carry 1800 EUR in your pocket? What a coincidence))). I think not....
@Larimuss · 1 year ago
$2000 USD in Australia. I could spend the money on it, but I just can't justify it. No way in hell. It would cost more than my entire system, with 9 fans, a $200 case, water cooling, etc. They're just ripping off loyal customers now and I hope they lose sales
@ThorDyrden · 1 year ago
Video encoding: for streaming, H.26x and AV1 are important/the future... but for video editing you often have to cope with ProRes RAW. (Apple) ProRes is a codec supported by Apple's SoCs, of course, but I have never seen it implemented on GPUs outside Apple, though you find it on a lot of cameras (Sony, Fujifilm,...). Is Apple prohibiting alternative hardware encoders for "their" codec, or am I missing some other reason?
@dycedargselderbrother5353 · 1 year ago
The legal battles between Apple and RED surely scared off would-be implementors.
@mattb6646 · 1 year ago
They didn't put DisplayPort 2.0 on the 4090 because they don't want these cards being used 5 years from now... so you'll have to buy the next-gen GPU with the 2.0 port. It would be like making a car that lasts for 30 years; they want you to come back and buy another
@Sad_King_Billy · 1 year ago
Apple set that standard, now Nvidia wants a piece of the pie.
@512TheWolf512 · 1 year ago
@@Sad_King_Billy not Apple. iSheep did.
@user-dm8ic8lj5z · 1 year ago
AND PCIe 5.0
@BadTomzi · 1 year ago
Holy shit
@ShogoKawada123 · 1 year ago
Used in five years with what, though? The card has HDMI 2.1a, which can do 4K/120 with no Display Stream Compression
@zotac1018 · 1 year ago
Though pricing for these cards is beyond crazy, this is the first card where turning on ray tracing finally makes sense (when I buy one at a discount some years later).
@esatd34 · 1 year ago
Around 7 years, maybe.
@lunatik6168 · 1 year ago
At least after 5 years of my younger brother mumbling about DLSS and ray tracing superiority, he might even use these features for the first time 😆 (nvm, he said he's going AMD after release 🙃)
@lejoshmont2093 · 1 year ago
In another 3 generations it will probably perform decently against a 50-series, which means a couple of generations after that you could expect pretty wide adoption, when those cards are hitting the used market.
@gstylez0107 · 1 year ago
@@esatd34 Seven years? ...You're probably just being facetious, but it definitely won't take that long. In just three years, the 4090 will practically be chopped liver. The technology moves exponentially fast. Moore's law isn't dead, Nvidia is just full of shit.
@squirrelsinjacket1804 · 1 year ago
@@lejoshmont2093 I'm waiting to upgrade till the 50 series... the next 'big thing' is going to be path tracing in games, and maybe by then the cards will be powerful enough to support it better for non-tech-demo use.
@razorsz195 · 1 year ago
Kombustor is more of a power-delivery stress test, so it explains the behaviour of the cards. It's good for burning in power delivery and making sure it's up to the task, but another benchmark program would do better for stressing the core and memory.
@jarodwingert5517 · 1 year ago
Honestly thinking about one for gaming. Our house's breaker trips if I run my A/C and a 3080 while other rooms have A/C as well. If I frame-cap, I can lower power usage without letting go of any performance.
@ironhammer500 · 1 year ago
I bet they are saving DisplayPort 2.0 for the Ti series, just so they can bring them out half a year later at around 50% more cost and justify the price hike by just adding a DisplayPort and PCIe Gen 5.
@gmiblessed · 1 year ago
Would more likely be the "Super" refresh next year or the year after. A rehash of the Turing product strategy.
@mortiarty7842 · 1 year ago
That's most likely what they're doing
@theChramoX · 1 year ago
DP 2.0 in Ti cards at 25% more price, Super at 25% more price and PCIe Gen 5. Don't @ me.
@Scarlet_Soul · 1 year ago
The lack of DisplayPort 2.0 really is baffling
@Clawthorne · 1 year ago
They're probably reserving that for their Quadro cards, so that companies who need higher refresh rates or resolutions are forced to pay 5x more. It's the Nvidia way!™
@AOTanoos22 · 1 year ago
@@Clawthorne Nope, the RTX 6000 (Ada) also only has DisplayPort 1.4, and that's a $5000+ GPU btw
@spaceduck413 · 1 year ago
I feel like if RDNA3 is able to drive 144fps at 4K, Nvidia are really going to be kicking themselves, to the point where we might even see a revision 2 or something like that. They've made it so that AMD doesn't even have to *match* their theoretical performance in order to outperform them in the real world. If your card can't drive more than 120Hz due to bandwidth limitations, does it really matter how many "more" frames you get?
@Nightengale537 · 1 year ago
@@spaceduck413 Honestly I couldn't care less about frame counts. I really just like being able to play games at consistent frame rates, which is why I like 30 and 60 fps, no higher, no less. But honestly GPUs are absurdly priced, which is why I don't even touch the RTX series and stick with my GTX 1060
@J-Rizzler · 1 year ago
@@Nightengale537 respect, imma buy a 1050 to LP
@WayStedYou · 1 year ago
14:03 the Arc cards were waiting in a warehouse for nearly a year and still have DP 2.0
@GhaziCHAIEB · 1 year ago
That CS:GO bug could be a ReBAR bug. It would explain why it only appears at 4K, and Resizable BAR support has been buggy, especially with games not built for it. I'd have to do my own testing to figure that out, however.
@thelaitas · 1 year ago
I'm just hoping that AMD won't disappoint us
@aaditya4619 · 1 year ago
Bro, AMD is just a synonym for disappointment
@Zapdos0145 · 1 year ago
The only way they could is if they increase prices by a stupid amount. Or capacitor problems... the expectations are that low.
@raawesome3851 · 1 year ago
@@aaditya4619 Not really. Their CPUs are good, GPUs are fine, at least at the high end.
@EldenLord. · 1 year ago
They won't reach the 4090
@aboveaveragebayleaf9216 · 1 year ago
How so? They had a pretty competitive lineup of GPUs depending on your specific needs/desires.
@SgtRamen69 · 1 year ago
Honestly I find the lack of DisplayPort 2.0 more concerning than the price, since you at least get some insane performance for it, but not being able to take full advantage of it for high-refresh-rate 4K gaming is pretty stupid lol
@Fizz-Pop · 1 year ago
I wonder if the 4090 Ti will have DisplayPort 2.0 when it goes on sale...🤔
@STiStein · 1 year ago
Seriously, hearing that makes me want to not buy one.
@prich0382 · 1 year ago
The 3000 series was meant to have it; Nvidia trying to cheap out as much as possible
@hotlocalbabes · 1 year ago
It's got an HDMI 2.1a port exactly for that reason, my guy.
@bossofthisgym3945 · 1 year ago
@@hotlocalbabes HDMI 2.1a doesn't support more than 4K 120Hz either, so what are you trying to say, "my guy"?
@eplugplay8409 · 1 year ago
I fit the 4090 FE into my NR200P, and with good airflow and fan configuration it runs really cool.
@xirtus · 1 year ago
I expect a sub-10-litre small-form-factor case with amazing cooling for DUAL 4090s by the end of the year.
@nazaryn · 1 year ago
Anthony is such a great host: clear, concise, covered all the bases, mentioned the case average temperatures, testing conditions, ambient air temperature, etc.
@xenox8553 · 1 year ago
Almost like they have writers.. or whatever they are called...
@bigfoot3322 · 1 year ago
Yeah, I'm always intrigued when a cane toad hosts a show.
@Wes_Trippy4life · 1 year ago
@@bigfoot3322 😭💀💀💀💀
@TinkyTheCat · 1 year ago
@@xenox8553 Anthony is listed as both the episode's host and writer on the end slate, though that doesn't rule out others helping him. At any rate, while these videos are collaborative efforts, it's perfectly normal to have a favorite host(s), as they each have their own delivery.
@ypsilondaone · 1 year ago
and delivering wrong info
@PhobiaSoft · 1 year ago
At this rate, I legitimately think my next GPU is going to be made by Intel. What a fascinating turnaround for them.
@velqt · 1 year ago
Intel GPUs are a synonym for hot garbage
@nadie9058 · 1 year ago
@@velqt Right now they are; maybe in a couple of generations they can actually compete.
@velqt · 1 year ago
@@nadie9058 Intel will cry and give up before that happens. They're already in damage control
@oisiaa · 1 year ago
Yes. Everyone needs to go Intel to send a message about pricing. Let Intel undercut the market.
@velqt · 1 year ago
@@oisiaa Ah, you want to pay for hot garbage so Intel prices even higher when they realize what people will pay for trash?
@kasperdahlin6675 · 1 year ago
Notice the Minecraft "uoh!" sound when he says cooling at 10:48 😂😂😂😂
@IVMRGREENXX · 1 year ago
0:27 "there are some other problems"... card disintegrates
@falrexion7709 · 1 year ago
I am so disappointed we haven't seen a set of cards with a good balance of power use to performance since the 10 series
@josdebosduif5 · 1 year ago
Check DerBauer's review; he plays around with the power target, and that shows some impressive efficiency results!
@MHWGamer · 1 year ago
Do what the others said: 70% power target => 300 W and only -5% in FPS. For a 300 W card with this performance, it is bonkers.
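Taking the figures quoted above at face value (stock ~450 W, a 70% power target giving ~300 W, roughly 5% FPS loss — illustrative numbers from the comment, not measurements), the perf-per-watt gain works out like this:

```python
# Hypothetical perf-per-watt comparison based on the figures quoted above.
stock_watts, stock_fps = 450.0, 100.0    # stock power target, normalized FPS
capped_watts, capped_fps = 300.0, 95.0   # ~70% power target, ~5% FPS loss

stock_eff = stock_fps / stock_watts      # FPS per watt at stock
capped_eff = capped_fps / capped_watts   # FPS per watt when power-capped

gain = capped_eff / stock_eff - 1
print(f"Efficiency gain at the lower power target: {gain:.0%}")
```

On those assumptions the capped card delivers roughly 40%+ more frames per watt, which is why reviewers who tuned the power target found the card so efficient.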
@danebeee1 · 1 year ago
@@josdebosduif5 This is exactly what I was gonna say. The power draw on this card is actually insanely good compared to everything else out there, including Nvidia's 3090 and 3090 Ti.
@bojinglebells · 1 year ago
It's competition. The 20 series might not have been as much of an improvement because Nvidia ballooned the size of their chips in order to fit in all the extra RT/Tensor stuff, but the 30 series returned more to normalcy. However, the actual threat of competition from AMD led them to be as aggressive as possible with performance, with less regard for how much power they are targeting per tier. Power/performance has never really gotten worse since the 10 series; you just have to be willing to apply your own limits to the cards. Take a 2080, tune it to draw no more power than a 1080, and it will still be plenty faster. The 3080, even more so. What will be really interesting to see is the 12GB "4080", which has a die smaller than the 1080's (and barely larger than a 3050/60's) but has more transistors than a 3090 Ti. It's just a major shame they're trying to fleece us for it with that absurd $900 MSRP.
@Lothyde · 1 year ago
All of this hardware (both CPUs and GPUs) is already extremely efficient; manufacturers just crank the hardware to its limit, and the power draw increases steeply close to the limit.
@shadowlemon69 · 1 year ago
The RTX 4090's raster performance, and with DLSS, is pretty impressive, but for a whopping $1599 I could buy a whole PC, and with the 4090's 450 W+ TDP, even that $1600 PC could consume less or the same power as the RTX 4090 would
@mudgie0205 · 1 year ago
Well that's just, like, your opinion, man...
@vongdong10 · 1 year ago
Yeah, but will that whole PC have the same performance?
@TropicalCyc · 1 year ago
@@vongdong10 yes :)
@surfalcatraz9770 · 1 year ago
@paradox I bought it. I will legit have a heart attack if AMD has a better card for a lower cost, because the 4090 was hella fucking expensive AND I had to stretch my budget A LOT.
@watawatan0w · 1 year ago
But what's gonna drive it, kid? A 1060?
@thanutz · 1 year ago
Started a new build and contemplated this or the 4080; I was able to get the 4090 at retail, which is still sky high, but I think it will come in handy for rendering, stacking/editing deep-space photos, and gaming
@WolfetonePickups · 1 year ago
How about a GPU comparison for 3D-scanning objects using scanners such as the Einstar? Seriously wondering how the 4090 will perform with one of those, in the 16 and 24GB variants.
@MarkLoganFIB · 1 year ago
This is reminding me of when 1080p was really demanding. Finally we're going to get cards running 4K like it's not even 4K
@momomimi8957 · 1 year ago
1080p now doesn't even make the honorable-mention section of the benchmarks.
@lejoshmont2093 · 1 year ago
I remember people gaming at 4K years ago, although at low frame rates.
@bradhaines3142 · 1 year ago
The 30 series was genuinely capable of 4K. At the time, the 10 series could run almost any game at 4K too (around 30 fps)
@MrEpic-97 · 1 year ago
I gamed at 4K60 with a 1070... just don't crank everything to ultra. Set to medium-high and most games ran at 60fps
@MrBenenator · 1 year ago
@@momomimi8957 You made my HP ZR24w cry. You monster. /lh /j
@darklyspaced · 1 year ago
I have to be honest, the 40 series really isn't worth it at the moment, especially with the recent reduction in the price of 30-series GPUs.
@manjindersinghsaini911 · 1 year ago
Not just the 30 series, but the RX 6900 XT is going around for 699!!! For its power consumption, this is a no-brainer
@propersod2390 · 1 year ago
@@manjindersinghsaini911 The 6900 XT is a joke, bro, imagine buying that 💀💀 The 6950 XT had half the performance of the 3090 in most games. Even worse in productivity. What a joke
@researchandbuild1751 · 1 year ago
It's literally 2 times faster for productivity use cases. 2 times! Yes, that is worth it. Time = money
@vidyamancer7135 · 1 year ago
@@propersod2390 LOL, pulling numbers out of your ass.
@harrylesueur · 1 year ago
@@propersod2390 Not sure what games you've been playing... (*Psst*, how much is Nvidia paying you for this? I want in)
@gabe20244 · 1 year ago
I'm a simple man: I see Anthony, I click the video. I know it gets shot down all the time, but I would love to see a Linux Tech Tips with Anthony if he's willing to do it.
@CenobiteBeldar · 1 year ago
I wish you guys could do a video on the laptop version of the 4090, since articles say it shouldn't even be called that, considering it's not comparable to its desktop version.
@blaze8897 · 1 year ago
I swear, Nvidia just radiates so much utter contempt for their customers, partners, and worldwide power grids that it's actually giving Apple a run for their money.
@appixx · 1 year ago
Facts. I really hope AMD can pull through this year, because this is not okay
@lasthopelost9090 · 1 year ago
How about dealing with crypto before we start complaining about the power grids
@IsraelWokoh · 1 year ago
Nintendo: "Hold my lawsuit."
@robburgundy9539 · 1 year ago
They are the Apple of PC parts. Nvidia is a lifestyle at this point; they succeeded.
@wanderingwobb6300 · 1 year ago
It's hilarious since Apple hates Nvidia too. They're a match made in hell.
@catalyst429 · 1 year ago
The DisplayPort 2.0 thing is insane. With a card at that price point you want it to be relevant for as long as possible, and it's starting off a generation behind. What a joke.
@Somber.Killer · 1 year ago
It's all on purpose. People who are gonna purchase this card will buy the next 4090 Ti or 5090 with DP 2.0 without hesitation, giving Nvidia their money hand over fist.
@ShogoKawada123 · 1 year ago
No company has even announced a DP 2.0 display though, AFAIK
@ShogoKawada123 · 1 year ago
@@primegamer321 When? AFAIK there literally isn't a single model available for sale currently; most of the really good displays just have HDMI 2.1 and DP 1.4a
@frankwoolrich · 1 year ago
Great to see Anthony back. I was thinking only the other day that I hadn't seen him in front of the camera; he is so natural and explains things so well.
@jesusbarrera6916 · 1 year ago
The most incredible thing about this review is the fact that it took you hours to correct your mess-up and the video is still up.....
@kauahatu · 1 year ago
Could you guys test whether or not you can heat your room with one of those? Maybe that is the solution to the rising energy prices.
@pvpdm · 1 year ago
If you can afford one of those, you can afford the high energy prices
@crashtestdummy9985 · 1 year ago
I mean, that thing looks like it could shit out a way bigger power bill than your actual HVAC system, honestly.
@DiamondTear · 1 year ago
Around here we're heating a whole city with a supercomputer...
@Bruno_Laion · 1 year ago
@@pvpdm pretty sure it was a joke
@Maxtraxv3 · 1 year ago
Bruh, I can heat my room with an RTX 2070 Super
@Dash20 · 1 year ago
What was the length of your DaVinci Resolve timeline? I'd love to get a handle on render speed in FPS.
@maaninou
@maaninou Год назад
For counter strike I think it’s some sort of memory heat/throttle or bandwidth saturation, try to limit fps to 450 you may see better results. And maybe more test with newer drivers /better cpus..
@JimmyMcThiccus
@JimmyMcThiccus Год назад
It was good having the second fastest gpu for 3 days. But I did get the 3090 for 800 bucks. I’m also upgrading to a 5800x3d today up from the 3600.
@themightypizzadevourer6018
@themightypizzadevourer6018 Год назад
Hello brother
@simonic2063
@simonic2063 Год назад
I went with a sim racing 3080 for $400. Unless I absolutely need the fps -- I have a hard time justifying >$500. Things like this definitely keep the console market alive.
@idwithheld5213
@idwithheld5213 Год назад
I would 100% wait for the 13900k before changing platforms/cpu's.
@Krydolph
@Krydolph Год назад
@@idwithheld5213 the 5800 X3D is a dropin replacement for his 3600, no new motherboard or anything else needed, and it is one of the absolute best gaming CPUs, maybe the 13900k can beat it on paper, but there is no smarter or better upgrade path for him right now, unless money means nothing, and for you giving "advice" ofc. someone elses money means nothing to you!
@jonsnow2555
@jonsnow2555 Год назад
@William B time to buy online from u.s.
@quanicle101
@quanicle101 Год назад
the fact is that you never really NEED the fastest gpu to play new games at the highest settings. you can go a few rungs down (even a few generations) and still do just fine. after a certain point the added performance isn’t enough to justify the price
@kiloneie (a year ago)
I'm sure you're talking about 1080p here, which can run high/very-high settings at 60 FPS on bloody any GPU. The moment you go to 1440p the requirements jump quite a bit, and when you add a 144 Hz monitor they jump even more, way more than the resolution increase alone. That's where I am: my 1070 Ti cannot run anything at 144 FPS. It could barely run Fortnite at very high when I bought it, but sure, that's just one game; in PUBG I cannot get good quality and over 100 FPS at once, it's a trade-off, same with Warzone, etc.

Then you have people (I will be one of them one day) who are at 4K, some not even targeting 4K 60 FPS but more, and NOTHING but this 4090 can run 4K above 100 FPS. I am not advocating for such absurd prices or for the 4090, but your statement is wildly incorrect. When someone plays at 1440p for a while, they don't want to go back to 1080p, same with 144 Hz vs 60 FPS, and just reaching 1440p 144 FPS is bloody absurd, and that isn't even that new or premium; 4K and refresh rates in the hundreds are the niche. Yes, a lot of people play at 1080p 60 FPS, but a lot of them have never even tried anything higher.
@iawindowss4061 (a year ago)
That's how I have been feeling too. I'm just not that interested or excited for the next generation the same way I was with the 6 series or 7 series.
@mattshaw4016 (a year ago)
I think it only matters to esports gamers who need max frame rates and enthusiasts who like maxing all the settings. I honestly turn off a lot of settings in every game, which I believe add bloat and a tacky look 🤷. But it would definitely be nice if you had a lot of money 👍
@felcas (a year ago)
You don't need a Ferrari to get from point A to point B either. But it is nice to drive a Ferrari 😄
@alexisrivera200xable (a year ago)
Correct. In most games the jump from 1440p to 4K is not that noticeable, same between the high and ultra presets. We literally dump electricity for very diminishing returns that most people can't even tell apart (but swear up and down that they can, which drives a lot of toxic elitism).

The chase for the ultimate hardware has gotten real mindless at this point, with bragging rights the only metric that matters. All to play the same single-player games that thrive at 60 FPS, and the sweatfest online games like Apex Legends and COD Warzone that look the same and are so overwhelmingly infested with cheaters that an investment in the fastest GPU money can buy, at the grossly inflated margins Nvidia asks for, is completely unjustifiable.
@lasarith2 (a year ago)
I wonder what the performance (and cost) of the older cards would be on the new process node (smaller transistors).
@cjpartridge (a year ago)
Unbelievable that this only has DisplayPort 1.4; classic Nvidia move.
@Secret_Takodachi (a year ago)
The most impressive GPU I've seen in the last 12 months is the one Valve used in their Steam Deck, the performance that thing achieves is impressive!
@CaptainScorpio24 (a year ago)
Which one?
@austinnafziger4159 (a year ago)
@@CaptainScorpio24 It's an integrated RDNA 2 based GPU with 8 CUs.
@CaptainScorpio24 (a year ago)
@@austinnafziger4159 ohhh ok
@esprit101 (a year ago)
Add the 40 Hz mode to that. It's no 60, but it feels really good compared to 30 Hz. I've played Valheim, Satisfactory and Cyberpunk; all felt really good for such a lightweight device.
@propersod2390 (a year ago)
4090 probs has about 10x more performance. So impressive 🤨🥱
@Kokorogamer1 (a year ago)
They did the same thing when they didn't support HDMI 2.1 on the 20-series cards, which I missed when I bought the LG C9, to the point of thinking I should upgrade to the 30 series. So yeah, I'm not going to fall for the same mistake again; future-proofing is a must, so I will wait and see what AMD is going to offer. It feels like they did it so they can add it on the next generation to justify upgrading, not to save a buck.
@GregAtlas (a year ago)
The PCIe Gen 4 thing doesn't bother me, but not supporting DisplayPort 2.0 does, considering there are already monitors that support it.
@MagusGod (a year ago)
Great info here. I think I'll just skip the 4090 since I got the 3090. Gotta see what the 5000 series will look like. Thanks.
@szaszm_ (a year ago)
The labs should test slightly undervolted and underclocked performance, like what can you get out of it for 350W? What about 300W? And about the SFF comment: that card in itself is larger than my SFF build. 😄
@smaxfpv1337 (a year ago)
You should check the Computerbase (German) test for that. It's super impressive and I'm honestly disappointed LTT didn't mention anything about it: the 4090 is VERY efficient, losing less than 5% on average at a 350 W cap and still staying below double digits at 300 W. For me that's the most impressive point, and they're not talking about it.
@szaszm_ (a year ago)
@@smaxfpv1337 Nice, thanks for mentioning the article. I'm impressed that they managed to achieve this without undervolting. The card looks like a much better value knowing this, although the lack of DP2.0 is still a bummer.
@superneenjaa718 (a year ago)
@@smaxfpv1337 That makes no sense. Nvidia would have released 350 W cards if it only lost 5%. Those guys probably tested slightly unstable settings.
@holladiewal6812 (a year ago)
@@superneenjaa718 der8auer made a similar video and had negligible performance loss when adjusting the power target down to 60%, and he was able to play several games and run through the Time Spy Extreme benchmark successfully. While that is still no perfect long-term stability test, it seems to be stable for now.
@superneenjaa718 (a year ago)
@@holladiewal6812 I think it's like Zen 4 undervolting. Hardware Unboxed found that the highest (30) undervolt was stable for everything like Cinebench or games, but Blender crashed for anything above 5. The manufacturers know the specs perfectly and set power and voltage targets accordingly for stability and longevity.
@Night_Hawk_475 (a year ago)
Edit: Anthony has posted in the comments about this issue and confirmed it wasn't DLSS but a different setting that wasn't applying correctly; new numbers, closer to what we see in other games, were shared. Still a big uplift from the prior generation though :) I'm curious now how the other 40xx cards, which are closer to the current 30xx price points, will compare to their past generation's numbers. The rest of my comment can be disregarded in light of the corrections from Anthony/LTT.

There was another reviewer who mentioned that Cyberpunk seemed to have issues with changing settings, and it took several attempts of changing the setting and restarting the game to get it to run with DLSS off on the new Nvidia cards. DLSS being the thing that "creates fake frames", it severely affects framerate measurements if it's not actually turning off. I'm not saying that has to have happened here, but it wouldn't surprise me. I would love if you could double-check it for us :c
@SweetJesus16 (a year ago)
Clearly it happened, and they probably know, but instead of taking the video down they continue to spread misinformation.
@kopilovicd (a year ago)
That has to be what happened. The performance in Cyberpunk shown here without DLSS is not consistent with other reviewers.
@mrlacksoriginality4877 (a year ago)
They said at the start that they were using older drivers because the new drivers were faulty. That could be the difference; they also didn't use TAA in this video like DF did. They showed DLSS 2.x on and off in this video, and with ray tracing and DLSS on, performance was higher than with DLSS off and ray tracing off.
@Night_Hawk_475 (a year ago)
@@mrlacksoriginality4877 The charts said DLSS off in both versions (and he clearly says "without DLSS" @5:35). Only RTX was on in the second graph, not DLSS. But if DLSS had secretly been on in both (not even intentionally, just because the drivers/Cyberpunk have a known issue with leaving it on even though the settings say it's off), it would inflate the framerates drastically and invalidate the test results. If there had been a separate measurement claiming DLSS was on it would be helpful, but there were none in this video, so I'm not sure what you're referring to when you say you saw a DLSS-on comparison in Cyberpunk from this channel.

TL;DR: DLSS 3.0 (not 2.0) uses an AI to "guess" what frames should look like and inserts a "fake" frame between each pair of "real" frames. It does this at the cost of measurable added latency and a risk of visual artifacting / loss of clarity (as the reviewers noted in this video). I suspect the average player may prefer DLSS 2.0, or DLSS off altogether. Regardless, if a game (like Cyberpunk) accidentally ran with DLSS 3.0 enabled even though the reviewers thought it was off, that game could roughly double its FPS, and next to the other games that weren't lying about DLSS it would show a much more massive generational jump. Which is exactly what we see here: if you halve the 40-series Cyberpunk numbers, it's still a big increase, but much closer to what the other benchmarked games show.
@mrlacksoriginality4877 (a year ago)
@@Night_Hawk_475 They just posted about the issue. It didn't have DLSS on, but FidelityFX upscaling was on because of some technical issue. At the 6-minute mark you can see DLSS on, BTW.
@tontechnick9363 (a year ago)
Great video, I really enjoyed it. However, for some odd reason there was a clicking noise starting around 8 minutes in and ending around 10 minutes in. It sounded like a clock and was off-putting enough that I almost stopped watching, which is sad because the content and how Anthony presented it were absolutely on point and very interesting.
@iachimotdk1056 (10 months ago)
Seriously, this video is still up saying that the 4090 is 60% faster than the 3090 Ti? Come on, LMG. This is a very bad look.
@datcheesecakeboi6745 (10 months ago)
"We will remove the videos," yet all the worst offenders stay up.
@Zapdos0145 (a year ago)
Can you believe we would rather have Intel enter this market to bring competition and theoretically help "fix" this problem than watch Nvidia continue down this path? It's actually amazing... and I'm here for it. I hope AMD's and Intel's drivers come to play ball.
@ArtisChronicles (a year ago)
A race with no end in sight
@Zapdos0145 (a year ago)
@@RyTrapp0 I meant AMD (themselves) and Intel's drivers as two separate things; I should've clarified.
@Mr.Morden (a year ago)
That right there is why Nvidia wasn't permitted by regulating agencies to acquire ARM. Jensen demands total ownership and control; anything less is inadequate for him. He is just like Steve Jobs.
@cookiewriter4001 (a year ago)
If the RDNA3 x800XT can stay at or below $800 and delivers comparable results, it will be the new high-end king. The 40 series lacks competitive pricing; $700 was right at the edge for most gamers. I had friends who bought the 3080 with money saved up over the summer. I can't see that happening with a $1,200 card in the same class just one generation later.
@LetrixAR (a year ago)
If this card wasn't a 4090 but a Titan or for workstations, almost no one would complain about the price.
@tag206 (a year ago)
Now they can save up $900 and get a 70-class card.
@CptVein (a year ago)
How is AMD supposed to match this performance at half the price?
@GlorifiedGremlin (a year ago)
@@LetrixAR But.. it's not tho. So why bring it up lol
@tag206 (a year ago)
@@CptVein Not match, but if they can give us even 90% of a 4090 for $800, they pretty much win this gen.
@TwistedEyes12 (a year ago)
I really appreciate showing off the benchmarks with DLSS OFF. I'm not against using DLSS personally, but my "goal" is always to play without it if possible.
@lasarousi (a year ago)
Game graphics have been in a stalemate for so long that these gimmicks seem more worrying than exciting. Anything above 1440p is just unnecessary unless your display is over 70 inches.
@hopey1809 (a year ago)
why tho?
@taleg1 (a year ago)
From the testing I have noticed, I have drawn one conclusion: it's too early to buy, as the drivers are not mature. Sure, the 4090 is a very good card, but how good is something we won't see until the drivers get updated a few times. It will be fun to see a second test with the 4090, 4080, and (if it exists) 4070 tested alongside the new RDNA 3 cards around six months after release, when the drivers have been fixed, upgraded, and tuned. Testing at that time should give us all a good show of who has the best card to buy, the king of the junkyard, or... well, you get the idea. Right now the drivers are just out of their beta version and it will take time for them to mature and really show what the cards can do. I can't wait to see those tests around my birthday in May 2023, when the drivers have matured a bit.
@Reac2 (a year ago)
With prices ever rising, I think it's time that tech YouTube starts referring to GPUs not by their names but by the most useful thing you can buy for the same price. This Honda Prelude review is pretty good, for example.
@davidplesnik6428 (a year ago)
😆
@MongooseTacticool (a year ago)
"This card is X rent/mortgage payments worth."
@Najolve (a year ago)
I do my personal economics in terms of sandwiches. So to buy a 4090 I just have to starve for a decade.
@Standard.Candle (a year ago)
Thank you Anthony for finally being the voice of reason on the lack of DP 2.0. Maybe if this had been discussed earlier and there was more of a consensus among reviewers, we could have pushed back on this anti-consumer tactic. I'm sure Nvidia will be more than willing to sell me a 4090 Ti for $2,099 in 6 months that actually supports DP 2.0.
@Sal3600 (a year ago)
Keep crying lmaoo
@platinumjsi (a year ago)
Surprised he didn't mention DSC, which enables higher refresh rates/resolutions on DP 1.4 without the image-quality degradation of chroma subsampling.
@FAQUERETERMAX (a year ago)
Yeah, I'm skipping this generation. If I buy a bleeding edge graphics card I want a bleeding edge screen
@platinumjsi (a year ago)
@Alex Unltd. 160 Hz 10-bit HDR works fine with DSC.
@MistyKathrine (a year ago)
@@platinumjsi You can just use HDMI 2.1, which is probably what most people getting this card will be using, though I question why there is just one HDMI 2.1 port but three DP 1.4 ports.
@ambientnaturally (a year ago)
I'm using a 5700 XT for processing jobs that run for hours at a time. I think one of these would actually save energy by getting the same amount of work done faster, and it would also help me finish before rates go up at 4 PM.
@masterdammus (a year ago)
Can you put out the specs of the rig you run the tests on? Thanks.
@Emmo76 (a year ago)
Did I actually make it here on time for a LTT video?
@georgecy5937 (a year ago)
yes we did, almost I guess
@Bukki13 (a year ago)
yes
@k9chilly5 (a year ago)
OK, these test results are VASTLY different (referring to 4K, ultra, no DLSS, RT Ultra) from what some other reviewers are getting. Not slamming this video at all, just a reminder to watch other reviews; test benches definitely impact the results, it seems. Wild card, wild money. Will be waiting lolol

Edit: good grief, this is the most interaction I've ever had with a comment lol. But yeah, an almost 4x increase in that Cyberpunk performance when others have around a 40% increase (which is still bonkers) definitely raises an eyebrow. Gonna be an interesting WAN Show I guess lolol
@fjb666 (a year ago)
Don't forget random silicon quality differences.
@reloadingdontshoot1 (a year ago)
Wait for Jufes from Frame Chasers' results. He shows the in-game tests, not just BS graphs anyone could have thrown together.
@FOGoticus (a year ago)
Der8auer saw some really impressive numbers even when limiting the GPU to about 60% power draw.
@zshadows (a year ago)
Yeah, they are claiming about double in Cyberpunk 2077 what other reviewers got. Something is up.
@Magnus0891 (a year ago)
Check out Igor's Lab's test report... he tested the card with an AMD 7950X CPU (the fastest current CPU you can get) and the only CPU it should be tested with as of today (Intel's Raptor Lake 13900K has not released yet). The videos from Igor's Lab may be in German, but his tests also come with an English-translated 12-page article with all the graphs, benchmarks, etc. on his website. He is probably the most respected tester/reviewer in the industry; even Linus Tech Tips, JayzTwoCents, Gamers Nexus, der8auer, and other reviewers constantly refer to his testing and give him credit for his work.
@platinumjsi (a year ago)
Chroma subsampling is not the only option for high res/refresh/HDR on DP 1.4; Display Stream Compression is a thing and allows higher resolutions and refresh rates with virtually no impact on image quality. Surprised LTT/Anthony missed this, TBH.
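For context on the DP 1.4 / DSC back-and-forth in this thread, the bandwidth arithmetic can be sketched like this. It's a rough estimate that counts active pixels only (blanking intervals push the real requirement somewhat higher), and the link payload rates are the commonly cited figures for HBR3 (DP 1.4) and UHBR20 (DP 2.0):

```python
# Rough DisplayPort bandwidth check. Link payload rates: HBR3 (DP 1.4)
# ~25.92 Gbit/s after 8b/10b coding, UHBR20 (DP 2.0) ~77.37 Gbit/s.

def required_gbps(width: int, height: int, refresh_hz: int, bits_per_pixel: int) -> float:
    """Uncompressed video data rate in Gbit/s (active pixels only)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

DP14_HBR3 = 25.92    # Gbit/s usable payload
DP20_UHBR20 = 77.37  # Gbit/s usable payload

r144 = required_gbps(3840, 2160, 144, 30)  # 4K 144 Hz, 10-bit RGB
r240 = required_gbps(3840, 2160, 240, 30)  # 4K 240 Hz, 10-bit RGB

print(f"4K144 10-bit: {r144:.1f} Gbit/s, fits DP 1.4? {r144 <= DP14_HBR3}")
print(f"  ...with ~3:1 DSC: {r144 / 3:.1f} Gbit/s, fits? {r144 / 3 <= DP14_HBR3}")
print(f"4K240 10-bit: {r240:.1f} Gbit/s, fits DP 2.0? {r240 <= DP20_UHBR20}")
```

Even this optimistic estimate shows 4K 144 Hz 10-bit (~35.8 Gbit/s) exceeding DP 1.4's payload without DSC or chroma subsampling, while 4K 240 Hz needs DP 2.0-class links, which is the crux of the complaint in this thread.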
@jace_Henderson (a year ago)
I was really hoping we'd get something like the 3000 series, where it beat the previous gen's performance by a good bit for much less. I was expecting at most not too much more than the 3000 series' MSRPs, but now we're literally back up to RTX 2000 MSRP prices at best.
@decoder55killer (a year ago)
I remember when the flagship GPU 5-6 years ago (980 Ti ASUS ROG) was $650... no one can say this performance isn't amazing, but the price is just ridiculous. While other hardware manufacturers have maintained or even lowered prices for better performance, Nvidia has tripled them.
@neoasura (a year ago)
I paid $500 for a GeForce2 Ultra in 2000, which is close to $800 today, and it was outdated in less than a year. Kids today don't realize they had a good run of cheap components for the past decade; those days are over. Chinese workers don't want to be slaves anymore.
@big_matt3496 (a year ago)
Try getting 16,000 CUDA cores for $1,600 5-6 years ago.
@GeneralKenobi69420 (a year ago)
Now compare the cost per performance instead of cost alone. You'd be surprised by how much the 4090 utterly dunks on the 980Ti
@BlatentlyFakeName (a year ago)
They claim it's "inflation", but Nvidia is making even more obscene profits than ever, so it's clearly a con.
@zahidshabir4038 (a year ago)
Yes, prices have increased a lot, but price-to-performance also improves every gen, and in most cases performance per watt too. I hate that the prices are so high, BUT you have to consider that back then they didn't have a 90-series card, nor was COVID a thing, which drastically affected inflation and the cost of goods.

Here in the UK, for example, I could easily get 2 litres of milk for £1.50 back in 2016/17; now it's more like £2 minimum for the same amount. Another thing I remember the price of is 2-litre bottles of branded soda/fizzy drink such as Coca-Cola/Pepsi/Fanta, which were always around £1 minimum (except Coca-Cola, which was only that much when on offer). I know there is a tax on sugary drinks now, but sweeteners aren't taxed like sugar, and back then both sugary and sugar-free were around the same price. Even sugar-free, which I think is supposed to be around 20% cheaper (don't know the exact number), is now around £1.50 for a 2-litre bottle, and Coca-Cola, usually the most expensive common brand, is only that cheap as part of a multi-buy deal (such as 2 for £3).

Inflation has ruined the market, but even then Nvidia is pricing these a little too high, I think. Including inflation, an xx80 Ti nowadays should be around $900 MSRP maximum and the regular xx80 more like $750 max.
@TKSubDude (a year ago)
Power and size limits make this monster of a card with a monster price tag a no-go. It will remain an enthusiast card with few in use worldwide.
@shippy1001 (a year ago)
Not really, it will fly off the shelves. Don't underestimate how much money people are willing to spend on PC hardware worldwide; if this were priced at $2,999 it would still sell out, not in all countries, but in the US and EU for sure. For business use it's a no-brainer: time is money, this cuts a lot of time, so price is not a deciding factor. For streamers and content creators who always want/need the latest and greatest it will also be a no-brainer; basically anyone but people on a budget, and those people should be buying the 30 series, according to Nvidia. If you divide FPS by dollars, the 4090 is the cheapest GPU you can get: a $1,199 3090 Ti, or a $1,599 4090 that has 2.2x the performance? If we start seeing 3090 Tis priced around $700-800, then yeah, no reason to get a 4090, but that's not the scenario today.
@whdgk95 (a year ago)
@@shippy1001 It'll do as well as a niche product does, like Threadripper. That was popular and commercially viable too, but definitely not moving the same volumes as the other Ryzen models. Dedicated servers and workstations likely won't switch to this from workstation cards like the Quadro or MI series; supercomputers that use the MI series aren't switching either, and many of those partners are contractually bound. The RTX 4000 series is for enthusiasts, not for businesses. Streamers, like you pointed out, probably have the most draw, but honestly they can just get a cheaper GPU as a hardware accelerator for their streaming purposes, which is a much smarter decision. So sheep will probably buy these, but they won't move the same volume as previous gens, for sure.
@ggwp638BC (a year ago)
@@shippy1001 It will fly off the shelves because, as Nvidia already said, they are keeping inventory low to create false scarcity. For business... it depends: if you're large enough to need it, you're likely also large enough to be looking at Quadros. Streamers can justify it somewhat, but most professional streamers already run dual setups, to the point this doesn't mean much, and the extra image quality won't be translated to the stream anyway. On top of that, at some point you have to factor in rising energy costs and cooling solutions.

It's a good product in the sense that if you have the money and a very specific use case that benefits from the extra performance, and you don't care about energy or heat, it can be worthwhile, as long as you're not big enough for business solutions. But the market that checks all those boxes is very, very small. A good product under certain circumstances? Yes. A no-brainer? No, especially because the lack of DisplayPort 2.0 is one massive downgrade, especially for the streamers who would benefit most from this card. Add to that that it doesn't support PCIe 5.0, which, you know, matters if you want high bandwidth to move around large, high-resolution images with minimal compression at very high speeds, and buyers might wonder whether they should really invest now or wait for next year.

The 4090 is an enthusiast/halo product. It's something to sell the brand, and Nvidia is making sure to price people out of it in order to become the de facto "premium" brand, like Apple does.
@whdgk95 (a year ago)
@@kenhew4641 Are you unaware that there are separate enterprise product lines at Intel, AMD, and Nvidia? The US government has already invested in AMD MIs, EPYCs, and Nvidia's A100 GPUs. THOSE are enterprise-market products, not the RTX 4000, LMAO. That's exactly what makes these RTX 4000 cards such a niche product, like Threadripper.
@shippy1001 (a year ago)
@@whdgk95 That's a different scenario, and in those Ryzen situations you are absolutely correct, but for rendering, VFX, and even streaming the 4090 is still the much better value proposition. And don't get too caught up on the DP 1.4 thing; it has HDMI 2.1, and even Nvidia knows that PCIe 5.0 and DP 2.0 are a niche. Most professional office environments, like indie game devs or VFX artists, run dedicated PC desks around a warehouse: they simply buy a new system for the best developers, move the 1-2-year-old system to the newer guys, and the 4-year-old systems are sold or traded. A dedicated HW accelerator is too much hassle to deal with; a single powerful GPU is much easier to work with and you get more value out of it. Just to be clear, I'm not defending Nvidia's practices, just explaining that even priced as it is right now, this product is still a good deal for people and businesses who have the money.
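The FPS-per-dollar argument earlier in this thread can be checked with quick arithmetic, using the prices and the 2.2x figure quoted there (the commenter's claims, not measured data):

```python
# Performance per dollar from the thread's quoted figures:
# 3090 Ti at $1199 (relative performance 1.0) vs 4090 at $1599 (~2.2x).
cards = {"3090 Ti": (1.0, 1199), "4090": (2.2, 1599)}

for name, (perf, price) in cards.items():
    # Normalized performance per $1000 spent
    print(f"{name}: {perf / price * 1000:.2f} perf per $1000")
```

On those numbers the 4090 delivers roughly 65% more performance per dollar than a $1,199 3090 Ti, which is the basis of the "cheapest GPU per frame" claim; the conclusion flips if 3090 Ti street prices fall far enough, as the comment notes.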
@MrOffTrail (a year ago)
I’m glad someone is finally talking about this, well done. Sticking with 1.4a for a third generation in a row when DP 2.0 has been out for over 3 years is an odd way to cheap-out on a very expensive card. In addition to 8K displays and high-refresh 4K HDR displays needing the extra bandwidth, and the 4090 being able to make use of it, higher resolution & refresh rate VR headsets are on the way, such as the Pimax 12K (6K each eye, reportedly 200 Hz) that would certainly benefit from DP 2.0. Reports today also allege that AMD’s RDNA 3 cards will not just support DP 2.0 as previously reported, but may support DP 2.1 (which hasn’t yet been publicly announced). Things should get interesting soon. I’m holding off to see RDNA3 vs 4000 series head-to-head real-world benchmark comparisons before I buy.
@vinn7944 (a year ago)
Is there not an adapter for this?
@Sevastous (a year ago)
@@vinn7944 Nope, because the DisplayPort 1.4 output socket connected directly to the GPU PCB is the bottleneck. No adapter can work around that.
@Condleo (a year ago)
How did you guys get those results in Cyberpunk? I have seen other reviews and no one is getting that kind of result.
@ChristopherHallett (a year ago)
In a few years when the 6090 launches, hopefully competition will have forced prices back down closer to $1000, and I might be able to justify buying one so I can start gaming in 4K.
@koalakenbymacy9248 (a year ago)
The 6060 should be able to easily handle 4K. I'll be incredibly surprised if the 6060 doesn't have better performance than the 3090ti
@hunchie (a year ago)
Because of inflation, PC parts are not going to get cheaper.
@ignacio6454 (a year ago)
If Moore's law is dead, be ready to pay $5,000 for the 6090 :)
@steve_bisset (a year ago)
To be fair, I see the prices coming down. They can only keep charging so much when AI upscaling starts to do most of the work. The actual hardware is likely to plateau because there's just no need for monster cards with 600w power draw when the majority of the frames are software/AI based.
@kevin-sj3wt (a year ago)
@@ignacio6454 Yeah, or $500, since no one is buying it with electricity getting more expensive; more GPU power wasted for nothing won't attract anyone.
@thegeekno72 (a year ago)
What I took out of this video are the impressive stats for the 6950 XT: not a perfect card, it has its pros and cons, but the margins are smaller than most think. Knowing it was 50% cheaper than the 3090, and that it trades blows in many titles with the 3090 Ti, which costs almost double, is a big W for gamers and got me really excited for RDNA3, coming up very, very soon! I've been riding with Nvidia so far, but I've been considering jumping ship these last few years.
@davisbradford7438 (a year ago)
And in some applications it slays. 7:54.
@ABaumstumpf (a year ago)
Well: higher rasterization performance, lower ray-tracing performance, better general-purpose productivity, worse when compared against CUDA.
@I_Like__bananas (a year ago)
AMD GPUs really age well because of driver updates. You can see that as a good thing, or just see AMD cards as unfinished when they come out. Vega GPUs became a lot better with new drivers, for example.
@unfortunatelyswagged6226 (a year ago)
@@davisbradford7438 This is almost certainly a mistake; there are many errors throughout the video.
@blackknight4152 (a year ago)
No DP 2.0 support basically makes this gen an extension of the 30 series. The 4090 should be flagship-performance material, yet they won't offer the latest DisplayPort tech. I'm also considering jumping ship after this gen, for sure.
@KirkRoyster (a year ago)
What is your take on the two reports of melting at the connector (i.e., overheating and flames)? Understood that the investigations are just beginning. But are you comfortable that, if the recommended bend radius and distance from the connector are followed (pending upgraded connectors), you would leave the machine running unattended in your home or business? I know, tough question, but I'm still interested in your perspective, as you are clearly loading the card significantly.
@eugenejones1994 (a year ago)
I could listen to ANTHONY all day. He has so much knowledge, yet he breaks it down for us.
@kevinantonowvideo (a year ago)
Watching this on an AGP card!
@julio_adame (a year ago)
Wow! Good on you guys for pointing out the lack of DisplayPort 2.0 support and what that means for this GPU
@frantisekheldak2228 (a year ago)
120 FPS is enough for 4K; 120 vs 144 FPS is not a big difference. It makes no difference as long as you are not blind. By the way, the RTX 4090 has HDMI 2.1, which supports 4K 144 Hz, so I don't know what you're talking about.
@julio_adame (a year ago)
@@frantisekheldak2228 the 4090 performs beyond that. You've missed the point. Good day, sir.
@frantisekheldak2228 (a year ago)
@@julio_adame HDMI 2.1 supports 4K 144 Hz!
@julio_adame (a year ago)
@@frantisekheldak2228 And the 4090 can hit 4K 240 Hz performance, which is what DP 2.0 is necessary for. Do some reading and educate yourself, pls, ty.
@JackMooney (2 months ago)
0:26 - I thought that was a piece that fell off the GPU at the perfect timing hahahaha
@wizardnotknown (a year ago)
I just imagine Anthony shredding on a guitar in the background.
@nathanielelijah5899 (a year ago)
For the Blender benchmarks it would be nice if you guys specified whether you used OptiX or not, as it can give a massive improvement over CUDA; it would be good to see both sets of results.
@zsookah3 (a year ago)
Gonna stick with 3000 series and wait for 5000 series. The nefarious pricing and irresponsible power consumption is ridiculous.
@MrPaxio (a year ago)
I'm still sticking with my 1070 Ti, as it does great gaming in VR and at 1440p; it might be the 8000 series by the time I switch. It always surprises me that these people buy the 3090 and 4090 but still game at 1080p. What do you need that performance for? Rendering hentai?
@dragonbloodeye8661 (a year ago)
I have the 3090. Trust me, the 4090 is like the 2000 series: new technology. The 3090 is the refined version of the 2000 series, which means the 5000 series is going to be a refined version of the 4000 series. I would wait for the 6000 series; that would be the biggest jump in performance.
@ImDembe A year ago
@@MrPaxio The 3090 is the best 1080p card on the market; it gets more than twice the fps of your 1070 Ti in most titles, and even its 1% lows are higher than a 1070 Ti's max fps in modern games, which means stable performance. Plus, buyers don't pair a high-end GPU with a garbage CPU, and that has a lot to do with the stable performance at 1080p (a little less so at 1440p). It's a night-and-day difference, but you need the monitor and CPU to support a higher-end GPU too.
@ImDembe A year ago
@@SweatyFeetGirl Wouldn't say a lot better; the RTX 3090 and 6950 XT trade blows at 1080p, and it all depends on the title. And 1080p is very CPU dependent too.
@ImDembe A year ago
@@SweatyFeetGirl Might be so, but how many RX 69xx/RTX 3090s are paired with an i7 7700 or a Ryzen 2600X? It's lower-tier GPUs like the RX 6600 that can actually end up in an older system.
@vmafarah9473 A year ago
7:49 How on earth does the 6950 beat the 4090 in Creo and CATIA but not in Max?
@shughes57 A year ago
I'd rather have the Prelude tbh, probably better for the environment too.
@benchod3576 6 months ago
An $1800 Prelude will cost you around $3,000 after all the repairs it'll need.
@Giom98 A year ago
In the Cyberpunk benchmark, JayzTwoCents got 74 fps at the same settings vs. your 136 fps; that can't be right, even if you triple-checked. Could there be some massive difference in CPU performance or Windows configuration? I've had cases myself in Cyberpunk where AMD's 'DLSS' didn't actually disable, since you have to hit Apply before changing other settings for it to take effect.
@Lenyeto A year ago
Yeah, this seems crazy. I think I'll just assume JayzTwoCents' numbers are the more realistic ones and still get the card; if it performs better than that, great. But it's weird that the performance is so different.
@melodiclodgings8 A year ago
I feel this GPU would work well in animation, film, medical, and many other industries, rather than for consumers.
@SwordfighterRed A year ago
Isn't that more what the Quadro line is for, anyhow? Or do folks even use those?
@Jaeeden A year ago
@@SwordfighterRed as far as I can tell, the only PCs with Quadro GPUs are ones that are prebuilt specifically for those industries.
@untitled795 A year ago
An A100 or A6000; this is pathetic in comparison.
@froznfire9531 A year ago
@@SwordfighterRed But many people nowadays game and work on the same PC, so a card that's great at both is the way to go.
@realomegamodern A year ago
@@SwordfighterRed They're for servers.
@VVayVVard A year ago
7:47 It's interesting to see how the RX 6950 XT has a massive advantage over even the RTX 4090 in some categories. Much of it is presumably a matter of optimization, but still, it goes to show that performance doesn't exist on a simple continuum. And it underscores the importance of choosing hardware optimized for your own use cases.
@MonikaPecnik 10 months ago
Yeah, really. How is nobody talking about those results?
@knampf9779 11 months ago
13:50 I laughed when he pointed out all the other stuff you can buy for the same price. Good comparison. Very informative video.
@lefthornet A year ago
Great video. Just one thing: when you show DLSS, maybe you should consider showing FSR on the AMD side and XeSS on the Intel side (like GN did in their 4090 review), because these technologies are so common now, and showing DLSS on with no AMD result could lead to consumer confusion. Yes, DLSS technically has a little more quality, but with FSR 2.0 AMD is really close, and even 1.0 isn't that far off with the fixes they put into it... Just saying. I hope whoever reads this has a good day :D
@dx7gaming A year ago
I don't have an explanation for the frame rate differences. If I had to guess though, I would say it might be related to the drivers. Nvidia drivers often contain specific optimizations and profiles for each game. It could explain why some games perform really well, while others perform poorly. Having the best hardware can't fix bad software and bad coding. Often bottlenecks and performance problems are purely software issues (I'm a software developer).
@AtticusHimself A year ago
Oh ok, "software developer". I was this close to calling BS until I saw that impressive title.
@user-ko4zp1wm2i A year ago
@@AtticusHimself He's right and it is basic knowledge.
@AtticusHimself A year ago
@@user-ko4zp1wm2i "and it is basic knowledge" that's the entire point of my comment
@thedeadlygames9716 A year ago
@@AtticusHimself He is right, though; you people seem to focus more on hardware and less on software, which also causes bad frame rates. The hardware is not to blame.
@unkemptwalrus4643 A year ago
@@AtticusHimself then you worded it poorly
@flightsimdev9021 A year ago
I'd love to test this against my A5000, which only draws 275 W under load and has the same amount of memory, although it is ECC.
@Slane583 A year ago
I'm going to be upgrading my system to AM5, so this time around I'm going to get rid of any 2.5" drives and just use two M.2 NVMe 4.0 SSDs to eliminate unneeded cables. For me it seems like no matter how good my case is, I have some random cable running from something that I have to cram somewhere to get it out of the way. Since NVMe SSDs are more reasonably priced now, I'm going to replace my current 500GB M.2 system drive with a 1TB one, and my current 2.5" SATA 2TB game-storage SSD with a 2TB NVMe drive, just to clean things up further.
In my opinion, with how pricey the newer platforms are becoming, being able to run multiple NVMe M.2 drives directly on the board, cleanly out of the way and unseen, should be the norm. Get rid of the clunky old SATA ports altogether, since still having to route cables out of the way is pointless. Some might debate this, saying they need the 2.5" drives and SATA ports, but in actuality you don't if you have a good motherboard that can hide four M.2 drives under its heatsinks. Then you have more premium boards from MSI and ASUS that give you PCIe cards for adding another four M.2 drives on top of that.
As for the 4090, I'm seriously considering one this time around and have no problem saving the funds for it. But I'm still going to wait, as I'd like to see what AMD's 7000 cards are going to be like. If they offer performance even remotely close to this, I'll just stick with AMD as I always have. As for current offerings, I'm still going to buy a 6900/6950 XT when prices fall further, just in case we have another stupid shortage like the ones that seem to be the norm every year.
@amdkillaplays A year ago
JayzTwoCents made a good point about not using the adapters and getting an ATX 3.0 power supply while you're at it, just to be safe. So you also have to factor that into the cost.
@martinjohnston1907 A year ago
Sometimes. More often, at least for me, the point where the price/performance curve bends is what I shoot for.
@RetroBusker 2 months ago
How about editing the new 8K footage from the S24 Ultra and action cameras like the GoPro (13?) and DJI, etc.? My 2070 Super crashes in editing software after more than 3 minutes of 8K30/4K120/5.3K60 footage. I'd be very interested if you made a video on how the RTX 4090 performs with that kind of footage compared to the RTX 2070 Super. And if I upgraded everything, would my 32-inch 4K monitor be fine for editing 8K with the 4090, or would it crash and need an 8K monitor?
@75Krusty A year ago
I've got a few questions about the RTX 4090's power consumption, and I hope it's okay to ask about that here. What is its power consumption when doing low-load tasks like browsing, or just sitting at the desktop? Is it still using 450 watts like when gaming, or less?
@512TheWolf512 A year ago
Looks like a good card to buy in about 7 years' time.
@zhanucong4614 A year ago
No, this GPU is clearly a joke.
@512TheWolf512 A year ago
@@zhanucong4614 After 7 years it should cost less than $400, which would make it not a joke anymore.
@zhanucong4614 A year ago
@@512TheWolf512 By then it would literally be the new GTX 980, and no longer in production.