Тёмный

VRAM - is 8GB or 12GB really enough to play Games in 2023? 

Tech YES City
Подписаться 588 тыс.
Просмотров 131 тыс.
50% 1

Gaming graphics cards (GPUs) can have anywhere from 24GB of VRAM (think RTX 4090 and RX 7900 XTX) to as low as 1GB of VRAM (GT710), though how much is enough? Is the 8GB on your RTX 3060 Ti and RTX 3070 going to cut in 2023? Well let's test out the most popular vram demanding titles in 2023 to find out.
Sponsor Get a Windows 10/11 Pro Key for under $15 (Use 30% Coupon BFTYC) - bit.ly/BFTYC
Windows11(21$):bit.ly/BFTYC11
Office2021(49$):bit.ly/BFTYC2021
Chapters
00:00 VRAM, Utilization vs Allocation.
02:54 Modern Warfare 2, 1080p low requirements all the way to 4K Ultra.
04:23 Hogwarts Legacy Vram Benchmarks.
06:04 The last of us part 1 the absolute chonker for vram consumption.
08:49 Spiderman Miles Morales, Ray tracing needs even more VRAM.
11:00 Conclusion, how much VRAM should your new GPU have? The bare minimum, vs low vs having some flexibility to game.
15:59 Question of the day, who will pay that much for a card with only 12GB of VRAM?
✅Shop Aliexpress WorldWide: s.click.aliexpress.com/e/_etUbxJ
✅Shop on Ebay Worldwide: rover.ebay.com/rover/1/711-53...
❤️Become a Tech YES City member and get access to perks
/ @techyescity
⭐Consider Subscribing here bit.ly/3G20vC1
💯Merch - www.redbubble.com/shop/techyes...
❤️Support Directly - / techyescity
💻Discord Access - / discord
-------------------------------------------------------------------------------------------------
DISCLOSURES: Generally all links tied to products are either Amazon, AliExpress or Ebay Affilaite links, this means that if you purchase a product we earn a small sales commission, which costs you nothing extra (if you end up purchasing a product). All sponsored content will contain the word "SPONSOR" if directly sponsored or "AD." Any additional revenue stream will be disclosed with similar disclosure.
Music Provided by either: epidemicsound, audio library or royaltyfreeplanet.
#VRAM #Gaming #PCGaming

Хобби

Опубликовано:

 

23 июл 2024

Поделиться:

Ссылка:

Скачать:

Готовим ссылку...

Добавить в:

Мой плейлист
Посмотреть позже
Комментарии : 1,2 тыс.   
@MarcoGPUtuber
@MarcoGPUtuber Год назад
I still run 4 GB of VRAM. It works so well. All my games run flawlessly on my 800x600 CRT!
@philscomputerlab
@philscomputerlab Год назад
4MB Voodoo is all you need ☺
@MarcoGPUtuber
@MarcoGPUtuber Год назад
@@philscomputerlab that's right! My voodoo 3 2000 should be all that's enough! Now time to play some DOOM!
@Po5itivemind5et
@Po5itivemind5et Год назад
@@MarcoGPUtuber loooool
@pf100andahalf
@pf100andahalf Год назад
640x480 is where it's at.
@ShinyHelmet
@ShinyHelmet Год назад
@@philscomputerlab Yeah, mine served me well playing the original Half Life at 640x480 on a 14 inch monitor. 🤩
@MovoSt
@MovoSt Год назад
A follow up test on 1080P Ultra and 1440P low/ ultra would be great.
@matthewIhorn
@matthewIhorn Год назад
Exactly!
@dgonsilver-fs5gf
@dgonsilver-fs5gf Год назад
100
@SomeSloan
@SomeSloan 9 месяцев назад
Would be great to see a vid on vram usage at these resolutions! Based on this videos results, I think it would be safe to assume that if you want to play games at high-ultra settings on 1080p without worry- 10-12gb vram should be a safe bet
@upfront2375
@upfront2375 6 месяцев назад
@@SomeSloan For today 8 is enough at 1080p. For yrs to come, you'd need to either down the textures OR upscale anyways to stay at 60fps.. both of which will greatly lower the vram usage. There're 2 reasons why 3060ti-4060 has only 8gb. 1.It's enough for the perf. power at lower res. high fps use case. 2.So ppl who need more go buy much more expensive cards. The 12gb on 3060 and 16 on 4060ti aren't useless overall for some creative scenarios, but for gaming?? sht it's gonna be exactly like 4gb 750ti.... like a big D on a catholic priest! lol
@DJ_Dopamine
@DJ_Dopamine Год назад
I game on a 1080p display. Not having issues with 8GB. Anyway, I'm always happy to turn things down a notch (or two) from Ultra if necessary. The visual difference is usually marginal.
@wireless1235
@wireless1235 Год назад
I think there needs to be a part 2, which includes the impact of specific settings to vram usage and also includes the use of lower tier cards.
@dreamcat4
@dreamcat4 Год назад
if there is a part 2, it would be nice if byan can bring up a table for which of the games being tested are re releases from the latest console generation (who have *almost* 16gbs of addressable vram, minus the game and the operating system.... = so are more like 14gbs max.) versus modern game releases which are either originally last generation console releases? or pc only releases? do such games even exist anymore? i mean ever since xbox division already purchased the entire aaaa* games industry pretty much (near enough) and how much addressable vram did the last gen consoles have? like the ps4... ah only 5.5gb. about same for xbox one. then xbox 1x is 12gb max. and ps5 is 16gb max. but minus the shared memory overheads.
@gozutheDJ
@gozutheDJ Год назад
bro, it's called, do it yourself.
@simsdas4
@simsdas4 Год назад
I second this, sure you can run high settings but expect to drop textures and shadows for example.
@SuperConker
@SuperConker Год назад
This is what I think nVidia should have done for Vram on the entire 4000-series: -4050/4050 ti 12 GB -4060/4060 ti 12 GB -4070/4070 ti 16 GB -4080 16 GB -4090 24 GB Basically not a single model with under 12GB of Vram.
@Terry1212
@Terry1212 Год назад
The 4070 and 4070 Ti have 12GB of VRAM
@SuperConker
@SuperConker Год назад
@@Terry1212 I know, i'm just saying that's what nVidia SHOULD have done with the Vram.
@SherLock55
@SherLock55 10 месяцев назад
What's the point of 12gb on a 4050 LMFAO, it's not even fast enough to run at higher resolutions and settings where it would be needed.
@SuperConker
@SuperConker 10 месяцев назад
@@SherLock55 The 4050 would perform about the same as the 3060, which has 12 GB of VRAM. The 3060 again, performs the same as the good old 1080 ti (with it's 11 GB of VRAM). There are already games out in 2023 that can eat through 12 GB of VRAM at 1080p (and 1080p is not even a high resolution). So to release new cards in 2023 with as little as 8 GB of VRAM is a joke. Starting the lower-end models at 12 GB is perfectly fine.
@SherLock55
@SherLock55 10 месяцев назад
@@SuperConker The only games eating 12gb of VRAM at 1080p are unoptimized pieces of trash not worth playing, don't get it twisted. Just because some devs are lazy or incompetent doesn't mean you actually need so much VRAM at such a low resolution.
@peterkeller7880
@peterkeller7880 Год назад
Its great to see you do this. This was needed to see. Not everyone games on 4k ultra. This helps people really make an informed decision when getting a new gpu especially when coming from older generation of gpu cards. Thank you for your hard work.
@thejollysloth5743
@thejollysloth5743 Год назад
I’m gonna grab a 16GB RX 6800 off EBay for £350. Since I don’t give a toss about RT it will last me at 1080p until the next console generation comes out. I don’t mind having to turn down a couple of settings like shadow quality, fog and other weather effects in 5 years as long as I get the top levels of anti aliasing, render distance, and texture quality. I’ve got a feeling a used RX 6800 16GB will be fine for that at 1080p for many years to come. I also think that a Ryzen 7 5800x will be more than good enough for 1080p ultra or high settings for 5 years or so, and they are so cheap now with a decent B550 board and 32GB of 3600 CL16 RAM. 1080p is fine for me. I only use a 24 inch screen so I don’t really notice the pixelation I would with a 27 inch or larger screen. A lot of my friends still play CSGO at 900p to get the most FPS they can and the lowest latency. And these are pro level players who have 360hz or higher monitors downscaled from 1080p to 900p. A I really can’t notice much of a difference between 1080p and 1440p, but that could just be me…or the fact I’m so used to that resolution.
@sc337
@sc337 Год назад
Hi Bryan, hope to see a follow up video by using a real 6GB, 8GB and 12GB cards on the exact same games. I bet the RAM utilization will be much different from the results of this video. Love your vids. Peace
@techyescity
@techyescity Год назад
No worries man, will definitely be doing that for you! This for me personally is a whole journey that I want to uncover and learn about. I will make this a whole series. However starting out with the two 'unlimited' vram cards is for me a base case to then inference further data against.
@naturesown4489
@naturesown4489 Год назад
There are a lot of channels and sources that have done those comparisons, they're very similar
@sc337
@sc337 Год назад
@@techyescity much appreciated Bryan. Keep up the good work! 👍👍
@laszlodajka5946
@laszlodajka5946 Год назад
Yeah. Im having a 10gb 3080 and the last of us warns me of it when i push all the settings up to ultra but still runs well. So the 10 gb may still fall into the ok zone for now. May be interesting to see on what settings u can get away with less vram.
@Peter.H.A.Petersen
@Peter.H.A.Petersen Год назад
​@@techyescity Also, I don't think even a 3070 ti could run 4K Ultra with Raytracing ON at proper frames, if it had 24GB and so isn't it irrelevant if it has enough vram to do it, if it can't do it anyways?
@masterkalel06
@masterkalel06 Год назад
Hey Brian. Any chance on the V Ram tests, can you do 1080 Ti and 2080 TI against similar performing 8 GB cards. Since you're all about the used price performance, I'm curious if the 11 Gigs make those cards perform better going forward.
@pkpnyt4711
@pkpnyt4711 Год назад
I think what were kind of missing with this test is we're using the top 2 high end cards from red and green. These cards have the highest bandwidth and throughput compared to the mid and lower end cards. These things might be fast enough to not having to load as much ram as they are fast enough to deal with it. How would ram usage look with a 4070ti and a 6950XT or even a more mid range offering? Im not so sure, but it's a legit question I have in mind.
@pf100andahalf
@pf100andahalf Год назад
Faster cards don't use less vram.
@SirJohnsonP
@SirJohnsonP Год назад
Yeah it can only get worse with those. But yeah Nvidia is now focusing on AI gpus, Musk ordered 10.000pcs for twitter, and OpenAI ordered more than 30.000pcs. So now they have a perfect excuse to lower the pc gpu production, and focus on AI gpu market... Too bad latex and leather fans are still buying nvidia gpus, giving them a reason more to produce 8gb 500$ gpus in 2023
@standarsh8056
@standarsh8056 Год назад
Not how it works. Faster memory = more frames, but if you lack the memory buffer in the first place it will still tank performance
@sc337
@sc337 Год назад
From what I observed, some games uses less VRAM on lower VRAM cards. For examplea same game with same settings, a 4GB card may show 3.5GB utilization while a 8GB card may show 4.5GB utilization. I think it makes more sense to test the games with a real 6GB, 8GB & 12GB cards. Anyway, still appreciate Bryan's effort
@pf100andahalf
@pf100andahalf Год назад
@@sc337 Vram will overflow into system ram. In your example of a lower vram card using less vram, it's using a hell of a lot more system ram.
@Beezzzzy_
@Beezzzzy_ Год назад
Any reason 1440p is left out? I think this will be a good data base to put together. Nobody really discusses VRAM, the 1080ti is still a solid card cause of its 11GB of VRAM being released 6 years ago, and we're still having cards come out with 8GB or less, 12GB should be the baseline sold in 2023, not 6GB-8GB anymore.
@HxR-eSports
@HxR-eSports 11 месяцев назад
Yea there a reason. It would have shown results that went against his agenda.
@edeka3
@edeka3 11 месяцев назад
​@@HxR-eSportsdo you think 8gb is enough to run at 1440p or 1600p? A little future proof?
@sebastienhebert6457
@sebastienhebert6457 Год назад
New to your channel and I love it. You it the sweet spot pragmatic technical useful information. Thanks for that last part on 1080p high settings.
@XLoad3d
@XLoad3d Год назад
Hey Tech Yes I have a question. I was thinking about swapping out my current 1070 gtx GPU out for a 1080ti. With that it's just plug and play right? anything higher like a 2070 gtx won't be compatible with the chipset right (I have a i7-7700 CPU @ 3.60GHz)
@maxdema115
@maxdema115 Год назад
Too few games tested (and one very broken like TLOU) to have a reliable analysis. And would have been great to see 1440P results, since the RTX 4070 has been designed for that target.
@TheSakrasta
@TheSakrasta Год назад
I did not expect the impact of resolution on vram usage to be that small compared to the quality settings. It might be very interesting to have a bunch of tables for some of the newest titles, which show vram usage at 1080/1440/4k + low/medium/high/ultra settings. Because a lot of games look very similar at high instead of ultra. So with such a table you could find the sweetspot settings for your personal vram amount. If dropping from 1440 to 1080 only saves you 1GB of vram, but going from ultra to high saves you 2GBs, you would most likely have a better looking game at 1440 high compared to 1080 ultra, while also using less vram.
@angrydragonslayer
@angrydragonslayer Год назад
The textures are still the same quality, you just use less of it on the screen
@bigturkey1
@bigturkey1 Год назад
just use dlss. i never have to turn down settings. i just changed dlss settings.
@dreamcat4
@dreamcat4 Год назад
yeah i agree somebody out there, if they are doing tables include its primary platform it was targeted for. whether a console, then what usable vram that console had. or if it was a pc only game. because clearly that is useful information to include in such tables... since that is the common underlying reason for these escalating vram requirements. each new console generation [edit] and lets hope sony ps6 will not come with greater than >24gbs of gddr.... or we will all be in trouble! ha
@philscomputerlab
@philscomputerlab Год назад
For Windows XP Retro Gaming, less is more. Some games have issues with large VRAM, best to have 1 GB (GT 710 FTW) 😅
@Rabbit_AF
@Rabbit_AF Год назад
What if video card companies made cards that shared system memory again. I was a bit thrown off when a S3 Graphics card I was testing was doing this. ATI had a feature like this called Hyper memory.
@ShinyHelmet
@ShinyHelmet Год назад
I've still got a 256mb 7600 GT for all that retro malarky! 🥰
@devilzuser0050
@devilzuser0050 Год назад
I selld a gtx titan cause heroes2 & nfs2se doesn't start on it under XP. (6gb vram)
@necuz
@necuz Год назад
@@Rabbit_AF That's exactly what the Windows Video Memory Manager is doing, that's why games typically only run really poorly instead of crashing when you run out of VRAM.
@bryanwages3518
@bryanwages3518 Год назад
​@Rabbit _AF amd vega cards can do this. It's called hbcc. You can expand your vram with your system ram.
@Mythpathz
@Mythpathz Год назад
Sir I have questions do you think triple a game's like Star Wars Survivor and Dead Island 2 which minimum specs start with i7 and 1070ti can work for i5 and 1050ti specs or not ?
@timberwear369
@timberwear369 Год назад
I definitely would like you to include 1440p High Settings. You only tested the two extremes, 1080p vs 4K and Low vs Ultra. 1440p High for me makes much more sense. But maybe with highest texture settings.
@MsNyara
@MsNyara Год назад
1440p Ultra aiming for high frame rate tends to be the same as 4k HD High/Ultra 60FPS discussion.
@Kapono5150
@Kapono5150 Год назад
So happy to see Nvidia users stand up for themselves on the 4070. Even fake frames doesn’t get them to open the purse.
@LeJimster
@LeJimster Год назад
Honestly, the frame generation and even upscaling tech feel scammy to me. Especially when they're advertising it in their benchmarks. I much prefer running my games at native resolution.
@Verpal
@Verpal Год назад
@@LeegallyBliindLOL TBH I do know some people prefer the native with jaggies, but I felt most people who claim upscaling is a scam is simply saying that because FSR 2 isn't remotely competitive for now, they don't hate the tech, they just hate NVIDIA.
@LeJimster
@LeJimster Год назад
@@LeegallyBliindLOL Well I can't use DLSS because I'm on AMD. But I'm pretty sure both DLSS and FSR have weird ghosting issues in movement and also strange artifacts. I only use FSR for performance reasons and at the resolutions I'm using it I notice a big visual fidelity drop (edited, because of brain performance failure). DLSS may be better, but I still think it's getting to the point where they aren't producing faster GPU's but faking performance through these techs and artificially locking the software to newer cards. The only way I would like to use this tech would be in reverse, since taking higher resolution and downscaling it produces a crisper image.
@mr.obeydoge5266
@mr.obeydoge5266 Год назад
Only one thing I can say. These companies dont care about us and I dont know why bother defending but I digress. Here is the main thing people, fake frames is fake frames. Native is definitely the way to go since it truly measures the raw capability of the component. There I said it. Dont allow them to control the market for so long and support their bad habit of setting insane value on products that are supposed to be produced in reasonable prices.
@LeegallyBliindLOL
@LeegallyBliindLOL Год назад
@@LeJimster so, you in reality, don't have a real world reference point. From my experience, even at 4K, FSR is noticeably worse (usually blurry) and I don't notice any artifacts with DLSS unless I use performance mode in some titles. I modded in DLSS for RE4 and it was a night and day difference. You can believe me or not. But in the end, RU-vid doesn't convey the differences well enough.
@altun8310
@altun8310 Год назад
Hi from Canada. Thank you for the video. Timely analysis and I'll keep an eye out for your future ones on this topic. My suggestion is that you should add 1440p high settings as benchmark. That represents the upgrade path for the majority of people who still play at 1080p. Also, I also noted in youtube videos the ram usage/allocation between nvidia and amd. That would be interesting if you could investigate and explain!
@bradb2012
@bradb2012 Год назад
Suggestion for a part 2: What impact on visuals does going from, low texture detail to ultra have? Is it better to have 4k low texture detail, or 1080p ultra texture detail... Where is the happy medium?
@Thezuule1
@Thezuule1 Год назад
I use my GPU for VR and even with the relatively low memory requirements of most VR games, you still need to render above 4K and even 12gb is likely not enough now, and certainly won't be in a few years.
@gozutheDJ
@gozutheDJ Год назад
Vr is its own thing.
@darkkillex7220
@darkkillex7220 Год назад
Same here, I got a 3080 thinking I would be able to easily run VR with it. Except I bought it back when they still had only 10GB of VRAM and as soon as I try to run any game that's not a dedicated VR game in VR I'm just limited by the VRAM
@anthonylong5870
@anthonylong5870 Год назад
Bro Vr is at best 1080 lol , most is only 720 per eye....Your not rendering 4K
@Bos_Meong
@Bos_Meong Год назад
@@anthonylong5870 Actually its very close to 4k. quest 2 has resolution totaling 3664x1920 vs 4k 3840 x 2160. despite all of this hl alyx only consume 7gb vram all setting maxed out. Its all about how optimized the game, we need to stop supporting shitty port games
@Thezuule1
@Thezuule1 Год назад
@@anthonylong5870 each eye is more than 1080p dude..
@bctoy2779
@bctoy2779 Год назад
DLSS3 Frame Generation also requires more VRAM. So with 4070Ti, you can already run into a situation where the card can do 4k60 or better but runs out of VRAM and stutters.
@JustGaming24
@JustGaming24 Год назад
its not a 4k card tho
@brunoutechkaheeros1182
@brunoutechkaheeros1182 Год назад
@@JustGaming24 so why the hell people say 4070 ti beats 3090? wasnt 3090 a 4K card? lmao
@JustGaming24
@JustGaming24 Год назад
@@brunoutechkaheeros1182 more or less same performance is not considered a 4k gpu because of the 12gb vram the 3090 has double the amount.
@vargasmongo3435
@vargasmongo3435 9 месяцев назад
sorry I saw thw whole video, are you using an Audio Technica AT2020? it sounds really good
@puddingfoot
@puddingfoot Год назад
Hey! Love your content. Ive also been living in Japan for the last 17 years or so and sometimes check out janpara for deals, and totally agree with your stance on buying used parts for viable builds for 95% of games. However, with VRAM requirements being 12gb for high settings in new AAA titles, I am torn between buying a few cards: rx6800 ' ~48,000 yen, used rx6700t- ~38,000 yen, used rtx 3070* ~43,000 yen, used *Im only considering the 3070 for the video AI upscaling feature in VLC/chrome/edge (Video Super Resolution). What a killer feature! Does AMD offer similar features or is it in the works? I'd go with AMD in a heartbeat if so. Nvidia recommends the 3070 for the highest level (setting level 4) of Video Super Resolution. However, some 3060ti users report using VSR at level 4 without issues. Do you have an opinion on this? Maybe the technology is too young. hope your allergies arent so bad now. This year has sucked for cedar allergies in Japan but we are in the home stretch to golden week and less pollen in the air! ganbare🤘
@mattfarrar5472
@mattfarrar5472 Год назад
Would have been good to see you run a 6 or 8gb card in those titles to see what it would do on different settings...
@Art_Vandelay_Industries
@Art_Vandelay_Industries Год назад
What's crazy to me is that the graphic fidelity doesn't actually looks that good, considering the requirements. I think optimization should be more of a focus nowadays. That would also help with the insane prices for hardware atm.
@ShinyHelmet
@ShinyHelmet Год назад
The thinking seems to be that they develop for the hardware available on the new consoles and then just port it to PC.....and hopefully patch it later!
@sven957
@sven957 Год назад
They optimize for consoles which have 16GB combined memory. Yes, they COULD optimize it better but that costs a lot of dev time and money which they would rather invest into other parts of the game which makes total sense, considering consoles make up most of their revenue. Although there are titles like TLOU which are actually REALLY badly optimized. But other than that the only party to blame here is nvidia who decided to build planned obsolescence into their cards.
@grlmgor
@grlmgor Год назад
@@sven957 Well if they don't optimize, then don't buy their game.
@sven957
@sven957 Год назад
@@grlmgor Sure if you dont want to play uh - pretty much all of the major upcoming titles. Again you cant blame the devs in most cases (you can in TLOU) - 8 GB was first seen on a GPU in 2015. Nvidia did this to make you upgrade when their next overpriced generation drops. 12GB right now is barely enough to max out games (if I'm paying that much for a fucking GPU you better let me max those settings!) just like how 8GB two years ago was barely enough. The 40 series cards will run into the same issues in max 2 years.
@peterpan408
@peterpan408 Год назад
For 1080P there is certainly a fidelity limit set by the pixels, that could be optimized for in the engine.
@amidk75
@amidk75 Год назад
how to set vram utilisation in afterburner? didn't seen any variables for that...
@BigBadZig1980
@BigBadZig1980 Год назад
so would be the 2080super 8gb or the 1080ti 11gb the better choice (have both)? what are your thoughts on that? the slightly more powerful 2080s, which runs earlier in the vram cap or vice versa with the 1080ti? or, as we are here at the most in 1440p territory, does it not matter?
@masterkalel06
@masterkalel06 Год назад
That's what I'm wondering too. I picked a 1080 ti a few months ago over the 2080s based solely on the VRam, and this was before all the VRam controversy.
@projectc1rca048
@projectc1rca048 Год назад
LOL! When you said VRAM - Magedon I literally laughed out loud, love it. Only @Tech Yes City man. I imagine with all these latest and greatest AAA titles raising the minimum pc requirements to run their games, especially the games that will be using Unreal Engine 5, 12gb of vram will be the new minimum/standard. Of course it will depend on the resolution and settings people play at. Great topic for a video and appreciate all the hard work my guy. Keep up the great Tech Yes City content.
@HairyScrambler
@HairyScrambler Год назад
I think for the vast majority of games worth playing 4 GB will hold up for the near future, as long as devs can optimize their games. It’d a shame the 1060 3 GB is already starting to become unplayable 1080p in games like 2042.
@YoStu242
@YoStu242 Год назад
The same formula seems to apply in software development as in life in general, that if there is space in the apartment, it is lazily allowed to fill up with junk and you don't bother to clean it, let alone think about whether you even need all that junk
@Thehughesgaming
@Thehughesgaming Год назад
How much of a difference does 1440p make compared to 4k? Since I’m looking for 1440p ultra with Ray tracing
@telekarma
@telekarma Год назад
Devs can use more hardware resources with new/current gen only titles and this is what we get. From what I've seen 4k ultra and 1080p ultra VRAM usage delta isn't too big, about 1-2GB difference. That doesn't bode well for lower VRAM cards even if they are otherwise fast enough.
@Hostile2430
@Hostile2430 Год назад
I upgraded my GPU to 1660Super 6GB only recently and i already feel outdated trying to run some current games at high settings i exceed or consume most of my Vram and suffer from stuttering. Crazy to think just a few years ago 8GB vram was considered overkill and now its becoming bare minimum requirements to run most modern AAA titles at acceptable framerates.
@zicksee0
@zicksee0 6 месяцев назад
dawg 1660 super isn’t meant to run games at high settings lol.
@MrMengAmok
@MrMengAmok Год назад
I have a question. I have an 5800x3d, an 2x 16gb ddr4 3600 cl 18 ram. Will this speeds run with 4 og this sticks and is there any inptovement on the vcache ryzen chips? Does anyone know this or better have tryed it out?
@wayland7150
@wayland7150 Год назад
RX VEGA has 8GB but there is an option in the driver to add 4GB of system RAM to the VRAM. Battlefield 2042 seemed to expand itself into the now 12GB VRAM but I can't say if it really helped.
@SlowHardware
@SlowHardware Год назад
Brian can you do a video on the radeon vii vs the 2080 in 2023? I'm curious how it stacks up now with it having 16gb vram
@mateyv
@mateyv Год назад
or 2080 ti vs 3070
@grlmgor
@grlmgor Год назад
A 2080 only has 8GB
@Kryptic1046
@Kryptic1046 Год назад
@@mateyv - I've seen at least one channel (I don't remember which) comparing the 2080ti vs the 3070 in some recent titles and the 3070 was getting trounced in some of the tests by the 2080ti due to the 3070's VRAM limitation. The 3070 was as fast as the 2080ti, until it wasn't, due to the 8GB of VRAM.
@bryanwages3518
@bryanwages3518 Год назад
I have a radeon vii and my cousin has a 2080. In most new games I can play at higher settings than he can and I don't have the frame drops like he does. We play a ton of warzone and his frame times are awful.
@SlowHardware
@SlowHardware Год назад
@bryanwages3518 thanks for the info, looks like I made a good choice getting one :)
@IamMarkSmith
@IamMarkSmith Год назад
In my opinion, Nvidia is using tech like DLSS to be able to pinch on their physical hardware specs to not only charge more relative to the mindshare they have in the GPU market, but increase their profit margins all around. AMD is the closest they have ever been as competition to Nvidia with their current crop of RDNA 3 cards. If they can get the price to performance right on the forthcoming 7800 and 7700 models we will see them make inroads into Nvidia’s market share. We all win when there’s competition in the marketplace. I’m not a fanboy of either company, but I am a fanboy of better value for my money.
@leandroperalta
@leandroperalta Год назад
Have you tried Atlas OS (Windows de-bloater)? Keen to know your thoughts.
@Chris_94
@Chris_94 Год назад
With 6gb of vram being the minimum for how much longer will the 3060 mobile be useable for at 1080p and 1440p with ray tracing off and dlss on?
@YouOnlyLiveOnce...
@YouOnlyLiveOnce... Год назад
Good data. Please include 1440p settings next time.
@f.ferenc88
@f.ferenc88 Год назад
1080p then 4k? Where the fucks 1440p ??? That is todays golden standard....
@Mr360alan
@Mr360alan Год назад
So what is going to be the vram in 1080p max cuz I don't have a 4k monitor
@nine0ten771
@nine0ten771 Год назад
Hey Brian, do you think the 4090 uses less vram because it is faster than the 7900?
@kasmidjan
@kasmidjan Год назад
Ngreedia can Pay their investors with Cheap Leather jackets If they keep stingy with VRAM
@Rizzlas
@Rizzlas Год назад
I have a 3070 8G , and I can confirm with Hogwarts Legacy in 1080p Ultra settings , I have some trouble having a smooth gameplay all the time (I have to use some optimization tool to do it) , but my friend have a 3060 with 12Gb of VRAM , he runs it in high-ultra settings got a lot smoother experience than me. Pretty sad to be honest :/ I'm thinking about maybe sell my 3070 to get a amd card of the same grade with more VRAM :)
@95928225
@95928225 Год назад
I sold my 3070 for 400$ and bought a new 6700xt and it is wayy better. No vram crashes like in re4, deathloop, forza horizon, howgarts, last of us
@Rizzlas
@Rizzlas Год назад
@@95928225 yeah but i'm doing a lot of render on adobe premiere and loosing cuda acceleration is not possible
@simon89oi
@simon89oi 8 месяцев назад
​​@@Rizzlas 2080ti should fit your needs than
@hrayz
@hrayz Год назад
How do you get VRAM Utilization, vs Allocation, to show?
@stevenwex8966
@stevenwex8966 Год назад
I have a 32 inch 720p TV and thought it might be time for 4K upgrade until I saw the price. My TV does 768p for PC, should I game at 1080p on a 768p tv? If I got a 4k I would aim for 1440p, couldn't buy a 1440p or a tv that wasn't a smart or google TV?
@DJ_Dopamine
@DJ_Dopamine Год назад
I used to game on a 32" 720p 60Hz TV myself. It would accept a native 1080p input/signal, but I decided that (apart from watching movies via Blu-ray) 720p resolution was looking about the same (as after all, it's the native resolution). Unfortunately native 1440p TVs don't exist. When I eventually went that route resolution-wise, I had to get a PC monitor. Though I can tell you that 1440p looks good on a 48" display (TV). Even though it isn't integer scaling. But make sure any TV accepts a 1440p input/signal natively, otherwise you'll need to output at 4K from the PC with "GPU Scaling" enabled, to get it rendering games at 1440p. Works fine for most games, but not every single one. Many modern games have resolution scalers built in which avoids such hassle.
@buda3d2007
@buda3d2007 Год назад
I use Blender where Vram is king on larger scene files, once you run out of vram your card might as well be the equivalent of a great sports car spinning its tyres working 10 times as hard to get the job done when it would only need to do it once had it had more vram.
@furynotes
@furynotes Год назад
Even with character portraits 12gb is recommended.
@jqwright28
@jqwright28 Год назад
I'd say you're right about games being designed for 4k and then scaled down. Also games like TLOU remake that are next gen ps5 only titles, probably also are designed for 16gb of unified system memory or whatever it's called, so they probably will run best on anything that can give them 12-13gb on the gpu.
@gozutheDJ
@gozutheDJ Год назад
ALL games have been this way for a while. doom 3 had a level for the maximum quality, uncompressed textures and then all the lower quality settings were scaled down from there. that's why games don't look like literal mud on low settings these days.
@geraldsalas3213
@geraldsalas3213 4 месяца назад
what about gaming and streaming with entry level 1440p like the RTX 3060ti and 3070 w/ 8gb only. will they run out of vram and crash if you game and stream in OBS?
@geraldsalas3213
@geraldsalas3213 4 месяца назад
if im just gonna play 1080 on those 1440p entry cards is 3060 12gb a better option w/ 12gb vrams?
@lilsus3125
@lilsus3125 Год назад
very helpful video thank u god bless
@bledboost
@bledboost Год назад
There is still one way to play all the latest games even with 4GB of VRAM. Just play at 720P! In many cases you will get a better experience playing at 720P High than 1080P Low. This is especially true on gaming laptops since the screen is smaller so the difference in resolution is less noticeable.
@roadrash2005
@roadrash2005 Год назад
I have a 4K tv, I can’t go backwards lol I tried it was painful
@sololoquy3783
@sololoquy3783 Год назад
but you effectively gimped your card at that point... so yay nvidia!
@bledboost
@bledboost Год назад
​@@InnerFury666 Well you wouldn't need to play pixelart games or even the average game at 720P because they don't have high VRAM requirements. I'm talking about the latest high end games that choke when you don't have enough VRAM. I'm just saying 720P is still an option to make those games very playable.
@Cogglesz
@Cogglesz Год назад
I'm rocking 8gb on my 3060ti, performance is fantastic for the price. Noticed my only cap seems to be 4K, Doom Eternal with ultra Nightmare texture pooling (everything else can be 1440p Ultra Nightmare), Forza 5 Eats it all up despite being able to run 120 v-synced. Game would pause and complain of low bandwidth (Much more than the Series X funny enough) I've wanted to just play at 60 with higher geometry to match the X. Turns out i''ll always bump into this issue, It's annoying my 64gb of ram is basically doing nothing. (Last of us managed to get 21GB of usaged though so gg's) Honestly i think 12gb is the new 8gb card. Midranged Vram amount seems to be inflating like our currencies. I feel some blame on porting has to be stated. RE4 medium textures are worse than PS4 somehow and it'l eat up a 8gb card happily. It's kinda sad when you've a lot of Resources but what holds you back is 8gb of Vram? Call me a boomer but i always felt 8 would be perfect for gaming, In the past we only really seen 12+ in professional cards only a few years back.
@Toulkun
@Toulkun 10 месяцев назад
It comes down to trash optimizations too
@Tomothing
@Tomothing Год назад
I tried to SCD keys with you code, the site gladly accepted my money but when it came time to use the code the code I couldn't access it, the website just keeps redirecting to the home page. It's been a few days, I was able to access the code once but I didn't use it then. I thought you'd want to know the site doesn't seem to work at the moment.
@cajuudoido
@cajuudoido Год назад
The important thing is if the share memory + dedicated usage surpass the amount of vram of your card you very likely to have issues. Some games uses a lot of shared memory, ie ram, even not fully utilizing all available vram. So is not only about checking if it using a low amount of shared memory, to determine if you're out vram, but the sum of shared + dedicated.
@prosecanlik4296
@prosecanlik4296 Год назад
This is ONLY for those singleplayer titles, where you only aim for 60 fps at high or ultra at 1080p or higher. I usually don't play those games, only multiplayer shooters, esports like CSGO, so for me, 8gb vram would do just fine. Planning to get RX 6600 for that matter
@danielkowalski7527
@danielkowalski7527 Год назад
rx6600 undervolted eats only 80w ^^ Idk why but colours are way better on rx6600 than my old 1650
@prosecanlik4296
@prosecanlik4296 Год назад
@@danielkowalski7527 will try to undervolt it if I get it one day
@ChiekoGamers
@ChiekoGamers Год назад
I'm still enjoying video games at 1080p high settings. I don't see the point of Ultra graphics.
@relucentsandman6447
@relucentsandman6447 Год назад
Thank you for talking about VRAM and im also interested in the difference between allocated and utilized, and how being low on VRAM effects the card
@HardwareAccent
@HardwareAccent Год назад
Can you test Resident Evil 4R at max settings, please? Does it really use over 16GB?
@RobertJianu
@RobertJianu Год назад
I still run 4GB VRAM, both on my gtx 1050ti PC and my rtx 3050ti laptop. The 3050 has decent performance (like a watercooled and overclocked gtx 1070 desktop that a friend of mine has) but the lack of vram is starting to show. At least I don't game that much anymore. My next gpu will probably be from AMD tho
@Verpal
@Verpal Год назад
Ampere desktop is suffering from botherline insufficient VRAM already, and yet for some reason NVIDIA decided to squeeze the low end ampere laptop even harder, people who buy low end stuff need it to last longer, yet NVIDIA decided to screw them in particular.
@sergeleon1163
@sergeleon1163 Год назад
Yeah I was on GTX 1050ti and the 4GB really started to limit me, I upgraded this week for €250 to a RTX 3070 8GB and even when on specific games like here shown could be limited I will drop down settings as I'm aware of this 8GB can be hampering (in the future), while many games it will still be okay. But when on a budget both NVIDIA and AMD are playing gamers for too high prices and forgetting about people on a budget.
@Killersnake432
@Killersnake432 Год назад
I upgraded last month from a 1050ti to a RTX 3060. I was GPU and vram limited now I can play stuff I used to play far better and have the vram space for games like RE4 Remake that I can do crazy high settings with. I would had gone for 3060ti but that 8GB vram buffer turned me away from it.
@TechHarmonic
@TechHarmonic Год назад
I remember briefly having a 3050 ti laptop and it got on my nerves fast. Even with older games, maxing them out at higher resolutions I would get horrible frame drops because of the vram running out. I returned the Legion s7 since it was $1k and it felt super overpriced for that performance.
@RobertJianu
@RobertJianu Год назад
​​@@TechHarmonic damn, I know how it feels. The 3050ti is decent for 1080p even with most recent games. You can't go higher than 1080p or bump the graphics too high because the vram will make your experience horrible. I kept it because I needed portability and the good part is that I got it for around 500$ at the time and it has a Ryzen 7 5800H, 16gb ram, 512gb nvme ssd and a 10 bit 165hz display. It's pretty good for my not so demanding games (FH5, RDR 2, God of War, Sons of the forest etc) and media creation (mostly editing in Photoshop since the screen has excelent colors, Sony Vegas and making documents) but I wouldn't recommend this GPU for a true gamer, 4gb vram is just unacceptable. Always aim for at least a xx60 series card since they age pretty good or just go with AMD (lower prices and higher vram than nvidia)
@stratuvarious8547
@stratuvarious8547 Год назад
I know I didn't spend hundreds on a new GPU to play at low settings, which is why I bought a RX 6900 XT. Reasonable price, 16 GB of VRam, it was the best choice in my price range.
@javier-xe6bv
@javier-xe6bv Год назад
Great video as always my friend.
@SonGoku-97
@SonGoku-97 Год назад
Took a break from tech for a while Man is it good to see your face again
@darkkillex7220
@darkkillex7220 Год назад
I've definitely noticed the VRAM on my 3080 10G being a limiting factor in quite a few games recently...
@greenbow7888
@greenbow7888 Год назад
The 3080 10GB card could not even run Far Cry 6 HD textures, within a month of the 3080 release.
@thomassmith9362
@thomassmith9362 Год назад
Well it is now almost 3 years old, you shouldn’t be expecting to max out games on it. I’m going on along nicely with that card, 1440p@high on the last of us works great.
@Bos_Meong
@Bos_Meong Год назад
try to run msi afterburner and see if your vram is actually eating up or not, dont just "noticed" lmao
@kaythree8302
@kaythree8302 Год назад
@@Bos_Meong any decently competent person would assume that’s what he meant by “noticed”.
@Bos_Meong
@Bos_Meong Год назад
@@kaythree8302 decently competent? Thats my line for you. I bet 100% was just placebo, he was just assuming and not really tested it out himself. Because me rn running cyberpunk at overdrive and it only eat 9Gb of ram hows this a limiting factor? Also 3080 cant do overdrive anyway so at ultra its gonna consume far less vram, probably 7gb. Maybe he was playing trash of us, which is a badly optimized game
@HyperBawl
@HyperBawl Год назад
Amazing content as always ! I'm sure my 6700xt will last loooooong
@rusty-gaming1988
@rusty-gaming1988 Год назад
I have rtx3070ti 8gb with i5 13600k will it age for 3 or 4 years ?
@buggiesindustries7550
@buggiesindustries7550 Год назад
Definitely an interesting video, thanks for making it!
@barrysloas277
@barrysloas277 Год назад
1 % of gamers are playing at 4k. Where are the 1440 stats that more people are gaming at
@Willbme4EVA
@Willbme4EVA Год назад
The gamer side of me really does not want to see grass swaying in the wind, I want to look at my opponents from a distance, before they see me. Take that shot and move on. The only time I would like to see a shadow is when an opponent is on the roof and casts a shadow on the ground outside. Higher HZ, not the p's or K's is my thing.
@WTBMrGrey
@WTBMrGrey Год назад
Nvidia is charging top dollar for their products, pushing DLS,RTX,A.I,reflex etc but skimping on Vram. The RX 470 came with 8gb vram and how old is that now?
@Kryptic1046
@Kryptic1046 Год назад
It's a pretty counterintuitive thing Nvidia is pushing. On the one hand, they really want to sell you resource-intensive features like raytracing/path tracing but then they don't want to give you enough VRAM in the mid-range to actually use it along with decent textures. DLSS can only do so much. In the near future, you'll probably have to choose between either having high textures with RT off or lower textures with RT on. You simply won't get to do both due to VRAM constraints in newer games.
@NostalgicMem0ries
@NostalgicMem0ries Год назад
wanna compare 3060 3070 performance vs rx470?
@WTBMrGrey
@WTBMrGrey Год назад
@@NostalgicMem0ries well obviously the 3060/3070 are a lot more powerfull. There is the point though. The 3070 is a decent 1440 GPU, but it only has 8gb vram which is pathetic. Even the 3060 has 12gb.
@mikeymaiku
@mikeymaiku Год назад
@@WTBMrGrey i guess you dont understand "why" it had 12gb
@mr.ihabissa8442
@mr.ihabissa8442 Год назад
​@@WTBMrGrey 3060 has 12GB Vram beacuse of it's 128bit bus ,they can do either 6 or 12 not 8 .. Regardless the 3060 is a shit card even on 1080p vs 3060ti/ 3070.
@RFLCPTR
@RFLCPTR Год назад
VRAM usage and how much is reserved by the game adjusts to the amount of VRAM present on your GPU. You would notice that when testing with an actual 4 GB VRAM card, instead of using a 24 GB VRAM card...
@faxxywaxxy
@faxxywaxxy 10 месяцев назад
my acer nitro sense 5 has 12gb of ram but star field is 16 ram ... can i atleast play it ?
@destrike702
@destrike702 Год назад
Currently have a RX 6600 8gb, can run the games on mid to high and quite happy on the performance.
@SlowHardware
@SlowHardware Год назад
I just sold my 6600 xt and bought a radeon vii, similar performance just 16gb vram. I did it because I got the radeon vii for $250 nzd 😅 sold the 6600 xt for $350 nzd
@destrike702
@destrike702 Год назад
@@SlowHardware I hope i can find that on the same price, in my country, even the 2nd hand gpu market is overpriced.
@SlowHardware
@SlowHardware Год назад
@destrike702 oh for sure it's usually way over priced for what it is. I just started bidding and got a good deal :) I'd just save a search on a couple sites and check occasionally and one may show up cheap :)
@evilleader1991
@evilleader1991 Год назад
@@SlowHardware What about power draw
@SlowHardware
@SlowHardware Год назад
@@evilleader1991 I have a 1000w psu it's fine
@fVNzO
@fVNzO Год назад
I think it's important to note that these figures are indicative of what game developers *tolerate* in order to make their games work on popular graphics cards. Had the average been higher (Nvidia spending 20 bucks more on their GPU's etc.) games would indeed look better, grander or load quicker - and more vram would be needed. The mere second the average VRAM count goes up to 16+ game devs will just eat it up as they can finally make more intricate game environments. So, as data points these are fun benchmarks to run but they are ultimately a product of the limitations involved with producing software for customers who have been well frankly scammed for the past 6 years with no discernible increase in graphics VRAM in the mid range.
@TheAkashicTraveller
@TheAkashicTraveller Год назад
Except's not what we're seeing. They're clearly targeting the consoles and ngreedia just isn't keeping up.
@soniofficial6017
@soniofficial6017 Год назад
​@@TheAkashicTraveller 😂😂😂
@arenzricodexd4409
@arenzricodexd4409 Год назад
@@TheAkashicTraveller the issue on PC is those ultra (plus RT) what makes people think 8GB and 12GB no longer enough because console have 16GB. but console most often only use what equals to medium on PC. in reality developer on console most likely did not use most of the memory on VRAM like many people think it was. game like returnal for example the developer want 120FPS. so they actually render the game at 1080p and then using their own upscaled tech to upscaled the final image result to 4k. game like spiderman the ray tracing reflection are rendered at 1080p instead of 4k. and in some comparison the reflection on water puddle are being completely disabled on console.
@fVNzO
@fVNzO Год назад
@@TheAkashicTraveller This is exactly what we are seeing. Consoles are proving my arguments. The PS5 spends 16B of shared memory with a basically infinitely large cache behind it right - it's the cheapest way to just give developers more VRAM. And they've been completely hamstrung on desktop since nvidia just decided to stop at pascal. This video is showing you that usage is around 12GB max for a lot of games, that is telling you precisely that developers are completely stuck and they have little ability to give us better games when there is no standard way to cache more data like on consoles.
@LeegallyBliindLOL
@LeegallyBliindLOL Год назад
What is up with all these comments claiming Apples to apples comparisons with consoles. A) the PS5 for example uses GDDR6. B) Quite a bit of it is reserved.
@miksedk
@miksedk Год назад
Have you tried playing around with the anisotropic settings in 1080p? sometimes the games actually plays better with the higher settings in anisotropic settings. And also playing around with FXAA settings
@jimbodee4043
@jimbodee4043 Год назад
I was thinking the 1080p might be triple buffering whilst the 4k may be only double buffering thus accounting for the smaller difference in vram usage.
@MinosML
@MinosML Год назад
Bryan yet again listening to what the community is preoccupied with and giving us a banger video with tons of useful info! Hope to see more vids on this subject so people buying GPUs at this point in time are aware of the compromises they'll have to make with lower amounts of VRAM. Devs are definitely focusing more and more on the current Consoles/4K and it shows. Thank again for the quality content!
@Willbme4EVA
@Willbme4EVA Год назад
Totally agree, more, more, more. Just one vid can not hit the full spectrum. But he does try to jam allot in this vid. Tech Yes City Series pls
@nukedathlonman
@nukedathlonman Год назад
I was thinking most games would be optimized for 2K these days... Now I know it's only a snap shot, and the accuracy has been called into question numerous times, but Steam's hardware survey does indicate 1080 is the most common resolution and it's on a very slow decline, with the next large chuck being 1440 and growing very strong.
@nukedathlonman
@nukedathlonman Год назад
@El Cactuar No, 2K is 2560x1440
@nukedathlonman
@nukedathlonman Год назад
@El Cactuar 1080 is HD (or as some companies call it "FHD")
@nukedathlonman
@nukedathlonman Год назад
@El Cactuar Oh, you're going by cinema resolution for the 2K labeling. Monitor manufacturer's will use QHD or 2K to describe 2560x1440.
@nukedathlonman
@nukedathlonman Год назад
@El Cactuar No, that's HD... Or FHD if you're going by manufacturers since they insist on calling 720 "HD"
@SolelyAndriy
@SolelyAndriy Год назад
Interesting results 👍 I guess it would be useful to do those tests for 2K as well. The reason is that DLSS is quite good at upscaling and antialiasing from 2K to 4K so there is a sense to enable it, especially with a high refresh rate monitor.
@paleinho
@paleinho Год назад
the 4090 in Hogwarts 4K RT ON downscaling textures in the distance? @4:50
@trr4gfreddrtgf
@trr4gfreddrtgf Год назад
I don't think 12gbs is going to last long at all, I wouldn't be surprised if it runs out in 2 years or so. I think 16gbs is a much safer bet, most people want their GPUs to last 4-5 years and 16gbs should do that just fine.
@paranikumarlpk
@paranikumarlpk Год назад
Ye 16gb for 1440p and 20gb for 4k is fine for 2 to 3yrs ... I really hate the 308010g for these latest games .. it sucks even for 1440p but still ppl argue 8gb is enough for 1440p for 5more yrs lol and they mindlessly support the greedy nvidia.. they don't understand the quality of high Textures and its impact of vram .. how can they expect the top quality visuals to run on the potato gpus with 12gb or below vram
@bigturkey1
@bigturkey1 Год назад
12gbvram should last you until they start porting ps6 games
@pdmerritt
@pdmerritt Год назад
it doesn't matter all that much imho. If you're like me and coming from a 1070ti even the 4070 would be a huge uplift. If it only lasted for 2yrs because of vram issues...I could sell the card with 1yr of warranty on it (so for a decent price) and buy from the newer generation that would, hopefully, have a better price performance ratio than this disappointing generation.
@trr4gfreddrtgf
@trr4gfreddrtgf Год назад
@@pdmerritt True, I don't think warranties carry over if you sell them used though. Pretty sure it's for the original owner only, might be wrong though.
@pdmerritt
@pdmerritt Год назад
@@trr4gfreddrtgf how would anyone know? Even if I purchased with a credit card and my name is on the reciept...all the person would have to say is that it was a gift. I also don't have to register for the warranty.....the reciept will suffice.
@inmypaants
@inmypaants Год назад
Thanks for testing these on GPUs with sufficient VRAM Brian. People arguing that you don’t need more than 8 and using 8 to test don’t realise games dynamically scale down textures. It’s fine if you have 8 and don’t notice or mind, but don’t argue that 8 is enough for midrange new GPUs, it simply is not.
@adriancioroianu1704
@adriancioroianu1704 Год назад
It is if you adjust some video settings here and there. Because only on 4k you see 8+ required on low and mid-range is not targeted for 4k, people don't buy a 3070 to play at 4k, it's ridiculous. On the other side there are people (mainly rx 580 users) who try to convince new buyers that 12GB cards are trash and they should go for 16GB or more even on 1440p because they saw a 4k benchmark (max settings) where the Vram went over 12. Its funny and sad in the same time.
@FenrirAlter
@FenrirAlter Год назад
@@adriancioroianu1704 Yes, i m most definitely buying a 600$ GPU, so I can just barely scrap by in 1440p while playing on high settings with rt on.
@inmypaants
@inmypaants Год назад
@@adriancioroianu1704 Brian didn’t show 1440p, plenty of people buy 3070 class GPUs to game at 1440p. That resolution also runs into issues at 6GB and it won’t be long until 8GB is saturated too. I don’t think people should sell and buy bigger GPUs mind you, I just think people should be vocalising to these companies that 12 is the minimum for midrange and really 16 is the value Nvidia and AMD should offer to really entice buyers.
@2roly2
@2roly2 Год назад
I got a 6750XT paired with a 5900X along with 32GB of ram . I do believe my 6750XT is 12GB . Now will Smart Access Memory help ?
@Sky_ray
@Sky_ray Год назад
You can you do 1440P high->low + RT/DLSS?
@ruxandy
@ruxandy Год назад
Great video! I would say that 12 GB VRAM at 1440p should be more than enough for the foreseeable future. I mean, sure, in the next couple of years there will probably be a new game which might require more than that for the absolute Ultra settings (Ultra textures, in particular), but I don't think we'll see a game where 12 GB of VRAM is unusable for High details anytime soon (it might happen when the next-gen consoles come out, but that's still a long way from happening). I for one have played The Last of Us with Ultra details @ 1440p on a Ryzen 7 5800X3D + RTX 4070Ti, and the experience has been absolutely flawless (had no crashes, and no stutters -> 65+ FPS for the 1% lows). So if this game runs great (and, as we all know, this title is the 'best' example of poor optimization), then I am not worried at all for the next 2 - 3 years. Fun fact: I've actually also played TLoU (in its entirety) on my backup PC, with an RTX 2060 @ 1080p/High details, and the experience, while not flawless, it still wasn't bad at all and very playable (and I didn't experience any crashes with this card either - must've been very lucky). On the other hand, 8 GB of VRAM is a whole different discussion. There were multiple signs throughout the past few years that... yeah, 8 GB VRAM wasn't gonna cut it anymore (especially considering the fact that new consoles came with 16 GB of unified memory). It's unfortunate that a lot of people did not listen and, even worse, they ended up spending 800+ euros for RTX 3070s and other cards like these.
@itsmorbintime6833
@itsmorbintime6833 Год назад
Do you think 12gb vram will be enough for 1080p for a long time?
@RuslanDoman
@RuslanDoman 10 месяцев назад
@@itsmorbintime6833 ?
@RuslanDoman
@RuslanDoman 10 месяцев назад
?
@benjaminmaher8896
@benjaminmaher8896 Год назад
My 1070 is still pushing but not for long I feel like with the trend games are adopting of being wildly unoptimized
@jGRite
@jGRite Год назад
Since you play DOTA2v how do you like 7.33?
@techclub8528
@techclub8528 Год назад
I would like to see a comparison for when you hit that VRAM limit the difference in stutter/ performance between Sata SSD or NVME GEN 2 vs GEN 3 VS GEN 4 and how improving SSD speed performance could help reduce problems when the graphics card needs to go back to storage to load more assets.
@DJ_Dopamine
@DJ_Dopamine Год назад
Agreed. Also what magnitude of difference the main system memory configuration makes to all this. (Memory speed/number of channels/timings/etc.) Also, if putting Windows/OS on one SSD with the games on another SSD helps much. I do this myself, but I have never actually measured the difference.
@danieljayasiri7739
@danieljayasiri7739 Год назад
Upgraded my old 6600k platform but still holding onto my 1080ti from evga 😅 upgrading soon but this time I'll probably go amd and take up the Sam as I've upgraded to ryzen 7 anyway... ❤ for all the 1080ti users still holding on
@firexz4185
@firexz4185 Год назад
I believe you shouldn't upgrade your gpu now cuz this 40 gen or 7000 Gen are horrible
@blinksone2768
@blinksone2768 Год назад
@@firexz4185 7000 is decent.
@Willbme4EVA
@Willbme4EVA Год назад
1080 ti is no joke, its really hard to upgrade without selling a great, great, grandchild at a -10% loss.
@danieljayasiri7739
@danieljayasiri7739 Год назад
@Willbme4EVA it's even harder when you plan on passing it down to your kid and you need to part with >1.5k nzd for your next gpu 😆
@firexz4185
@firexz4185 Год назад
@@blinksone2768 for the price they are not worth it
@TheSleppy
@TheSleppy Год назад
I think if this type of test was done for longer example 10 minutes of play vs 30 minutes etc, the VRAM utilization would be even higher. Sometimes a 5 minute benchmark doesn't give the whole story. I agree overall that 12GB is the new minimum, great video.
@mnemonic8757
@mnemonic8757 Год назад
Exactly. I don't think everyone has time for such tests. I remember the FFXV open world game. With the best textures, it ran smoothly right after turning on, but when you drove around the land and visited a city, the vram got clogged. I think this problem can occur in other titles, especially with open world, where you have to load a huge amount of data.
@greggmacdonald9644
@greggmacdonald9644 Год назад
Hardware Unboxed addressed this recently, their vid about this is well worth watching. I agree that 12GB is a new minimum, and I'd also say that 16GB is better.
@RicochetForce
@RicochetForce Год назад
@@greggmacdonald9644 Yeah, I'd say 16GB is what mid range cards should have. 8GB VRAM is stone dead and 16GB of system RAM is much the same.
@AWAONE07
@AWAONE07 Год назад
Thanks for your video, it really helps me!! Greetings from Argentina, hoping to see more content.
@Andrew_Johnson_2973
@Andrew_Johnson_2973 Год назад
Could all these issues be caused by having Resizable BAR enabled?
@necuz
@necuz Год назад
Much better VRAM management than what these games are doing is possible on more recent hardware, but PC is lagging behind the upgrade cycle a lot for obvious reasons. Using a DX12 Ultimate feature called Sampler Feedback you can figure out which parts of which textures need to be loaded at what quality in order to render a scene, this would massively cut down on VRAM usage especially in open world games. That could further be combined with DirectStorage 1.1 to quickly load textures on demand. The kicker is you then need to set your minimum requirements to RDNA2 or Turing, since that was when support for these was introduced. That would be a bold move, but I do wonder how many of the people still rocking their old 1060 are actually buying $70 AAA releases in 2023?
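To make that concrete, here is a rough sketch, purely my own illustration and not code from any shipping game, of how a PC title could gate its streaming path on the two features mentioned above: DX12 Ultimate Sampler Feedback (queried through CheckFeatureSupport) and the presence of the DirectStorage runtime (by creating its factory). Error handling is stripped down and the fallback message is hypothetical.

```cpp
// Sketch: detect Sampler Feedback + DirectStorage before enabling a leaner
// texture-streaming path. Needs the Windows 10 SDK (19041+) and the
// DirectStorage SDK; link d3d12.lib and dstorage.lib.
#include <d3d12.h>
#include <dstorage.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    // 1) D3D12 device on the default adapter.
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No D3D12 device - keep the classic streaming path");
        return 0;
    }

    // 2) Sampler Feedback support is reported in the OPTIONS7 feature block
    //    (Turing / RDNA2 and newer report a tier above NOT_SUPPORTED).
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7{};
    const bool samplerFeedback =
        SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                              &opts7, sizeof(opts7))) &&
        opts7.SamplerFeedbackTier != D3D12_SAMPLER_FEEDBACK_TIER_NOT_SUPPORTED;

    // 3) DirectStorage: creating the factory confirms the runtime is present;
    //    decompression falls back to the CPU where the GPU path isn't available.
    ComPtr<IDStorageFactory> dsFactory;
    const bool directStorage =
        SUCCEEDED(DStorageGetFactory(IID_PPV_ARGS(&dsFactory)));

    std::printf("Sampler Feedback: %s, DirectStorage: %s\n",
                samplerFeedback ? "yes" : "no",
                directStorage ? "yes" : "no");
    // A game could enable the on-demand texture loading path only when both
    // are true, and ship the current conservative path otherwise.
    return 0;
}
```

This is also why the minimum-spec question above matters: the Sampler Feedback check comes back unsupported on pre-Turing and pre-RDNA2 cards, so a developer has to decide whether those users get a heavier fallback path or simply fall below minimum spec.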
@arenzricodexd4409
@arenzricodexd4409 Год назад
AFAIK RDNA 1 did not even support any of the DX12 Ultimate features.
@necuz
@necuz Год назад
@@arenzricodexd4409 Ack, you're right. For some reason I had the impression RDNA1 also had rudimentary support for this.
@MrMeanh
@MrMeanh Год назад
The issue is that DirectStorage 1.1 will increase the load on the GPU (asset decompression is done by the GPU, etc.), and this will 100% reduce the available compute for rendering the game. From what I've heard it's at least a 10-20% performance hit if you want to use DS + Sampler Feedback while rendering. All of this means I'm sceptical of Sampler Feedback being a good solution for reducing VRAM usage at the moment.
@necuz
@necuz Год назад
@@MrMeanh It certainly isn't going to be free; however, 20% sounds more like the situation of trying to run TLoU at Ultra on an 8 GB card where you're constantly running out of memory. So the question becomes: would you rather have smooth gameplay at 20% lower fps, or the current stuttery mess? Additionally, almost all of the usual suspects among released games that have triggered this debate struggle to be GPU bound, so in many cases it might actually end up being essentially free...
@tomtomkowski7653
@tomtomkowski7653 Год назад
We have had 8GB for so long that I would say it is obsolete. I mean, would you buy a new $400 GPU like the 4060 Ti and already be forced to lower settings at 1080p? 12GB will very soon be the standard for 1080p gaming, and if you want 1440p with ray tracing then you should have 16GB, which should really be the standard already. 12GB should be the standard for sub-$500 GPUs, 16GB for GPUs above $500, and 8GB should be left to entry-level cards under $150.
@klanas40
@klanas40 Год назад
It should be, but that doesn't mean it will happen soon.
@brettlawrence9015
@brettlawrence9015 Год назад
Yeah, buying a brand new GPU at current prices and having to reduce settings is a joke. At 4K I could understand it, but not at 1080p or 1440p.
@Willbme4EVA
@Willbme4EVA Год назад
If we are making requests to GPU makers, that says a lot. They do not seem to be listening. But if they are? Give me a supplemental plug-and-play alternative for VRAM. Preferably a slot stuffer for Xmas.
@bigturkey1
@bigturkey1 Год назад
12GB of VRAM should last you until they start porting PS6 games
@brettlawrence9015
@brettlawrence9015 Год назад
@@bigturkey1 Depends; if you want Ultra settings, then no. 12GB will be for medium to high settings.
@mahnkemachine3281
@mahnkemachine3281 Год назад
Would have been nice to see 1080p High for comparison. I mean, at this point in time, with this setup, who is going to play at 1080p Low settings? RT @ 1080p Ultra would also have been nice.
@therealomartron
@therealomartron 8 месяцев назад
Do you think the 4060 Ti 16GB would be worth it?
@SKHYJINX
@SKHYJINX Год назад
Wouldn't it be more informative to use different generations and compare baselines, since different architectures buffer more than others in various engines? A 4090 on low textures still seems to buffer more than, say, a 6GB 980 Ti at the same settings... so differing architectures might hold revelations, whereas using only flagship GPUs might skew the results with their huge caching buffers. I hope for another round using 8GB GPUs, not the flagships, where game engines might buffer more just because they see a higher-powered GPU device ID.
@ABaumstumpf
@ABaumstumpf Год назад
With VRAM it also heavily depends on the engine and the game itself how they treat it. Some engines by default just try to keep more stuff in memory if more is available; some games (Hogwarts, The Last of Us) just waste memory left, right and center; some games dynamically adjust LoD to stay within memory. Hogwarts can be run on a GTX 970 at medium settings, 1080p - just not with how the game is delivered. They really need to fix that crap when even the community has already fixed it with mods.
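That "dynamically adjust LoD to stay within memory" behaviour boils down to a budget-versus-importance heuristic. The toy sketch below is my own simplified illustration of the idea, not any particular engine's code: the struct fields, the 4-mip cap and the 4/3 mip-chain approximation are all assumptions.

```cpp
// Toy illustration of budget-driven texture LoD: when resident texture memory
// approaches the VRAM budget, the least important textures give up their top
// mip levels first instead of letting allocations spill into system RAM.
#include <algorithm>
#include <cstdint>
#include <cstdio>
#include <vector>

struct StreamedTexture {
    uint64_t bytesPerMip0;  // size of the full-resolution mip
    float    importance;    // e.g. screen coverage weighted by distance
    int      droppedMips;   // 0 = full res, 1 = half res per axis, ...
};

// Each dropped mip roughly quarters the top mip's memory; the rest of the
// mip chain adds about a third on top of it.
uint64_t residentBytes(const StreamedTexture& t) {
    uint64_t top = t.bytesPerMip0;
    for (int i = 0; i < t.droppedMips; ++i) top /= 4;
    return top + top / 3;
}

void fitToBudget(std::vector<StreamedTexture>& textures, uint64_t budgetBytes) {
    // Least important textures lose resolution first.
    std::sort(textures.begin(), textures.end(),
              [](const StreamedTexture& a, const StreamedTexture& b) {
                  return a.importance < b.importance;
              });
    auto total = [&] {
        uint64_t sum = 0;
        for (const auto& t : textures) sum += residentBytes(t);
        return sum;
    };
    for (auto& t : textures) {
        while (total() > budgetBytes && t.droppedMips < 4) ++t.droppedMips;
        if (total() <= budgetBytes) break;
    }
}

int main() {
    std::vector<StreamedTexture> scene = {
        {256ull << 20, 0.9f, 0},  // hero character, keep sharp
        {256ull << 20, 0.1f, 0},  // distant background prop
        {128ull << 20, 0.5f, 0},  // mid-distance scenery
    };
    fitToBudget(scene, 512ull << 20);  // pretend only 512 MiB is spare
    for (const auto& t : scene)
        std::printf("importance %.1f -> dropped %d mip(s)\n",
                    t.importance, t.droppedMips);
    return 0;
}
```

A real engine drives this per material from GPU feedback rather than one sorted list, but the trade-off is the same: stay under the budget by spending resolution where the player is least likely to notice.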
@winebartender6653
@winebartender6653 Год назад
"Fixed with mods" lmfao, no. The only thing any of those mods that even "worked" did was adjust the culling distance, which made texture pop-in and texture resolution hilariously bad. Let's also stop pretending that they are "hogs" when they are developed with a 12GB VRAM buffer for consoles. And the reality of the situation is that you shouldn't be limited by VRAM; you should be limited by the chip's performance, and then adjust settings to fit your fps needs. You shouldn't need to crank down settings because your VRAM buffer is too small.
@ABaumstumpf
@ABaumstumpf Год назад
@@winebartender6653 ""Fixed with mods" lmfao, no. The only thing any of those mods that even "worked" did was adjust the culling distance, which made texture pop-in and texture resolution hilariously bad." Nah, people have shown that the game keeps full-resolution textures loaded for background objects that are only rendered at the lowest LoD. "Let's also stop pretending that they are "hogs" when they are developed with a 12GB VRAM buffer for consoles." Aka "we know it runs with 12 GB, so just cram in everything even if that degrades performance on all hardware, because it still runs OK-ish". "You shouldn't need to crank down settings because your VRAM buffer is too small." And you shouldn't need to crank down settings because a tree 760 meters away, rendered at the lowest LoD, is using more VRAM than an NPC standing right next to your character.
@MozartificeR
@MozartificeR Год назад
Can you do 1440p? I wonder if there are any surprises there at 12gig, or 8gig?
@denvera1g1
@denvera1g1 Год назад
What I find weird is that at 4K High you'd expect the VRAM usage to be more than 4x that of 1080p Low, because at bare minimum 4K is 4x the pixels of 1080p; so either only some of those textures are 4K, or they aren't taking full advantage of the 4K resolution. Edit: in the old days, 4K settings weren't really 4K, so instead of 4x the VRAM and 1/4 the performance it was usually only 2x the VRAM and 1/2 the performance. My best example is my 4K60 TV and GTX 1080: dropping down to 1080p120 and running it at 60 or 120 did not quadruple the FPS, and did not cut the RAM usage as much as I thought it would in games like Battlefield 4.
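The back-of-the-envelope math behind that: only the resolution-dependent buffers (G-buffer, HDR target, depth and so on) scale with pixel count, while the texture pool stays the same size whatever the output resolution. The sketch below uses made-up but plausible byte counts, not measurements from any of the games in the video.

```cpp
// Back-of-the-envelope: which parts of VRAM actually scale with resolution.
// The 24 bytes/pixel of render targets and the 6 GiB texture pool are
// illustrative guesses, not measured values.
#include <cstdio>

int main() {
    const double bytesPerPixel = 4    // albedo
                               + 8    // normals + roughness/metalness
                               + 8    // HDR lighting target
                               + 4;   // depth/stencil
    const double px1080 = 1920.0 * 1080.0;
    const double px4k   = 3840.0 * 2160.0;       // exactly 4x the pixels
    const double MiB    = 1024.0 * 1024.0;
    const double texturePoolMiB = 6.0 * 1024.0;  // resolution-independent

    const double rt1080 = px1080 * bytesPerPixel / MiB;
    const double rt4k   = px4k   * bytesPerPixel / MiB;

    std::printf("render targets @1080p: %5.0f MiB, total ~%5.0f MiB\n",
                rt1080, rt1080 + texturePoolMiB);
    std::printf("render targets @4K:    %5.0f MiB, total ~%5.0f MiB\n",
                rt4k,   rt4k + texturePoolMiB);
    return 0;
}
```

Quadrupling the pixels does quadruple the render targets (roughly 47 MiB to 190 MiB with these numbers), but the fixed texture pool dominates, so total VRAM grows far less than 4x; higher texture settings, not the resolution itself, are what really blow the budget up.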
@Johnithinuioian
@Johnithinuioian Год назад
"Hey, if it looks like it could be 4k, then I'm pretty sure that the average person will take it as 4k!🤑" ... "So, I paid 4k worth of money, but it's not actually 4k?🤬"
@denvera1g1
@denvera1g1 Год назад
@@Johnithinuioian "I miss the point, game devs are going back to the old ways of dumbing down the graphics now that cards don't have enough VRAM, and I just didn't get the joke"