
Has the 12GB RTX 3080 been redeemed in 2024? 

Iceberg Tech · 74K subscribers
23K views
Published: 22 Oct 2024

Comments: 403
@Space_Reptile 19 hours ago
As a user of an 8GB 1070, I find 12GB on a much faster card an insult.
@ConnorH2111 19 hours ago
But it's still a much faster card 🤷🏻
@rem_0 19 hours ago
All of the 10-series cards were goated in general; no wonder everything after feels like it fell off.
@Xeonzs 19 hours ago
Nvidia is sadly very greedy with VRAM; they use it to push buyers into higher-tier cards if they want more VRAM. If they went the AMD route and just gave people 16GB on midrange cards already, I think it would really benefit Nvidia's reputation.
@JBrinx18 18 hours ago
I mean, you got a good card. The current-gen and last-gen low-end and midrange were woefully short.
@eliasroflchopper3006 18 hours ago
@Xeonzs Nvidia doesn't give a single fuck about their reputation if it makes them money.
@okwhateverlol1983 15 hours ago
The 3080 12GB was always the faster GPU, regardless of the VRAM.
@BonusCrook 18 hours ago
The fact that the 3080 20GB was canceled was disappointing.
@lamhkak47 17 hours ago
And the fact that the Chinese have yet again modded the 3080 to 20GB sure tells you something.
@Javier64691 17 hours ago
I find it funny that NVIDIA purposely designs them with just barely enough VRAM, so that when they panic, the design only allows for double the amount: 10GB (not enough) becomes 20GB (plenty), 8GB becomes 16GB, 6GB becomes 12GB.
@BonusCrook 16 hours ago
@Javier64691 Why do you find it funny? It's depressing :(
@lamhkak47 16 hours ago
@Javier64691 TBF, that's kinda how memory chips work: capacities double in powers of two, at least until the introduction of non-binary memory.
@randysalsman6992 13 hours ago
You can always get someone who knows how to solder to upgrade it: remove the 10x 1GB memory modules and solder on 10x 2GB modules to get 20GB of VRAM.
@ZeroHourProductions407 19 hours ago
Silly youtuber. The RTX 30 series doesn't exist. The crypto bros made sure of that.
@pozytywniezakrecony151 16 hours ago
C'mon, it's good value used in 2024. Got a 3090. Love it. Fallout 4 with 600 mods or so 😂
@ZeroHourProductions407 16 hours ago
@pozytywniezakrecony151 By sheer stroke of luck I scored _two_ Radeon VII cards for $200. They needed some TLC in the form of dusting and replacement fans, true (~$40 for two sets, just to be certain). I was more surprised that they still had their original warranty stickers intact, though not as much as by the still-impressive idle temps. Those same crypto bros still want $300 for a 3060, and unfortunately those listings crowd the marketplace and eBay algorithms like all those baity, gaspy videos clogged the RU-vid algorithm for much of 2021 and '22.
@forog1 16 hours ago
@pozytywniezakrecony151 That example is a CPU-heavy task, 600 mods. The GPU is just chilling in such an old game.
@pozytywniezakrecony151 16 hours ago
@forog1 Hmm, standing in the base it's 83% GPU and 28% i9 usage. Lots of mods with 4K textures, etc.
@antoinepersonnel6509 16 hours ago
Got my 3070 Founders for €300 in May 2023! It's even a revision 2, and it had seen no use; second-hand but like new :)
@distrusts 15 hours ago
Now do one about the RTX 2060 6GB and RTX 2060 12GB 😳
@GewelReal 12 hours ago
The 12GB one is a hidden gem: 90-95% of the performance of an RTX 3060 for 2/3 the price (and it's quite a bit younger than the normal 2060, which makes the dying-VRAM-chips issue virtually non-existent).
@AzalofForossa 11 hours ago
@GewelReal The extra VRAM also means it can use frame generation. Older cards with extra VRAM are still capable as long as you don't mind DLSS + frame generation.
@DEL8TE 6 hours ago
Seconded, this is needed.
@TheKazragore 6 hours ago
Also, the RTX 2060 12GB had more cores than the regular 2060.
@PaulRoneClarke 17 hours ago
Top-end 80-class cards should have had a minimum of 16GB from the 40 series onwards.
@Kenobi-vu9mb 8 hours ago
Nvidia is pulling an Intel, who ran quad cores for eons. How did that work out for them?
@PixelatedWolf2077 7 hours ago
@Kenobi-vu9mb Meh. It hurt Intel, but who it really screwed over is the consumer. A lot of games are still single-threaded, or threaded across maybe a couple of cores 😅
@eliasroflchopper3006 18 hours ago
How do you see the 10GB version outperform the 12GB version and not go: "Hmm... that's actually impossible, since the other card is better in every single metric"?
@fartcruncher98 16 hours ago
The 10GB EVGA card actually has higher base and boost clocks than the RTX 3080 12GB despite having fewer cores. It varies from game to game, but some engines prefer faster clock speeds over more cores.
@markbrettnell3503 16 hours ago
His problem is using the 7500F CPU. It can't keep up, that simple. That's why proper GPU bench testing is done with a 7800X3D.
@Dazzxp 15 hours ago
I'm thinking it's possibly down to the cooler on the 12GB card, or even the contact between the GPU and the cooler. I had that issue with my new Zotac 4070 Ti Super 16GB: the hotspot temp was high. Took the cooler off, applied Thermal Grizzly, and dropped the temps 12°C; not only that, but the fan speed was also lower. So cooler-running and quieter, win-win in my books. Still, it was a brand-new card; I shouldn't really have to do that.
@South_0f_Heaven_ 13 hours ago
@markbrettnell3503 Try a 12th-, 13th- or 14th-gen 700K or 900K. The stutter is from chiplets. Monolithic Intels are superior.
@canihelpyou6524 13 hours ago
@South_0f_Heaven_ Sources?
@kaylee42900 17 hours ago
Absolutely. With my 4070 Ti, which has 12GB, I regularly get it up to 11GB+ at 1440p with full RT applied. With VRAM you won't often notice changes in average fps, but you will notice stutters as it swaps between shared memory and GPU memory, or texture swaps in certain games (like Control).
@stangamer1151 13 hours ago
At 1440p/Ultra/RT/FG you do need 12GB. That is why the 4070/Ti has 12GB. Nvidia is smart: they gave players enough VRAM for their games, but not more than enough, so these so-called "midrange" GPUs can't be used for other VRAM-heavy tasks (AI modelling, etc.).
@baronvonslambert 6 hours ago
I'd just like to point out that Control specifically has known texture issues, particularly long texture load times, even with the semi-official ray-tracing patch/mod by one of the devs that lets you boost them by up to 150%. But it is also VRAM-hungry. I pulled it out and dusted it off for Spooky Month this year, and I couldn't run high textures at all, with or without ray tracing, on my 10GB card, just because it got close to the limit. Even dropping the resolution and other settings didn't help: it would load them in, but drop them to low after the first cutscene or loading screen, and it wouldn't even bother to load the textures for the shelter doors at all; they looked like a PS1 environment texture lol. Fortunately, Control's medium is most other games' high, especially with a little RT on top.
@tysopiccaso8711 4 hours ago
@stangamer1151 Idk how you and 2 other people concluded that Nvidia did this because they are smart.
@voteDC 14 hours ago
I have an RTX 3060 12GB; I'd quite happily swap it for the 10GB version of the 3080. VRAM isn't the be-all and end-all of what a graphics card can do; it's just one of many factors. More and more people are falling back into the mindset that a game isn't worth playing if you can't run it at, or very near, maximum settings. On the flip side, more game companies are making games for PC gamers that mostly don't exist, targeting the highest end and relying on upscaling and frame generation to fill in the gaps. The GTX 970 lasted me nearly a decade of use; I expect the 3060 will do the same. Don't be afraid of lowering the settings.
@thomaslayman9487 9 hours ago
People could learn a lot from using a lower-end rig for a period of time. I used a 12100F + RX 6600 after owning a 5800X3D + 3080 for a good while, and at some point you start to realise that games are totally playable when they're not totally maxed out!
@Kenobi-vu9mb 8 hours ago
"More and more people are falling back into the mindset that a game isn't worth playing if you can't run it at, or very near, maximum settings." If we were talking about budget cards, that opinion would hold weight. We're not. These are 80-class cards, not low-end.
@tysopiccaso8711 5 hours ago
@Kenobi-vu9mb A game looking good at max with the scaling ending there < a game looking good at high/medium and having the scaling go higher for future hardware.
@voteDC 59 minutes ago
@Kenobi-vu9mb The 3080 is now four years old and has been replaced as the latest and greatest, not only by the 12GB version but by a whole new generation of cards. Despite coming out two years later, that 12GB version is still four-year-old tech. Is it then a surprise that they can't run games at 4K High? Perhaps it's time to lower the settings a bit.
@cosmic_gate476 18 hours ago
They are doing the same thing with the 5070: it will launch with 12GB, and then an 18GB version will be released after the idiots... I mean, customers, spend $700 on a 12-gig card yet again.
@ivann7214 17 hours ago
Question: what game right now uses over 12GB of VRAM at 1080p-1440p? I have never seen anything over 10GB without DLSS. Hell, I still have an 8GB 3060 Ti, and I do struggle to max out some games, but since I only run 1440p, I really wonder what game will use more than 12 if I'm doing semi-OK with 8.
@takehirolol5962 17 hours ago
@ivann7214 Do you play only eSports? Do you know the RE Engine and some Unreal Engine 5 titles?
@cxngo8124 16 hours ago
@ivann7214 If you mod any game. Squad, Far Cry 6 with HD textures and RT, Avatar. The list is pretty small, but we are getting there.
@jibrilherrcherofhorny4448 15 hours ago
@ivann7214 Civ 6 can use over 14GB, or even fill 16GB.
@Dominik12335 15 hours ago
@ivann7214 Ratchet & Clank does, for example. Playing it at 1440p, native res, high settings, it sits at 11.3GB of VRAM for me. Though I use an RX 7700 XT, and AMD tends to utilize more VRAM than Nvidia for some reason; probably the way the card handles image rendering, since it's different from Nvidia's.
@slimshady1778 19 hours ago
As a guy who uses the greatest price-to-performance card of all time, the fifty-seven hundred XT, I feel attacked.
@eda2000-r8h 19 hours ago
Why did you not type "RX 5700 XT" 🤣?
@potatoes5829 18 hours ago
@eda2000-r8h I too run a fifty-seven hundred ex-tee.
@quinncamargo 17 hours ago
I too use an Are-Ex Fifty-seven Hundred Ex-Tee. 😂😂😂
@TheStreamingGamers 17 hours ago
I love my Arr Ex Five Seven Zero Zero Ex Tee Graphics Processing Unit from Advanced Micro Devices Incorporated.
@rangersmith4652 17 hours ago
@quinncamargo My home theater personal computer is equipped with an Ay Em Dee Ryzen Five Thirty-six Hundred Ex and an Ay Em Dee Ahr Ex Fifty-seven Hundred Ex Tee. But it has only sixteen gigabytes of dee-dee-ahr-four random access memory.
@myBacau 19 hours ago
Even if equal in performance, any of the 12GB versions found used on eBay should at least have less wear and tear from mining, since they were released later, when mining had become less profitable.
@snoweh1 17 hours ago
LHR (Low Hash Rate) versions of the 10GB exist, meaning versions not conducive to mining. Although you could unlock it, it was finicky.
@Fr33mx 14 hours ago
Wear and tear comes from poor conditions during usage and a lack of maintenance. Such electronics don't really have a "service life" in the usual sense, given proper care. So it heavily depends on how well the previous owner wanted the card to look and perform when he finally decided to get rid of it.
@Paulie8K 16 hours ago
3080 10GB owner here, and the card is still delivering for me at 1440p. I also split my play time between older games from before 2020 and newer games up to 2024. The first tier of games I can run maxed out at high refresh; for the newer ones I may have to kick on DLSS, but I'm totally fine with that. Still a great experience. I'll see next year if there's anything worth upgrading to, but I typically look for a 100% performance uplift to justify upgrading, and anything at that level will cost a kidney.
@Thex-W.I.T.C.H.-xMaster 16 hours ago
Thanks to my 3090, I'm not worried about VRAM.........
@cosmicusstardust3300 14 hours ago
Same here on my 7900 XTX.
@Fhwgads11 10 hours ago
I should have picked one up; the 40-series cards are all too big to fit in my case (ITX life).
@blakecasimir 18 hours ago
It's an Ngreedia-giving-us-scraps moment. They wanted you to buy the 3090 if you wanted more VRAM. Don't expect anything less than such tactics. YES, some of us DO need that VRAM: non-gaming workloads.
@Sp3cialk304 17 hours ago
There are always gives and takes with GPUs. AMD skimps on hardware too: they skimped on dedicated RT and AI cores, basic hardware that an RTX 2060 and an Arc A310 have. Of the past 12 games that TechPowerUp has done performance reviews on, only one uses over 11GB at 4K Ultra (it uses 12GB at 4K Ultra). Six of them have RT always on. Eight run noticeably better on Nvidia GPUs than on AMD counterparts that are usually equal, like the 4070S vs the GRE; in two they tie, and two slightly favor AMD. All had DLSS Quality looking better than "native" TAA. Both companies skimp on something. Just hope what your card skimped on isn't what modern games are being developed around.
@Fhwgads11 10 hours ago
I use a 5K monitor and could definitely use another 4GB of VRAM (3080 12GB owner). I bought it over the 10GB version back in the day because I figured I'd need the extra VRAM, and I'd already like more. There are games like Ratchet & Clank that run fine at 5K with medium textures, but if I turn it up to high, I max out the VRAM.
@kravenfoxbodies2479 16 hours ago
The only company that has shown us an affordable 16GB video card has been Intel; I've seen the new A770 for $269 on Newegg.
@ZERARCHIVE2023 11 hours ago
DLSS is just a way to cope with the lack of optimization in PC games nowadays. Jeez.
@PixelShade 19 hours ago
In PCVR, 16GB is the only way to go, especially when you are trying to hit native output on high-res headsets. I have a Reverb G2, which basically requires 6000x3000 to get proper 1:1 output... Even when games use dynamic or fixed resolution scaling, I often hit 13.5GB or above. And it's so nice not to have any stutters in VR. It's a shame high-performing GPUs like these have only 10 or 12GB.
@koolkidgamr6260 14 hours ago
Was about to say that. PCVR ATM is really hard to get into. I got an A770 LE just for the VRAM/cost years ago, thinking it was temporary. It's 2 years later and there are STILL barely any >12GB cards under $600 or so, especially Nvidia cards (NVENC for VR). Waiting on RX 8000.
@__-fi6xg 2 hours ago
@koolkidgamr6260 AMD is a good alternative. I can play Cyberpunk in VR with a 6800 XT on my Quest 2, with headtracking and all, and I'd recommend the 7800 XT for the improved ray tracing, or wait for the 8000 series. Nvidia is stagnating right now because of the lack of competition over the last 2 generations.
@Fantomas24ARM 19 hours ago
I'm happy with my 10G version.
@puffyips 19 hours ago
But it literally could use 16GB like the 6800 XT, and yes, I've seen that much used. Rust, from 2013, uses more than 12GB, not even on Ultra.
@DragonOfTheMortalKombat 18 hours ago
@puffyips Use or allocate? Check carefully.
@josemendezfr 18 hours ago
@DragonOfTheMortalKombat This. Most people freak out about allocated memory because it can max out the buffer, but it's really the utilization that matters.
@snoweh1 17 hours ago
@puffyips This is what happens when you watch numbers you don't understand instead of playing the game. Your brain starts to rot and you hallucinate a story. Many such cases!
@rangersmith4652 17 hours ago
@snoweh1 Yes, many gamers spend way too much time looking at performance numbers and far too little just enjoying games. That said, my 16GB 6800 XT never causes textures to randomly disappear or characters to look like something out of Minecraft. I just enjoy the game maxed out and don't ever worry about such things.
@mrN3CR0 17 hours ago
When I used to work for CeX, I upgraded from my 8GB RTX 3070 to a 12GB 3080 just because the former was struggling at 4K Ultra due to its limited VRAM... Funnily enough, I'm now finding certain titles where I have to drop to 1440p at Ultra due to 12GB not being enough, LOL. Performance-wise, though, it definitely does hold up.
@randysalsman6992 13 hours ago
If you're gaming on a 4K display, it would probably look better if you gamed at 1080p instead of 1440p because of integer scaling: 1080p divides into 4K evenly (each rendered pixel maps to exactly 4 screen pixels), whereas 1440p does not.
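The integer-scaling claim above is easy to check arithmetically; a minimal sketch, assuming the standard 16:9 pixel counts (the resolutions are common knowledge, not taken from the video):

```python
# Check which render resolutions divide evenly into a 4K (3840x2160) panel.
# An integer scale factor means each rendered pixel maps to an exact block of
# screen pixels; a fractional factor forces blurrier interpolation.

def scale_factor(panel, render):
    """Return the (horizontal, vertical) linear scale factor to the panel."""
    return (panel[0] / render[0], panel[1] / render[1])

panel_4k = (3840, 2160)
for name, res in [("1080p", (1920, 1080)), ("1440p", (2560, 1440))]:
    sx, sy = scale_factor(panel_4k, res)
    is_integer = sx.is_integer() and sy.is_integer()
    print(f"{name}: {sx}x by {sy}x, integer scale: {is_integer}")
# 1080p: 2.0x by 2.0x, integer scale: True
# 1440p: 1.5x by 1.5x, integer scale: False
```

So 1080p maps 2x in each axis (4 panel pixels per rendered pixel), while 1440p lands on an awkward 1.5x.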
@lesmiserable4594 13 hours ago
I'm using an RTX 3080. It's not that bad. All you need to do is switch to FSR + frame generation. Sure, it's not as sharp as DLSS, but it helps boost the fps by almost double, especially in 4K. Thanks, Iceberg, for the detailed review.
@adriandabrowski4990 13 hours ago
The RTX 3080 12GB should have been the only 3080.
@redline589 8 hours ago
Why? It's just added cost for no benefit.
@adriandabrowski4990 8 hours ago
@redline589 I meant for the same price. The 3070 Ti also should have more than 8GB, but we know it's Nvidia's way of making their cards obsolete faster.
@recklesssquirel5962 15 hours ago
I snagged my FTW3 3080 12-gig right after the mining boom went bust, and I've been using it to play at 4K and run an LLM on my machine. I finally hit a game, Space Marine 2, where I had to take a few settings off Ultra to get a stable 60 fps; otherwise it was a solid 48. It's a beast of a card that I plan to keep for a few more years, and overclocking is still on the table to boot. I can easily get it to match 3080 Ti performance.
@chugent3361 14 hours ago
I just got my hands on a Vega 64, upgrading from an R9 Nano. 8 gigs is definitely still enough to play games made by devs who actually care about players.
@J4rring 15 hours ago
TBH, what I'm getting from this is that newer dev companies are being sloppier with optimization, using excess assets for minimal fidelity gains while hiding behind "building for the future" promises. While games like Black Myth: Wukong and supposedly Star Wars Outlaws are heralded as intense visual experiences, I find it difficult to argue why they are more demanding to run than a game like Cyberpunk 2077, which still has amazing visuals and beautiful ray tracing in an open-world environment, from a game that came out almost 4 years ago. On a 3080 10GB, Cyberpunk can run at max settings at 1080p close to 100 FPS without ray tracing, and still around 60 FPS with custom ray-tracing settings, depending on the rest of your hardware. While I would give Wukong a pass, as it was developed by a smaller company from China that's more accustomed to mobile games, Star Wars Outlaws came from one of the biggest development companies in the industry (despite its shortcomings). After a little digging around, it really just seems like dev companies are spreading their teams too thin across story, mechanics, and setting development, not investing enough in quality and optimization control, and trying to shove the responsibility off onto hardware advances. With Epic releasing Unreal 5.5 with much better shader optimizations, I'm hoping the industry moves toward third-party game engines rather than forcing devs to build virtually everything from the ground up. I am a bit worried that Epic has basically all the power in the premium game-engine space with these recent developments, but if Valve and Ubisoft can shift gears, I wouldn't be surprised if they rose from the dust as well.
@DarkReturns1 12 hours ago
You made the cards CPU-limited; you'd see more of a difference with a high-end CPU.
@Uthleber 10 hours ago
Hogwarts Legacy breaks the 10GB at 1440p.
@tysopiccaso8711 4 hours ago
Turn the textures down.
@Uthleber 4 hours ago
@tysopiccaso8711 I think a $3 trillion company can afford a few GB more VRAM for pennies.
@Uthleber 3 hours ago
@tysopiccaso8711 Just stfu and give people the VRAM they're paying for.
@craciunator99 18 hours ago
I'm perfectly happy with my 6800 XT compared to anything from Nvidia. I got it at an insane price, and frankly I don't want to support Nvidia at this point.
@michaelmcconnell7302 14 hours ago
Both were available when I got my 3080. I never thought the 2GB made the difference, but the bus width was more attractive... $100 difference between a Zotac 10GB and an MSI 12GB; it seemed like an easy choice in Oct 2022.
@DrearierSpider1 17 hours ago
Depends on what you mean by redeemed. I paid $630 for my RTX 3080 FE in January 2021 (I had a Best Buy discount), and it's served me very well for almost 4 years now. There's almost no chance I would have gotten a 12GB model anywhere near that price by the time it came out, and I haven't been limited by the 10GB since I mainly play Elden Ring. That said, there's almost no denying that you'd see a tangible benefit in many games from the extra 2GB at high resolutions and more demanding settings.
@yassertechtips5568 16 hours ago
The longer you play a game on the 10GB card, the worse the fps gets.
@randysalsman6992 13 hours ago
No, it doesn't. I think there's something wrong with your build if that's happening to you. I game on an RTX 3080 10GB and never have any trouble.
@yassertechtips5568 13 hours ago
@randysalsman6992 Try Hogwarts Legacy or God of War Ragnarök at Ultra settings; your fps will be unstable after a few minutes of gaming, or when you get to a new place in the map.
@OgoTheKeeper 12 hours ago
@randysalsman6992 There's nothing wrong with his build. It's called poor memory management in both the OS and the game. I can verify it, as I've also tested the 3080 10GB.
@Redald 12 hours ago
I don't have this problem either; it's 2-2, I guess.
@ps5professional 11 hours ago
The only game I've run into VRAM issues with on a 10GB 3080 was Ratchet & Clank: Rift Apart. Sometimes it runs just fine in 4K DLSS Quality with RT reflections, AO and high textures, while other times it struggles even without RT and on low textures. TLoU used to be problematic too, but ever since they patched it, it's all fine. The biggest surprise would be Silent Hill 2; this thing runs it as fast as a 7900 XT... crazy. My favorite GPU right after the 1080 Ti.
@mr.burner8797 2 hours ago
The significant difference in some games that don't even max out the 10GB of VRAM is due to the increase in bandwidth (320-bit vs 384-bit bus). The difference will only grow from here; in 3 years we'll easily see a 15-20% difference in lots of games at 1440p.
@anadventfollower1181 17 hours ago
Best use scenarios for these types of cards: gaming at 1440p, AI, 3D rendering for small or personal projects, testing, future-proofing for high-end media, bitmining and other misc.
@alexandroromio5976 19 hours ago
In addition to my gaming hobby, I'm a 3D modeler/renderer, and this summer I bought, at a bargain price, a build with an i9 9900K and an RTX 3090. While I know this is really dated hardware, I can't complain about anything: I can play everything. In some cases I use Lossless Scaling to generate frames, but in general the 24GB of VRAM lets me push post-processing and resources much further. In Blender and 3ds Max (my favorite 3D suites), the 24GB makes me feel like I'm breathing again after a long immersion in rendering... I'm coming from a 6GB 2060. I honestly think you don't need to buy the newest card; you should buy what works for your needs and purposes. If I needed a card only for gaming, I think I would have bought an RTX 4070 or a 3080.
@rangersmith4652 17 hours ago
This. Buy a tool that serves you in such a way that you never have to compromise your needs to accommodate its limitations.
@SyndicatesFollower 18 hours ago
I found a solid used 3080 12GB FTW3 for $370 last year, and with it being the highest-end SKU along with the Strix, and with EVGA gone (RIP), I knew I had to have it. It replaced a 3080 10GB EVGA XC3, which I sold to a friend for $300. The combined additions of VRAM, cooling capacity and power limit let this card outshine the XC3 10GB by ~10%, tune vs tune. Pairs very nicely with a 32" Odyssey G7 (1440p 240Hz).
@aephix73 13 hours ago
The fact is that the majority of people still play at 1080p. The jump to 4K is a huge markup in hardware. As for ray tracing, if you actually play the game, it's really not noticeable. Everyone still frames 4K with ray tracing and says "Ooooh, look how pretty," but if you actually play, it makes little to no difference. The push for 4K and ray tracing is a corporate agenda to make us spend more, when in reality our hardware is still enjoyable for the games we play.
@carlestrada 9 hours ago
My god, I remember owning a used 3090 FE for about a year, and seeing the power spike up to 400 watts in-game was crazy. Nvidia really did let these cards loose on the Samsung node. A few reasons I replaced mine with a 4080S FE: the GDDR6X on the backplate side of the FE 3090 was cooking itself to death even with the card undervolted and below 65°C (memory temps regularly reached 88-95°C), and the coil whine was so unbearable that it stopped me from using my studio monitors for gaming, because it leaked into the speakers even through a DAC. The 40-series cooler design blows my old FE away with how cool and quiet it is while sipping less power (200 watts on average at 1440p maxed, vs 400 watts in BG3). The fans are quieter because the memory isn't hitting 100°C in extended sessions, and the best part: no coil whine! Mine wasn't the best undervolter, but my V/F curve for the 3090 FE at the time was 1785MHz at 800mV, and it ran flawlessly until I got my hands on the 4080S FE. OC doesn't scale well with these cards unless you push into 450-watt-plus territory, so keeping the clocks locked at around 1800-1860MHz was the sweet spot between temps, power, and not hitting the power limit. That cuts out the most important thing plaguing the 3080/3090 series as a whole (unless you have a new vBIOS flashed): hitting the power limit. Hitting it means the card slows down to get back under it, which is felt in-game as a stutter as the card downclocks, i.e. the 3080 at 320W and the 3090 at 350W with the power slider at 100% in Afterburner. I distinctly remember the 30-series boosting algorithm: once the card hits 52°C, if there is power and cooling headroom, it adds another 15MHz to the boost clock as long as it doesn't hit 62-65°C. So if you set a V/F curve of 1800MHz at 800mV and the card hits 52°C, it will go to 1815MHz if it has power headroom.
@napowolf 15 hours ago
AC is developed on Ubisoft Anvil, if I'm not mistaken. Snowdrop is used by Massive.
@ChimobiHD 12 hours ago
This video is very timely. I just picked up two 3080 cards (10GB and 12GB) for $340 each. Going to pair them with a pair of 5800X CPUs I got for $150, replacing my computers with a 1700X and a 1080 Ti from 2017. I play mostly old games at 1080p, so no regrets here :)
@CarlosCamposvc92 9 hours ago
I love my 3080 12GB Gaming Z Trio; with adequate settings and DLSS it chugs through everything at 1440p and 4K. Gonna keep it till it dies.
@KevinGoldLVL 13 hours ago
Thank you for this. As an early adopter of the 3080 10GB, I'd been looking for this for a while. I'm still glad I didn't choose the 3070 back then, but I still think the 3080 deserved more VRAM.
@isuckatbattlefield 8 hours ago
I've been using the 3060 12GB for about a year now, and it still slaps in AAA games with 4K on, even though it's technically a 1080p card. I love it.
@nartboglin9046 16 hours ago
I got my 10GB like a week before they came out with the 12GB, and I was pissed.
@randysalsman6992 13 hours ago
Shouldn't have been in such a rush to upgrade.
@censoredialogue 13 hours ago
I love my 3080 12GB, and with my 5600X I haven't really had any issues outside of the Phantom Liberty update to CP2077 being a lot harder on my CPU. I feel like I'll be keeping this GPU for a few more years, but I'll probably upgrade to a 5700X3D for GTA VI.
@khaaaaaaaaaannn 18 hours ago
I love my 3080 12GB; it was a huge upgrade over my old GTX 1080.
@raydmari9837 11 hours ago
Love your vids! Been watching for months now and just wanted to say they're comfort food to me, as well as super interesting. Kudos.
@gerardw.7468 4 hours ago
Price. 10GB models were significantly cheaper than the newer 4070/70 Ti at the end of the crypto boom. Getting a used one in good condition at the beginning of 2023, like I did for $350-400, was an absolute steal. The only things you need to deal with are the higher power draw (though the 8-pin PCIe connectors mean it works with an older PSU) and the lack of DLSS 3 FG (which is redundant if the game supports FSR 3.1). The 3080 is definitely up there with the 1080 Ti as one of the GOAT GPUs.
@victorsegoviapalacios4710 14 hours ago
I have the EVGA FTW3 12GB version of the 3080; then I bought the 4090... but the 3080 is still in my collection. Fond memories.
@AvroBellow
@AvroBellow 16 часов назад
I remember when Jensen pulled these out of his oven and I laughed hysterically when I saw that the RTX 3080 only had 10GB of VRAM. Then Red October came and I grabbed an RX 6800 XT when I saw the performance and the 16GB of VRAM. You see, back in the day, during the first mining craze, cards like the GTX 1060 Ti and RX 580 were near $800CAD. Then something inexplicable happened... Out of nowhere, Newegg suddenly had a bunch of brand-new Sapphire R9 Furies. These cards were released two years prior and I was admittedly shocked because they were less than $400CAD. I was admittedly hesitant because I couldn't remember how the R9 Fury performed so I started reading reviews and discovered that the R9 Fury, despite being two years old, was faster than the RX 580. I quickly discovered that the R9 Fury was a monster when it was released, faster than the GTX 980. The card had so much GPU horsepower at the time that it could literally play anything at 1440p Ultra at 70+FPS. Unfortunately, ATi's experiment with HBM gave the R9 Fury an Achilles' heel, the same Achilles' heel that Ampere cards have. nVIDIA made the choice to use more expensive GDDR6X VRAM which meant that they had to give less of it on their GeForce cards to be even somewhat competitive with Radeon. nVIDIA also knew that most gamers aren't smart enough (or just too lazy) to actually research their purchases and just tend to buy nVIDIA by default. Admittedly, nVIDIA was 100% correct in their assessment so they didn't worry too much about it. Just like the aforementioned R9 Fury, having fewer GB of more expensive higher-speed VRAM instead of more GB of more economical VRAM that is slower was proven to be a mistake on the R9 Fury and will prove the same on Ampere cards. Some people like to talk about how "superior" GDDR6X is compared to GDDR6 but it just hasn't shown to make any real difference. If you want to talk about superior VRAM, HBM was in a league of its own with a colossal 4096-bit bus width. 
Compare that to the 384-bit bus width found on the RTX 4090 and RX 7900 XTX cards of today. I am willing to bet that if you were to take a pair of RX 580s and somehow graft the 16GB of GDDR5 that those two cards have onto something like an RTX 3070 Ti, those 16GB of GDDR5 would out-perform the 8GB of GDDR6X in modern titles and give the RTX 3070 Ti a new lease on life. Sure, the R9 Fury's HBM was impressive, especially when it could run Unigine Superposition at 4K Optimised despite a warning that it didn't have enough VRAM to run the test correctly. Unigine clearly hadn't considered that 4096MB of VRAM on a 4096-bit bus could do things that 4GB had no business being able to do, but despite this, HBM isn't magic and could MAYBE behave like 6GB of GDDR5 because of its incredible speed. This means that 8GB of GDDR5 was better than 4GB of HBM for gaming. The myth that a lot of GeForce owners fell for (and they do seem to fall for a lot of myths) is that GDDR6X is somehow going to make your GeForce cards superior to a Radeon that "only has the inferior GDDR6". I'm pretty sure that the truth is more like AMD probably bought some GDDR6X from Micron and sent it to ATi in Markham to play with. After considerable testing, ATi would have discovered that the difference in performance and efficiency between GDDR6 and GDDR6X was minimal at best and not worth the extra cost. ATi knows its market, and Radeon owners aren't dazzled by frills; we want maximum performance-per-dollar (which, really, ANY user should want). Micron is the exclusive manufacturer of GDDR6X (and probably GDDR7X) while standard GDDR6 is made by Micron, Samsung and SK Hynix. VRAM is a commodity and the more competition you have in the marketplace, the better the price will be. Since Micron has no competition for X-rated VRAM, their price remains high.
Since GeForce owners have no issue getting fleeced for useless frills, nVIDIA, also knowing their market like ATi does, chose to get more profit from the use of GDDR6X, and who can blame them? The proof is in the pudding, however, as the use of GDDR6X has not translated into any real performance advantage for GeForce cards over their Radeon rivals. Let's take a look at the rankings, shall we?
1st Place - GeForce RTX 4090 with GDDR6X
2nd Place - Radeon RX 7900 XTX with GDDR6
3rd Place - GeForce RTX 4080 Super with GDDR6X
4th Place - Radeon RX 7900 XT with GDDR6
5th Place - GeForce RTX 4070 Ti Super with GDDR6X
6th Place - Radeon RX 7900 GRE with GDDR6
7th Place - GeForce RTX 4070 Super with GDDR6X
8th Place - Radeon RX 7800 XT with GDDR6
9th Place - Radeon RX 7700 XT with GDDR6
10th Place - GeForce RTX 4060 Ti with GDDR6
We can see from this list that it's an almost perfectly alternating competition stack, going back and forth from place to place, with both red and green having five of the top ten. It's also interesting to note that while nVIDIA has the most performant card in the top ten with the RTX 4090, they also have the least performant card in the top ten with the RTX 4060 Ti. Note too that the RTX 4070 Super is faster than the RX 7800 XT. This matters because the RX 7800 XT is faster than the original RTX 4070, and while the RTX 4070 Super uses GDDR6X VRAM, so too did the RTX 4070. All of this just goes to show you that having fewer GB of faster X-rated VRAM doesn't translate into any real performance advantage, but having less of it can (and will) become a serious hindrance to what your card will be able to achieve in the future. People like to talk about bottlenecks and this is no different. My R9 Fury was held back by its lack of VRAM, and its incredible GPU horsepower (for the time) was relegated to high-FPS 1080p gaming far too soon.
I bought it because it was half the price of the RX 580 during the first mining craze (because it wasn't efficient enough to mine with) and so I could forgive myself for taking a 4GB card in 2017. After all, in 2015, when the R9 Fury came out, 6GB was considered to be high-end, similar to how 16GB is looked at today. However, I feel sorry for the dumb schmucks who bought the RTX 3070 Ti only to discover shortly after that they had only purchased an overpriced high-FPS 1080p card while those who bought the 3070 Ti's rival, the RX 6800, are still happily gaming away at 1440p.
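The bus-width argument in the comment above boils down to simple arithmetic: peak bandwidth is roughly bus width (in bytes) times the effective data rate. A minimal sketch, using commonly published illustrative figures (HBM1 at ~1 GT/s, the 3080 10GB's GDDR6X at 19 GT/s); actual cards vary by clocks and memory type:

```python
def peak_bandwidth_gbps(bus_width_bits: int, data_rate_gtps: float) -> float:
    """Rough peak memory bandwidth in GB/s: (bus width / 8 bits per byte) * transfers/s."""
    return bus_width_bits / 8 * data_rate_gtps

# R9 Fury: 4096-bit HBM1 at ~1 GT/s
print(peak_bandwidth_gbps(4096, 1.0))   # -> 512.0 GB/s
# RTX 3080 10GB: 320-bit GDDR6X at 19 GT/s
print(peak_bandwidth_gbps(320, 19.0))   # -> 760.0 GB/s
```

Which illustrates the commenter's point from the other direction: a huge bus can make a small, slow-clocked pool of VRAM very fast, but it doesn't make 4GB behave like 8GB once the working set no longer fits.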
@Bokille
@Bokille 14 часов назад
Brah 😲
@samcadwallader2899
@samcadwallader2899 17 часов назад
I count myself lucky to have bought a 3080 FE direct from NVIDIA at 650 quid MSRP at the height of the madness. The card is still a beast today, albeit only at 1440p. I'm guessing the same amount of money now isn't going to provide any sort of massive upgrade. I should add, I think it mined half its cost back in crypto.
@Danclements99
@Danclements99 18 часов назад
Any reason you don’t use a 7800x3d or anything of that class? How does the 7500f hold up for cpu bottle necks?
@MrIvonJr
@MrIvonJr 18 часов назад
7800x3d is 4x the price of the 7500f for only 25% more performance. 7500f is similar to 5800x3d
@IcebergTech
@IcebergTech 13 часов назад
So, I’ve been struggling with this for a while. When I started out, my schtick was that I tested old GPUs on cheap modern PC hardware. The original “reasonably priced gaming PC” got upgraded to the “moderately priced gaming PC” when I started reviewing better GPUs, and now I’m at the point where I’m kinda ready to test brand new GPUs, but I also don’t wanna let down the people who are used to me having a humble test rig. To answer your question, the 7500F holds up really well for the price. I’ll be reviewing the 7800X3D soon, and I’ll have the 7500F figures updated for comparison.
@GrumpyWolfTech
@GrumpyWolfTech 13 часов назад
@@MrIvonJr lmao, the 7500f is NOWHERE near the 5800x3d
@MrIvonJr
@MrIvonJr 11 часов назад
@@GrumpyWolfTech in gaming, yes it is. I'd recommend looking into benchmarks; you'd be surprised how much of a jump they made from AM4 to AM5
@Uthleber
@Uthleber 10 часов назад
The 7500F is close to the 5800X3D. Unless you have a 4080 or better, the 7500F is fine. The AM5 price/performance king.
@Revoku
@Revoku 10 часов назад
I bought a 6800 XT instead of a 3070 back then because I needed more VRAM. Today I'm running into games that use 16GB at 1440p on high settings (some CoD MWIII maps, for example, have so many textures that they fill it). Nvidia doesn't want cards that have the power to keep being usable. Lacking VRAM is a killer; no one wants to play with stutters at key moments
@solocamo3654
@solocamo3654 8 часов назад
If you mod textures at all in games that extra 2gb is gold. Textures are the biggest visual difference in most games, and I refuse to lower them.
@JamesSmith-sw3nk
@JamesSmith-sw3nk 18 часов назад
I paid $2300 Cdn for a new rtx 3080 10gb when crypto was hot. I only replaced it when it would run out of vram and the fps would drop like a rock on some player made arcade maps in Far Cry 5 at 4k. I then got a 3090 and the fps no longer tanked on those maps.
@Lauterec
@Lauterec 17 часов назад
I sold my 4090 for $300 more than I paid for it and used the $300 to get a local 3080 10GB. I game at 4K but I'm not playing any sort of demanding games, so the 3080 is more than adequate. Maybe I'll swing for the 5090 when that comes out.
@Thex-W.I.T.C.H.-xMaster
@Thex-W.I.T.C.H.-xMaster 16 часов назад
So you sold a 4090, for a 3080.... 😅... ok
@Uthleber
@Uthleber 10 часов назад
@@Thex-W.I.T.C.H.-xMaster and he is totally right depending on his needs. Buy what you need, not to impress random NPCs.
@m8x425
@m8x425 7 часов назад
Not everyone needs a blazing graphics card. The 3080 holds up well at 4K with older games. But..... good luck getting a 5090.
@Lauterec
@Lauterec 6 часов назад
@@m8x425 Hoping to snag one at MSRP with the waitlist thing at Best Buy. After that I bet it will be way over MSRP and unavailable for months.
@Danixik
@Danixik 12 часов назад
glad amd is not so greedy and actually gives us sub 500 16gb vram GPU (7800xt)
@burwood69
@burwood69 14 часов назад
I went from a 1080 Ti to a 3080 10GB; it was odd to go down in memory. I've only had the memory become an issue with Hogwarts Legacy at 1440p. It was too much and I had to scale down to 1080p. I'll be waiting for the 5000 or 6000 series; the 3080 is great, but not my favorite card.
@ArikGST
@ArikGST 13 часов назад
I am still running an RTX 3060 Ti 8GB, playing at 1440p, and I haven't played any game that really maxes it out (haven't touched RE4 yet, who knows). Most of the games I play tend to be either slightly older (still catching up on some big games now that they are discounted) or indie games, so I don't see myself needing a newer GPU for a couple more years tbh. Only thing..... I need a new shiny, so maybe there is a new GPU on the horizon just for that xD
@khairuddinali727
@khairuddinali727 18 часов назад
Now I'm confused why the RTX 3070/3070 Ti was so crippled by its 8GB of VRAM if the RTX 3080 works well with 10GB. Or are both RTX 3080 models equally crippled?
@classicgameover
@classicgameover 18 часов назад
no, both 3080 models are actually fine today, they're pretty much the card you need these days to 100% sprint games, versus the 3070 and 3070 Ti that are currently within the realms of 99% (at least for now)
@snoweh1
@snoweh1 17 часов назад
You're confused that a better card runs better? We're gonna have to get Columbo on the case, because this one truly is staggering.
@distrusts
@distrusts 15 часов назад
The 3070 is 30% slower than a 3080, and its having only 8GB of VRAM wouldn't help when it runs out.
@semibigbraingamer
@semibigbraingamer 18 часов назад
I recently picked up a mining 3080 ti on the used market for the equivalent of ~320£ or ~420usd. It runs very hot at 85°C but sooo worth it
@inspirerush
@inspirerush 15 часов назад
the gpu performance uplift must compensate for your cpu single core performance
@CompatibilityMadness
@CompatibilityMadness 13 часов назад
Try to get an 8GB 3070 Ti as the "is 8GB any good" data point. That way we'd know at what point 10GB helps A LOT; running out of VRAM simply feels different than the plain perf. difference between models would suggest. As for "too big for its own good": I like the Founders card sizes (even if you have to pad mod to get the G6X temps under control, along with a 0.8V undervolt on the core).
@rch5395
@rch5395 19 часов назад
I feel like more companies should "unlaunch" bad products like Concord.
@Madu-p2s
@Madu-p2s 10 часов назад
Got my 3080 10GB at launch and upgraded my screen just after. I didn't go 4K because the 3080 was never a true 4K GPU. Still got it paired with a 5800X3D; it's doing well, but I'll replace it next year...
@zbigniew2628
@zbigniew2628 17 часов назад
If you can, please test the RTX 3070 vs RTX 4060 Ti in 4K, with and without DLSS, in games where they can play at 30+ FPS or even close to 60. I would like to know how the newer one maintains performance in older games like RDR2. Test with optimized settings, like Hardware Unboxed or Digital Foundry settings.
@solocamo3654
@solocamo3654 8 часов назад
So glad I bought a 6900XT for only $100 more than a 6800XT and a like $2-$300 cheaper than the 10gb 3080's were going for during the mining craze.
@Smegmadonis
@Smegmadonis 10 часов назад
Would a 1080/Ti be powerful enough to emulate the PS4/XB1 generation of games? I'm looking to build an emulation machine as my first PC. I have a PS5, and honestly nothing that has come out/is coming out has interested me
@AshtonCoolman
@AshtonCoolman 16 часов назад
I was playing No Man's Sky in VR yesterday and it used 14GB of VRAM on both my 7900 GRE system and 4090 system. That game is relatively lightweight comparatively. 12GB just ain't getting the job done in 2024, but 12 will always be better than 10🤷‍♂️
@justhitreset858
@justhitreset858 13 часов назад
I've been saying VRAM isn't some massive concern and more often than not the core will be the limiting factor. I'm not saying you shouldn't get more for your money, but I'd rather have a GPU that's cheaper, uses less power, and is easier to cool than one with more VRAM than the core can really do anything with. Eventually this will change over time, when the next generation of consoles comes out and likely has more VRAM available, so games will be made to utilize more. But for now, VRAM really isn't much of an issue.
@DJ-fw7mi
@DJ-fw7mi 10 часов назад
20GB on my XFX 7900 XT Black Edition.... I'm good for a long time. Coming from the GOAT GTX 1080 Ti 11GB. FYI, the 5000 series will stiff you on everything; they have no competition. I will say the move from my 1080 Ti to a 7900 XT was easy. I am impressed with the drivers and features of AMD. You don't need FSR with Anti-Lag and Fluid Motion Frames. I turn that on with 250+ mods, ReShade, 4K texture packs, and ray tracing in Cyberpunk 2077 @ 4K optimized high settings. I get an easy 60+ fps with no lows below that, no ghosting, screen tearing, or stuttering.
@grimmpickins2559
@grimmpickins2559 5 часов назад
Weirdly modern niche for you, LOL. But I learned something tonight, thanks.
@GoldDmg
@GoldDmg 19 часов назад
Why does the 10 GB in the thumbnail seem to be off? Is that on purpose? Besides that, I really like the reflection and the thumbnail in general. Good job :)
@IcebergTech
@IcebergTech 18 часов назад
My first thumbnail idea sucked (it was a closeup of a 3080 FE heatsink with numbers 10 and 12 carved onto it. Wasn’t very legible) so I kinda threw this together from an image I made for the video. I only had three hours sleep, so I’m blaming that 🥱
@GoldDmg
@GoldDmg 18 часов назад
@@IcebergTech Fair, get some sleep :)
@colossusrageblack
@colossusrageblack 12 часов назад
I use my 3080 12GB for a living room PC that just plays games at 4K. Still going strong, but thank Huang for DLSS.
@jonpeley
@jonpeley 7 часов назад
This was the second (then third) most powerful GPU of the past generation. Now it has the same amount of memory as, or less than, the current doped-up mid-range 60-series models. Nvidia being Nvidia.
@JakeTheGhostYT
@JakeTheGhostYT 12 часов назад
Honestly, as an RTX 3080 owner, I would love a breakdown of how the 3080 compares to the RX 7900 XT and RX 7900 XTX. Those cards have great price points, but I would love to see how they actually compare to the 3080. Is it worth the upgrade?
@IcebergTech
@IcebergTech 10 часов назад
I'm currently gathering data for a "second gen RT" video. I haven't decided exactly which cards will feature in the final product yet, but so far I have benchmarks from the 3080 10 & 12GB, 3080 Ti, and 7900 XT.
@Uthleber
@Uthleber 10 часов назад
6800 XT owner here (which is 3080 performance); had a 7900 XTX for two weeks. Not really satisfied: too much power consumption, and I don't care whether I get e.g. 60 fps with the 6800 XT or 100 with the XTX.
@rangersmith4652
@rangersmith4652 17 часов назад
Nope. Love your videos, Iceberg, but in this case I disagree. The 10GB card was a mistake from jump street. If today you can find instances when 10GB is not enough, then it never was on a card that listed for $700 just four years ago. (Many of us made the point at the time.) If 10GB was _ever_ really appropriate for such a card, and the gap between the 10GB 3080 and the 12GB 3080Ti didn't need filling, why did Nvidia release a 12GB version? Answer: The 3080 should have had at least 12GB all along. The 3080Ti should have had 16GB. And why did the 3060, released a bit later, have 12GB, 2 more than the 3080 and 4 more than the 3070? Value is often neither progressive nor linear. There is a floor that must not be breached. Sometimes 10GB is not enough, but 12GB is. In those cases, the 10GB 3080 is valueless because it fails to meet the need--it's below the floor. If one has hard and fast criteria for a tool, one buys a tool that meets all those criteria. If he compromises his criteria to make the lesser tool suffice, then he has made a deliberate choice to lower his floor. I don't care a lick about ray tracing because the performance hit is always too high and the images are overblown and distracting. But I do like most of the graphics settings turned way up. So when prices returned to some semblance of sanity, I upgraded my 2080Ti to a 6800XT. With 16GB available, I never have to fiddle with settings to avoid overflowing the buffer. VRAM is just not a thing I ever have to worry about.
@Fr33mx
@Fr33mx 14 часов назад
I've read somewhere that the amount of VRAM is not just decided on a whim; there are certain limitations and restrictions when choosing the capacity. The 3060 having 12GB was, and I'm paraphrasing as best as I can, due to its memory bus being 192-bit and not 128-bit. That's also why the 3080 12GB had better specs overall, since the additional 2GB of VRAM required a wider bus.
@rangersmith4652
@rangersmith4652 13 часов назад
@@Fr33mx Yes, but all these GPUs are in development for years. There's plenty of time in that cycle to decide on how much VRAM a GPU is going to need for the performance level and to design the circuitry accordingly.
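The bus-width constraint the replies above describe can be sketched with simple arithmetic: each GDDR chip sits on a 32-bit channel, so capacity is (bus width / 32) chips times the per-chip density. Chip sizes below match the commonly reported configurations (2GB chips on the 3060, 1GB GDDR6X chips on the 3080s), used here for illustration:

```python
def vram_gb(bus_width_bits: int, gb_per_chip: int) -> int:
    """VRAM capacity: one memory chip per 32-bit channel, times chip density."""
    return (bus_width_bits // 32) * gb_per_chip

print(vram_gb(192, 2))  # RTX 3060: 6 chips x 2GB  -> 12
print(vram_gb(320, 1))  # RTX 3080 10GB: 10 chips x 1GB -> 10
print(vram_gb(384, 1))  # RTX 3080 12GB: 12 chips x 1GB -> 12
```

This is why capacities tend to come in fixed steps: with a given bus width, the only way up is doubling the chip density (or clamshelling two chips per channel), hence 10GB to 20GB, 8GB to 16GB, and so on.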
@genx156
@genx156 17 часов назад
I got the 3080TI 12G as at the time it was cheaper than trying to get a 3080 10G 🤔 Crazy 🤣
@geraldroof1481
@geraldroof1481 6 часов назад
Good Video, but I don’t agree with the conclusion. It would be crazy not to go for the 12gb version.
@filipshinigami7263
@filipshinigami7263 16 часов назад
You always drop BANGER videos bro!
@xgearheart8592
@xgearheart8592 6 часов назад
Nvidia is gimping older cards intentionally. Fsr3 on a 1080ti works amazing. Thanks Amd 👍
@thewacokidd06
@thewacokidd06 16 часов назад
I wonder how the 12gb 3080 would compare to the RX 6800xt in modern titles, they used to go back and forth in non-RT situations
@ooringe378
@ooringe378 15 часов назад
People like to say AMD fine wine; now it's AMD fine milk. In recent titles RDNA2 has been underperforming a tier or even two lower, even without hardware RT: Space Marine 2, God of War Ragnarök, Until Dawn, Silent Hill 2, etc...
@Jtretta
@Jtretta 15 часов назад
Looking back on generations prior to the 3080, it really ought to have had 16GB. The 680 had 2, the 780 had 3, the 980 had 4 and the 1080 had 8. You can also make the argument that the 2080 and the 4080 should have had 12GB and 20GB respectively. Edit: I am just looking at the trend of VRAM increases not the relative performance to VRAM or other factors as such.
@m8x425
@m8x425 7 часов назад
The GTX 780 was really part of the same generation as the 680 though. Nvidia's made so many strides with Kepler that they didn't feel the need to release the big boy GPU right away. Also, it doesn't help that their flagship Kepler GPU was not ready until the end of 2012.
@wolfwilkopter2231
@wolfwilkopter2231 18 часов назад
I have the RTX 3080 Aorus Master 12GB and I'm glad that I took this over a severely overpriced 10-gig 3080 to replace my aging GTX 1080 Aorus Rev. 2. Got it for 936 bucks, which is still quite a lot but pretty much the lowest price this model ever had, and the 12 gigs are *barely* enough for Cyberpunk with RT Ultra in QHD, but only unmodded; with extra stuff and better textures, the 12 gigs won't be enough anymore quickly. So my next card will be one with a minimum of 1, but I'd rather wait for the 24Gbit GDDR7 versions, so I can get 24GB.
@Gaphalor
@Gaphalor 17 часов назад
I paid 400 euros for my 7800 XT 16GB.
@wolfwilkopter2231
@wolfwilkopter2231 17 часов назад
@@Gaphalor Good for you! :) Though that was probably a special offer or "used, good as new", because right now they otherwise cost from 470 upwards. And the 7800 XT is a current-generation card, without the aftereffects of crypto and co, while my 3080 12-gig is a top-tier custom model, which pushes the price a good bit higher than necessary compared to a more basic model like the Gaming OC, which at times cost only 760 euros. So the comparison is quite lopsided. Fact is, at their optimal resolution of QHD both cards are roughly equally strong, but the 3080 has features the 7800 XT lacks, and in ray tracing the 3080 is still unbeaten and leaves the newer card behind with ease. :)
@ratlingzombie8705
@ratlingzombie8705 16 часов назад
Got my RTX 3080 12GB for €350 with 1.5 years of warranty still left. The guy had treated himself to an RTX 4070S shortly after buying it, and apparently an RTX 4070S Ti shortly after that 😂 he offered me the 4070S too (for considerably more money)
@Gaphalor
@Gaphalor 14 часов назад
@@wolfwilkopter2231 It was new, sealed in the original packaging, someone's mis-purchase that they let go cheap. Lucky me. On the topic of ray tracing: I don't use it, and the three games where you really see a difference I don't actually play that often. The only ray tracing that's worth anything is path tracing, and that'll probably take another 1-2 GPU generations before it goes mainstream.
@wolfwilkopter2231
@wolfwilkopter2231 14 часов назад
@@Gaphalor To each his own, and yes, you got lucky there. :) I use RT everywhere it's feasible without dying a slideshow death, and the difference can be dramatic at times. It also depends on how much time, effort and technique people still put into rasterization and whether everything fits together... Thinking of Jedi Survivor, RT was practically mandatory there if it was to look even halfway good, and The Witcher 3 also looks a lot better and more natural with RT, DD2 is really designed more around RT, etc. And with the DLSS and FSR 3.1 FG mod I can even play CP2077 in QHD with PT. Not ideal, but it works; besides, the 5000 series is coming soon, then it'll run properly anyway.^^
@Drewkungfoo
@Drewkungfoo 15 часов назад
I never counted it as part of the 30 series when I have the 3080Ti with 12GB
@h.barkas1571
@h.barkas1571 19 часов назад
Tired of feeling inadequate with my Aorus Master 3080 10GB I bought Asus TUF 4090.
@ConfederateBanshee
@ConfederateBanshee 18 часов назад
You're about to feel inadequate again, since the 5090 is about to release in January. Also, one day soon, people are going to be talking about that 4090 as you are now with the 3080. 😅 The 3080 is a beast that deserves respect.
@h.barkas1571
@h.barkas1571 15 часов назад
@@ConfederateBanshee I didn't mean any disrespect to the 3080, far from it. My Gigabyte Aorus is probably one of the best-specced cards around, with 3 HDMI and 3 DisplayPorts and a little LCD screen in the fan shroud to display system info or play a custom gif. In fact I feel bad selling the GPU, it's a bit like selling a family member 😉 but I got an offer for the 4090 I couldn't refuse (30% off the going price for a four-month-old card in mint condition). The drop of the 5090 coming January won't affect me much, as I'm using a 65" 120 Hz 4K TV as a monitor and wouldn't notice the benefit of a faster card.
@ConfederateBanshee
@ConfederateBanshee 15 часов назад
@@h.barkas1571 That is kinda a snag at 30% off. I hope you're having some wild gaming experiences with that beast of a card! 😉
@HarimeNuiChan
@HarimeNuiChan 19 часов назад
No, the RTX 3080 was pretty badly priced and hasn't aged well. For the 3070 Ti's price in Japan I bought my RX 6900 XT, and with today's drivers I outperform my sister's RTX 3090 in most games. And even for me 16GB is barely enough for my games, so 10 and 12GB is nothing
@nimaarg3066
@nimaarg3066 14 часов назад
AC Shadows uses Anvil, not Snowdrop.
@napowolf
@napowolf 15 часов назад
In terms of performance, VRAM really doesn't matter that much, as shown in the vid. Barring a few outliers, mainly super intensive new AAA games, GPUs are almost always limited by their rasterization power before VRAM becomes an issue. Even with RT, which requires DLSS to be playable, VRAM makes little difference after DLSS is applied. The real problem imo isn't that X amount of VRAM is not enough for gaming. It's always a few outlier games that hog memory at 1440p+, Ultra quality with RT, that call for more than 10-12GB, and not many can afford to play at those settings to begin with. However, NVIDIA gimping VRAM on more and more powerful cards while jacking up the price for their entire product stack is truly unacceptable. Lowering settings is fine, but I shouldn't have to compromise on settings with new hardware that's ever increasing in price. That's why I'm sitting on a Titan Xp; it runs everything I play just fine at 1080p. New games mostly suck nowadays anyway. I would only consider upgrading at the tail end of a generation when there's enough of a price drop.
@mrhappy8966
@mrhappy8966 8 часов назад
This is why in 2020, when I saw the RX 6800 16GB, I said yep, that's the one. Today I own it and it's perfect; I even do ultra RT on CP2077 with FSR 3 frame gen @ 1440p high settings.
@ooringe378
@ooringe378 15 часов назад
Framegen can also be a bit of a memory hog and probably would've allowed the 12GB model to stretch its legs more over the 10GB card with all the bells and whistles going.
@censoredialogue
@censoredialogue 12 часов назад
also i feel like something is off about these benchmarks. makes no sense that the 10 GB card is doing better when the 12 GB should perform between a 10 GB and a Ti
@laurenceoliveranderson9401
@laurenceoliveranderson9401 17 часов назад
Was looking at 3080s, decided to go with a 6800xt instead. I miss dlss but hopefully fsr4 will make up for it
@Uthleber
@Uthleber 10 часов назад
16GB, better temps, less power consumption; FSR Quality at 4K is fine
@TehOnionGod
@TehOnionGod 17 часов назад
Might as well just buy a new 7800 XT for around the same price.
@alumlovescake
@alumlovescake 13 часов назад
The 3080 at launch was the only good deal, but games these days struggle with only 10GB even at 1440p, and honestly sometimes with 12. Mid-range cards should really have 16GB minimum these days
@Ebilcake
@Ebilcake 13 часов назад
The 10GB 3080 is still fine today, DLSS will keep it going for a lot longer than NVIDIA probably intended. Also nice you can hack in FSR FG quite easily in many DLSS FG games and it works extremely well. So yes, despite the naysayers, it's held up just fine and was a great deal if you picked up one for retail.
@CelMaiIubitDintrePamanteni
@CelMaiIubitDintrePamanteni 13 часов назад
Please do rtx 2080 TI vs rtx 3070 to see if the extra vram helps
@Gr3gl_
@Gr3gl_ 13 часов назад
10 GB 3080 haters when I turn the textures down from 16k to a measly 8k and suddenly 10gb is enough
@WurstSammler
@WurstSammler 17 часов назад
I just recently upgraded from a 3060 ti to a 3080 10GB for like 350€. Made all the games I play playable in 1440p and High settings. Super happy with it.
@snoweh1
@snoweh1 17 часов назад
I've had people with 3060 12gb tell me it's better than 3080 10gb despite getting half the framerate, simply because it has more vram. Some people are very delusional.
@cxngo8124
@cxngo8124 16 часов назад
@@snoweh1 they are right in 2032, when the VRAM is not enough and one card gets 1 fps and the other 0.
@distrusts
@distrusts 15 часов назад
​@@cxngo8124 at that point the 3080/3060 ti won't get 30 fps at 1080p anyway so it won't matter.
@snoweh1
@snoweh1 5 часов назад
@@cxngo8124 I wonder what things will be like in 8 years. Games will look the same as they do now but run 4 times worse. Just like 8 years ago vs now.
@tourmaline07
@tourmaline07 16 часов назад
I wanted a 12GB 3080 a couple of years ago upgrading from a Vega 64 , but it was way too expensive and I had to settle for a 2080ti instead and clock the balls out of it.
@HartFalleon
@HartFalleon 15 часов назад
At this point I'm looking forward to getting my PS5 Pro and not worrying about this stuff.