@@kR0M1337 Yes, the 6800 or the 6800 XT are still really good cards. I have a 3080 & my mate has the 6800 XT; I don't see a difference in fps in most games, but they do run hot, just so you know.
@@hussenmazen8036 I am very happy now, thank you for the compliment. I replied earlier but the comment doesn't show. The price depends on what you want to play at 1440p. I have a 6650 and a 6600 XT for most games, and they can both run max settings in most games; you can buy either for under 400 dollars. The expensive PC was a project I built for special occasions such as SQ42, Starfield, etc., games that both need and deserve a high-end PC. My GPU was, at the time of purchase, rated the best 3090. It's a ROG Strix OC white edition, and because of that I was happy to pay the listed price. I changed some parts over the last year and added LEDs etc. I could in theory build a PC right now with the same specs for 4000. The PC was a labour of love for sure; in my eyes it's finally perfect after some tweaks. Specs are: ASUS Hyperion case, Z690 MSI Ace, 13700KF with contact frame, 64GB DDR5 6400, 1200W PSU, H170i 420 AIO and a 4TB M.2 Gen 4 SSD.
@@Fredrik7le And I remember being the only one in my class who could run Half-Life because I had a Riva TNT2. I remember that I hesitated between the Riva and the Voodoo2, and then just a few months after I bought the PC, Riva's successor appeared - a technological marvel called the GeForce 256. It's amazing how time flies🥺
@@nikolabozinovski2041 Bro, I'm still on the same card, a 1050 Ti. I'm planning to upgrade my CPU soon though, from an i5-9400F to a Ryzen 5 5600X. Do you have any GPU upgrade recommendations that won't bottleneck the 5600X?
@@peaky6622 A 1050 Ti won't necessarily bottleneck the 5600X, but it's a powerful processor that can be paired with cards like the 6800 XT and 4070 without bottlenecking them, so choose whatever you can afford.
This. Games on high and ultra graphics settings have almost no difference; I can play any game at 1440p with my RTX 3070 smoothly just by adjusting one or two graphics settings.
@@nickolasadame1939 No, he hates GPUs with low amounts of VRAM, since he doesn't believe anything below 16GB of VRAM will be future-proof for higher resolutions, which is reasonable since newer games require more and more VRAM.
Really, we need to push Microsoft to release a new DirectX 12 or DirectX 13. The newest one (DirectX 12 Ultimate) is so unoptimized for GPUs: so much of the load lands on the CPU and VRAM, producing fewer frames and increasing VRAM usage, along with stressing the CPU, which is usually already processing tons of other services while you're gaming. If Microsoft can get DirectX under control, then we won't have to worry about future proofing as much, and sure as hell won't consider 12GB of VRAM too small. Realistically, 8GB should exceed expectations for QHD gaming, and 16GB for gaming cards shouldn't be a concern... especially at the prices we're looking at currently...
@@JahonCross No, the 7800 XT; the 6800 XT is still a really good option though. It's just that usually newer GPUs have worse price-to-performance compared to last gen, but AMD actually priced the 7800 XT almost exactly the same as the 6800 XT, so it's actually worth buying.
@@roogz Games are getting insane. Back in those days a top-of-the-line rig got like 300+ fps; now even if you spend like 3k on a desktop, 4K ray tracing ultra settings will keep you under 100fps.
@@menace2society00 Over the generations I'd been buying high-end-ish GPUs, until the RTX 2000 series inflation I just can't afford anymore. I got my GTX 1080 for 800 AUD... now for that price you can only get into the 4070 (non-Super) range, which I consider mid.
I kinda disagree. The increased VRAM requirements are a consequence of the new-gen consoles, which aren't gonna be using more than 12GB as VRAM, so it should be fine till the next gen, as most games aren't gonna include textures that the PS5 can't use. For example, in 2014 everyone was freaking out that Shadow of Mordor required 6GB at ultra; people claimed that it would only go up and 8GB wouldn't be enough before long. But in reality 6GB was fine for the whole generation, as that was around the amount used by the PS4. So personally I wouldn't be freaking out about VRAM; it's the kind of requirement that goes up gen on gen, not year on year, so you should be fine with 12GB. That being said, if you're still using the 10GB 3080 I'd consider moving on; it might be OK, but you're pushing it.
Finally someone speaks the facts. I would also add the terrible optimization of games; it all started with Cyberpunk, and lo and behold, that game is still talked about. In general, I think it's best to buy the generation of GPU that comes out right after the new consoles, because there's a good chance that GPU will last the entire console generation as well. I think the 3080 is still very usable, at least until the 5000 gen.
As much as I love AMD and wish them the best, the 4070 Super is a monster in 3D editing compared to AMD cards. Gaming is probably perfect on both, but the 4070S renders things 2x faster than the 7800 XT. If I was about gaming only, I would buy AMD with no second thoughts.
@@szaka9395 The render thing is heavily dependent on the program. I challenged my buddy to render a 16K image on his rig in Blender (5700X + 4070 Super), and it took his rig the exact same total time as my rig (2700X + 6800 XT). We used the same Blender file with the same render settings. His render finished in 23 minutes and the denoising took 4 minutes; my render finished in 20 minutes, and the denoising took 7.
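For what it's worth, the totals in that comparison really do come out even; a throwaway sketch using just the minutes from this comment:

```python
# Render + denoise times in minutes, taken from the comment above.
rig_4070_super = {"render": 23, "denoise": 4}   # 5700X + 4070 Super
rig_6800_xt = {"render": 20, "denoise": 7}      # 2700X + 6800 XT

total_a = sum(rig_4070_super.values())
total_b = sum(rig_6800_xt.values())
print(total_a, total_b)  # 27 27 -- same end-to-end time either way
```

The 4070 Super's faster denoise cancels out its slower render here, which is exactly why "renders 2x faster" claims depend so much on the workload.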
Great comment super helpful! It does look like new gen consoles will be here pretty soon though. Probably 2026 for the New Xbox, and the PS5 pro may be even sooner
Honestly the 7800 XT is definitely on the high end for 1440p. I got my girl a 6750 XT for 4K gaming and it does just fine. Unless you are trying to get 100+fps minimum, there is no reason you can't go for a cheaper card that has 16GB VRAM.
@@stevieC11Hanworth Considering I have a QN85A, 4K is my only option; anything less is blurry. Currently I have a 6800 XT and I get more than enough fps on high/ultra. Plus, AMD just came out with frame generation, and in combination with RSR/FSR I can play any game at 4K with over 100fps. Honestly though, I don't use frame gen or RSR; if I can play at native I will. Frame gen doesn't allow vsync or any alternatives and I do get screen tearing, and there's no point using RSR if I'm already getting 60fps+ anyway. FSR/RSR does have a really good anti-aliasing implementation though, much better than the industry-standard TAA nowadays with its terrible ghosting issues, so in a game like Banishers I used FSR simply because it looks miles better than TAA.
The 6700XT is still the best entry-level 1440p card (in my opinion), as it can be found at a steady $250 used, in some cases I've seen it drop to $220 which is honestly a huge W.
Not just your opinion, it still is. If a base RX 6700 can do medium-high (sometimes just high) settings at 60fps in 1440p in intensive games, the RX 6700 XT/6750 XT can do better.
Couldn't agree more. Or the 6800 non-XT for the right price is unbelievable. Went from an ASRock 6800 to a 7900 XT reference and was honestly disappointed in the real-world gaming performance differences at 1440p 144Hz.
I bought my 7900 XTX for around $800 pre-owned. Listen when I say this: this thing is amazing. At 4K maxed settings I'm still pulling ~140 fps in AAA games. I believe it is currently ranked the 3rd best GPU at the time of writing.
Bought and returned the xtx when I saw how bad it was for RT and could not keep up with 4080. XTX is a great card for raster and some games, but not all games.
@@carrickdubya4765 AMD doesn't primarily focus on ray tracing; it's a given that the 4080 would be superior in that matter. However, if you want to go that route, I would suggest waiting until a breakthrough comes out in ray tracing technology, because right now it's just a waste of cash if you are buying a card for better RT.
@@carrickdubya4765 I bought the 7900 xtx sapphire nitro+ model myself and mine didn't even work properly lol. I got in one hour of game time and it crashed, then I couldn't launch any game ever without an immediate crash. My 6800 xt was better. Immediately returned it and got a rog strix oc 4080 and am blown away by just how much better it is than the 7900 xtx
They have launched. I've owned one for a while now; they're great and all, but also expensive. I was considering AMD, but after my 5700 and the high temps on a 5700 XT that I've placed an AIO on, I didn't really want to risk AMD and their high heat. For the price to performance I've seen on them, though, they're great cards as well.
@@waynemarsh7775 depends on the board partner - had a sapphire 5700XT nitro+ and temps were just fine, switched to a sapphire 7900gre a few weeks ago and that card goes barely beyond 70°C even when playing for hours - all with air cooling.
Just picked up the 7800 XT Sapphire Nitro+ a few weeks ago and love it so far! Paired it with a 7800X3D and have no issues running games on high settings at 1440p.
I bought a 7900xtx tuf oc and a 7950x3d. I got both them for around the same price the 4080oc were going for at the time. Both came with a code for frontiers of pandora too.
@@Bigjohn2121 Aye, that's awesome. I got a Frontiers code as well from the Newegg sale they were doing, so a nice bonus for buying the card. The 7000 series cards are just so good for the price, you can't go wrong. Hope you've been enjoying it so far! I haven't gotten to play Pandora yet but I've heard it's a decent game, so I'll give it a chance at some point.
@@Rolfhn hi, sorry I didn’t see this earlier. So according to the TriXX software right now I’m running Jedi survivor on high settings and it’s pulling about 250 watts of power. I went with the Corsair SF1000L power supply. It’s a smaller power supply in terms of size because I built in the Lian Li O11 mini case so I needed a smaller psu. I bet if you went 850 watts tho instead of 1000 you’d be totally fine, I just went a bit over to be safe and the price difference was negligible. I hope this helps!
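If anyone wants to sanity-check PSU sizing the same way, here's a rough sketch. The wattage figures are assumptions for illustration only (the ~250W GPU reading is the one mentioned in this comment; the rest are made-up ballpark numbers, and real transient spikes can go well above these):

```python
# Hypothetical peak draws in watts -- illustrative assumptions only.
draws = {
    "gpu": 250,             # in-game reading mentioned above; spikes go higher
    "cpu": 170,             # assumed high-end CPU under load
    "fans_pump_drives": 50, # assumed
    "board_and_ram": 60,    # assumed
}
peak = sum(draws.values())   # 530 W estimated peak
print(peak, 850 - peak)      # an 850 W unit would leave ~320 W of headroom
```

Which lines up with the comment's point: 850W would have been plenty, and the step up to 1000W was just cheap extra margin.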
Radeon also has the 7900 GRE... originally you could not overclock it, but with the recent update you CAN overclock it. And it's only 10% more expensive than the 7800 XT.
You could overclock it from the beginning, just not the VRAM, where it mattered the most with a VRAM-bandwidth-starved card like this. But yeah, now you can, and that's good ;)
A 3060 12GB is still getting beaten by a 3060 Ti with 8GB. If the GPU doesn't have the power in the future when the VRAM is needed, it won't matter how much VRAM it has; the 4080 Super will still beat an XTX with only 16GB of VRAM. VRAM isn't the only factor in gaming; the power of the card means a lot too. Look at the 3090: it has 24GB and is getting the same performance as a 4070 Ti right now, and the 4070 Ti Super will beat a 3090 Ti with only 16GB, and the 4070 Super with 12GB will be faster than a 3090. So having more VRAM doesn't always mean better; in the long term, too many people forget the power of the card makes a big difference.
It's not actually about performance, it's mainly about the stutters. If you pass your VRAM limit you might get some small stutters here and there, and that can be kinda annoying, but yeah, VRAM really doesn't matter that much.
@@davidfrazier6308 No, it isn't. You can optimize as much as you want, but the higher-fidelity the textures are and the more rays are rendered, the more VRAM they need; there's no way around it.
I'm happy with my RX 6800, which I bought used last year as an ex-mining card. Upgraded from my old 1080 Ti, which I gave to my dad for his AutoCAD 3D rendering.
I got told I was being dramatic when I bought my 7900 XTX 7 months ago for basically this reason, it's nice to hear others acknowledging the absurdity of what we need to consider for future proofing right now
And the only reason is because the big AAA companies can't be bothered to properly optimise their games. There is no way you should have to build a 2-3k gaming PC just to play at 1440p...
@roarbahamut9866 oh I agree, but yet here we are in a world that is happy to see an end to native optimization in favor of frame generation and whatever other shortcuts can be jammed into the games being produced. At least I'm extra prepped in case this blows up in their face within the next year or 2 and we go back to normal expectations
@@sibbeplaystv3636 Sorry, I didn't see your comment. It has no issue with either of those games; although it's not the best at Lumen and RT in Fortnite, other than those settings it gets really good fps.
@@sibbeplaystv3636 No, it doesn't have issues with either title, but with RT, Lumen and max everything in Fortnite you'll only get like 80fps or so; otherwise expect 240+.
6950xt is actually the same price as 7800xt at microcenter right now. It has extra performance and the same vram, if you don't mind going last gen. I picked it up for my new PC at $550 to go with my ryzen 7 7800x3d.
GPU manufacturers have a challenge to overcome with the VRAM. VRAM apparently gets super darn hot if clumped together. My 3090 has all of its 24GB of VRAM clumped near the back of the card (at the display ports etc.), and only that part of the card gets hot af.
If you want to stay on Team Green for NVIDIA DLSS, low latency, etc., get the RTX 4070 Ti Super. It will have 16GB of VRAM, more CUDA cores, and more performance than the current 4070 Ti, all at the same price of $800. The RTX 4070 Ti Super is coming on Jan. 24th.
@@mohannadalmalki3986 I mean, what new features are there really? AMD has made it clear they have no intention of restricting most features to new hardware, and they've also made it clear they have no intention of dropping support for the 6000 series anytime soon. Considering they still make drivers for the 500 series cards, I'm not worried about driver longevity for a considerable time; by the time they stop supporting it, there will have been so many graphics card releases, with much better options.
@@mohannadalmalki3986 Although true, the 6000 series likely won't be phased out for a while, especially cards like the 6700 XT and up that are meant to be able to handle 1440p, let alone cards like the 6950 XT, which on the GPU hierarchy is right up there with the 3090 and its Ti variant. And there's no way Nvidia is dropping support for the 30 series anytime soon, so I don't see AMD making that move either... as it'd simply make no sense when competing against Nvidia.
@@mohannadalmalki3986 Driver support will be about the same, as the architectural differences are minimal; just like how the late GCN cards, the 500 and Vega series, got axed at the same time.
I think 12gb vram should be perfectly fine for a few more years. Games usually have plenty of graphics settings to adjust to how much vram you have. I think the 4070 super is a good priced card for the performance + it’s capable of dlss
The 7600XT has the same GPU die and cores as the 7600, it's very likely to perform very similar to it. Definitely won't outperform the similarly priced 6700XT.
I’m using a 3080ti for 4k and it’s been working beautifully. You don’t need a 4070ti for 1440p. You can get away with older cheaper cards that will give you the same experience.
@@Scurzes 1440p low. If you have a high refresh rate 1440p monitor, there are some excellent deals on 3070tis on the used market if you’re willing to go that route
@eljakamoo Found a 3080 Ti Suprim X 12G the other day. Heard it benched higher than the seller's friend's 3090, and it was going for $595. I'll probably also look into a 3070 Ti; saw many at decent prices.
@@Scurzes if you can get a 3080ti, I’d highly recommend it. I have the msi ventus card and it’s never given me issues. It performs great at 4k. I’ve been loving it
Basically, whether you're buying new or in most cases used: AMD for low end, mid range, and high end. Nvidia is only good at the top of the line, with the RTX 4090 being the only choice in that category...
The Ryzen 5 7600 is really your best option. It's more than fast enough for the RTX 3060, is affordable, comes with a stock cooler, and will future-proof your build because you'll have an AM5 motherboard.
I hope the gaming industry and gamer community show some love to Intel GPUs... the A770 was offering 16 gigs of VRAM at half the price of an RTX 3060 Ti, with considerably stable performance... But it'll take at least a year for people to adapt to a new contender in the GPU market...
I personally have a rtx 4070 for my 1440p setup and i cannot complain. I can run stuff like Cyberpunk with maxed out settings (no path tracing tho) and get a stable 100-120 fps.
I'd honestly recommend an 8gb GPU (on the used market anyway). It's not that the low vram isn't a big issue, or that it won't hurt the future-proofing, it's just that the market already bakes this into the price, and imo, takes it into account wayyy too much considering the vast majority of games are fine with 8gb. You'd have to pay such a premium for no immediate jump in performance, and you can always turn textures down a notch if a game demands too much ram.
Guys, VRAM only matters if you play on high to ultra settings. If you play on medium to high settings you should be fine, unless new games come out that literally take 12GB of VRAM just to run medium settings (sadly the gaming world might come to that 😢)
@@luisfernandohernandezdiaz6831 Upscaling is not a fix, it's just a workaround. Yes, it's fine for now, but going native will always be better if your resources allow it.
I only have 4GB VRAM ;O. I mainly play on my PS5 and only games like LoL on my PC, so it doesn't really matter right now. But still, it's interesting how much power games require nowadays.
Just bought the 4060 Ti 16GB, and despite the hate on the internet I'm quite satisfied with the card. I shifted from a 1050 Ti, which was a dramatic jump for me. For productivity and 3D rendering I also can't complain about Nvidia, or about my 16GB of VRAM.
I modded my 3070 after watching that one video of the guy doing it. Was able to get a total of 22 gigs doing some sketchy work. Works great though!
I'd get a 3060 for triple-A games since it's a cheap card now at 300 or below, it's consistently good, and it can even handle some games at 4K right now. But idk about future planning, like in 4 years' time with GTA 6 and new open-world games like Far Cry.
If you're in the PC Gaming arena. You've got to recoup the value of your used GPU towards a new one. I went from a 1070 Ti, to a 3070, returned that, got the 4070 Ti, sold that, got a 4080. All cards except the 4070 Ti were used. I play @1440p 144Hz and have no plans of doing 4K on a PC monitor.
You gotta remember that high VRAM use is almost always due to ultra settings defaulting to using the highest quality textures. If you're fine with playing on High presets or messing around to make your own custom settings, Vram use shouldn't be a problem for you.
The RX 6800 is around £350 and performs perfectly fine in my build for 1440p. I run most games on high settings and get max frames on my 144Hz monitor. It also has 16GB VRAM for a 2-year-old card.
yea, my old pc from 6 years ago had like 4GB of vram, and kept on running into issues trying to play AAA games, so i went yolo to get a 20GB VRAM 3080ti from evga
Think about it like this: when I got my 1070, people told me I was wasting money if I didn't game in 4K when I played at 1080p. I play modern games at 1080p and I'm getting somewhat decent performance. Whatever you choose, know that it's a matter of time.
Got my RTX 3060 Ti off of eBay, and with my i5 11400F my games, new and old, run so much better. I can even run them at their highest settings at 1440p at over 100 FPS. It's very smooth.
I’m running an R9 280X 3GB OC and can play most older titles. Hell I was able to play Elden Ring @ 1080p with very little hiccups. It’s crazy how 8GB video cards are now seen as the minimum. I’ve got some serious upgrading to do. 😅😂😂
The one you can afford. Personally I think 1440p is a sharp enough resolution for a 27-inch desktop monitor; even with a maxed budget I would still go 1440p. The only difference between cards now is the amount of upscaling you need to use, so as long as it has enough VRAM you are probably fine.
I'm still running an RTX 3070 with 8GB VRAM. Most of the time nowadays there's close to no difference between high and very high textures, so just stick with high; you don't have to set everything to max, and you can use DLSS. I have never encountered any VRAM issues since I got this card. Though yes, I agree having more VRAM is desirable, imo it's really not as bad as most YouTubers make it sound 😞
My 4070 Ti has no problem playing Cyberpunk 2077 at 4K resolution at 120 frames per second with everything maxed out. Let alone any other game; you can only imagine how much better it runs.
Sooner or later, a typical desktop PC setup will have TWO towers: the CPU tower and the GPU tower. At least once we have the GPU tower, we'll be able to slot in a stick or two of RAM to upgrade our V-RAM.
My RX 6800 XT is still an absolute monster at 1440p paired with a Ryzen 5600X. With an undervolt it pulls 70 fewer watts, and if you're into optimizing your graphics settings it'll play damn near any game at 100fps+ no trouble at all. Halo Infinite I'm locked at 165fps with mixed settings, Fortnite with the Nanite geometry 120fps+, Apex Legends max settings not a sweat 👌
Most people have 8gb or less vram and are running at 1080p according to steams hardware survey, so if you’re just an average gamer, your 8GB card will still last you a good amount of time. Also, you don’t have to run max settings on every game, if a game runs sluggish on your gpu on max settings then just turn the settings down.
You could get a 4070 Ti Super for about $800, and since it's built off a cut-down 4080 die it still has 16GB of VRAM, and it even has higher specs than the current 4070 Ti, on paper at least.