It's an evolution of the 6600 and will never be more than that (generational incremental gains) until they move to a fully chiplet design..... Was still a massive upgrade from my rx570, and cheaper than the 4060..... Whilst being only $10 more than a 6650 at the time due to discounts.
It never was a disappointment. It's a regular 7600 with 16GB VRAM and the performance is exactly that. Problem is it should cost 50 euros less to be competitive.
In today's market, 8GB of VRAM costs around $30. Nvidia charges $70 more for the 4060 Ti 16GB. You may not see much difference on the frame rate counter though... So why is the 4070 Ti Super constantly using 10-12GB even at 1080p? *Does the 4060 do some magic to not need more VRAM?* That MAGIC is called worse textures and low draw distance. It's literally lowering your settings without your permission. It's like saying you can make a more powerful GPU just by lowering the settings. Most people keep a GPU for 4-6 years. Even in 3 years every new game (even COD) will use more than 8GB. Neither card is great, but I would never buy an 8GB GPU for $300. Especially since I could have bought one 8 years ago for only $200.
To be honest, it is the first time I hear about a draw distance problem caused by insufficient VRAM capacity. It is usually a CPU-related setting and it does not rely on VRAM that much, unlike textures. P.S. The 4090 and 7900 XTX may allocate up to 20GB in some games, but that does not mean games really need that much VRAM. It is more complicated than that. Cards with more VRAM often allocate more VRAM as well, just to be on the safe side. But 8GB buffers are showing their limits already, that is true. Though it is not that important in the case of low-end GPUs. The 4060 Ti 8GB, on the other hand, is an annoying product, since it could surely make use of more than 8 gigs in some cases.
Love how there's absolutely no proof to support your claim other than 1 broken game like Forspoken that downgrades to lower textures, and maybe RE4 on launch day, which is absolutely fixed now. FYI there's a difference between allocated VRAM and actual usage. Yeah, more VRAM can be allocated if there is more, and that's what shows up in RivaTuner stats. Sure, 8GB is not enough for those 1 or 2 broken games and maybe another couple heavy ones at max settings, but other than that a 4060 + DLSS/DLAA + frame gen is so much more performance than AMD. I will never go back to using the blurry vomit that is TAA after using DLAA.
@@stangamer1151 It's a CPU related issue only as long as you have enough VRAM. Textures cost (almost) no GPU power. It only eats more VRAM. Do you understand that it's the textures that aren't rendering correctly? Not physics or anything CPU heavy. Look at the Hardware Unboxed videos.
I feel like neither GPU is particularly great for the price, personally. 6700 XT is about $340 new, and $250-275 used - a good 1440p GPU. RX 6600 is about $200 new, and $150-165 used - a good budget 1080p GPU.
I got a used 6700 XT in August for ~$245. I saved $100/$150 by buying used (cheapest/my model), but the market is so fked up that I don't think of it as a great deal, just a sane one. It gets me 1440p 60-120fps in everything I (rarely) play, but I agree: from now on, only buy this for 1080p. Also, it's been ~$320 since 2022. It's been 1y+ and it still sells for the same money. It's an absolute dumpster fire of a market. DON'T BUY NEW CARDS at current prices! Go used, or wait for the next gen and make a decision then.
The 7700 XT has already reached the $329 price point on sale a couple of times, making not just Nvidia's offerings look really bad but also AMD's own offerings below the 7700 XT.
That was likely a mistake and it was sold in very limited numbers by retailers who likely didn't want to anger their customers. It doesn't make any sense for them to lower them to such a low price without at least trying to sell them at say, 400 or 380 first. Let's be real, the 7700 XT would be a good value and would sell very well even at 400, though personally I'd like to see it go to 380 before it becomes a #1 recommendation. RX 6800s have gone as low as about 370 or 380 on discounts.
Maybe you're thinking of the 6700xt ? I've never seen 7700xt anywhere close to 329!! If it was (over the past ~8 months I've been watching prices) I would have snapped it up immediately. (I ended up settling for a 6700xt when it went on offer)
@@BrophyMichael I was wrong about the price. It was $353. Two different models of PowerColor cards have sold in the last couple of weeks through Newegg in the $350 range.
I like that you pull back from using only ultra settings in every benchmark, because real-world settings are a far better measure of real-world performance. It's a ton of work and we appreciate it! Anyone that automatically cranks settings to ultra has fallen for the marketing that tells you you _need_ a new GPU every year. Especially if they're doing it on $300 cards. Ultra often looks no better than medium, let alone high, and it's most often just a massive waste of resources and performance. We've been tuning settings for performance for 3 decades and I don't know where so many people got the idea that if you can't play at ultra it's not good enough. It's shocking how many times I've heard people say Cyberpunk is a useless tech demo because it's unplayable on anything less than a 4090. Which is so far from the truth it's laughable.
I really think it's part of the consumer culture. The 4060 is a highly controversial GPU because it's not the best if you're changing your graphics card every time NVIDIA drops something, but if you're upgrading from an old GPU, like an RX 5xx, GTX 10xx, or 20xx series, a 4060 may be the best for you in terms of what you get for the price
I first played CP2077 v1.55 on a Ryzen 3600X, RTX 2060 Super, with 16 gigs of RAM. 1080p, low RT... or 1440p, no RT, but ultra settings. Stable FPS in the 55 to 70 range. I don't much play competitive shooters, so outrageously high frame rates are wasted on me. I need smooth and pretty. In any case, I absolutely agree with you. But to be honest, I find that my real-world experiences are almost always different from what I see in these comparison and review videos. My main rig now is an R5 5500X, RTX 4060, 64 gigs RAM. I bought it new on Amazon about six months ago for $900-ish. Most of the time I play CP2077 PL at 1080p, RT medium. I get around 70 FPS. Of the realistic options available today, I'd go for a 6700 XT. My next card will likely be something like either a 4070 Super or a 4080. I'd like to achieve stable 1440p play with ray tracing. Maybe middle of next year.
@@SamlovesLulu My first playthrough was on a 10700k with a 1080ti. Very playable once you aren't using raytracing. Now that I'm playing phantom liberty I have a 14700k with a 3080ti and I have to say it's obviously prettier using path tracing, DLSS, and frame generation, but it doesn't really make the game any "better" and you quickly forget about graphics and performance when you're immersed in the game. I just don't want FSR shimmering and blurring that quickly pulls you _out_ of that immersion. But more than that the reason I refuse to touch Radeon for the foreseeable future is its terrible VR performance. I got a 5700xt for my kids to replace a 1660 super and VR is a vomit inducing experience due to 1% lows that make you sick. AMD has been aware of it for years and just ignore it instead of trying to fix it. Theoretically it's Nvidia's superior codec shoving _two_ high definition streams down one HDMI cable cause most VR games aren't demanding whatsoever.
It's funny that Starfield has become a staple of all the gpu benchmark reviews and the game is already dead, the steam player count has dropped by 97% and it can't even crack steam's top 100 anymore.
If one MUST choose a GPU from the comparison, I would certainly lean more towards the 7600XT. At this price point, I would personally never use RT and these are clearly 1080p cards and should be run at those settings natively. Having said that, buying a used GPU would be a much better option in this price range.
16GB is wasted on the 7600 XT; 12GB would've been plenty, and with a price below $300 it would've been competitive. Can't see any scenario in games where we'll be using 16GB with such a weak card.
They can't use 12GB because of the 128-bit memory bus — only 8 or 16. I know; they should have designed it for 12GB instead. They thought they could get away with it and sell useless GPUs for $300.
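The 128-bit constraint the comment above mentions can be sketched numerically. This is a rough illustration assuming common GDDR6 configurations (32-bit interface per chip, 2GB density, and "clamshell" mode doubling chips per channel); the function name and chip density are assumptions for the example, not from the comment.

```python
# Why a 128-bit bus forces 8 GB or 16 GB (assuming 2 GB GDDR6 chips):
# each chip uses a 32-bit channel; clamshell mode puts two chips per channel.

def possible_capacities(bus_width_bits, chip_gb=2):
    channels = bus_width_bits // 32      # one 32-bit channel per chip
    normal = channels * chip_gb          # one chip per channel
    clamshell = 2 * normal               # two chips per channel (clamshell)
    return normal, clamshell

assert possible_capacities(128) == (8, 16)   # RX 7600 / 7600 XT situation
assert possible_capacities(192) == (12, 24)  # a 192-bit design allows 12 GB
```

So hitting 12GB really would have required a wider (192-bit) bus, as the comment implies.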
I wonder why you aren't mentioning the power consumption. Did I miss it? I'd go for the 6700 XT, but if I had to choose one of these two, I'd go for the 4060 because of the much lower power consumption.
Yesterday I was testing DLSS and FSR image quality in Alan Wake 2, and the difference in image quality was huge. I was getting better image quality in motion using DLSS balanced or even performance than using FSR quality. FSR had too much shimmering, and the DLSS image was way clearer and sharper. But the major differences appear when using Ray Reconstruction; man, it's insane how I could use DLSS performance at 1440p with path tracing and high framerates with way better image quality than if I used FSR quality.
I have both Nvidia and AMD cards as well, and that's one of the main reasons I stuck with Nvidia, in about 50% of the games, I actually prefer DLSS quality to native. Also don't forget DLAA, when you can afford to use DLAA with a good fps, it's so nice
It's an Nvidia-sponsored title, what do you expect? And yes, DLSS is most of the time better, but AMD cards are also most of the time faster and have more VRAM for the same price or even cheaper, so it literally doesn't matter. Many times you have to use DLSS on the Nvidia card just to catch the AMD card's native performance. Daniel has shown this many times. Of course, if you're a tech demo junkie you might want to spend more for Nvidia's "features", but that's personal preference and also depends on your region's pricing. Hence both have their use cases; people shouldn't be crying about DLSS in every comment section, it's gotten really pathetic at this point.
@@yellowflash511 here in Brazil the only AMD cards that sell really well are the older and cheaper ones, like RX 580 and RX 6600 from AliExpress. The reason why is because AMD importers buy small quantities and the prices are higher than the NVIDIA cards. For example, the cheaper RX 7900 XT are costing about R$ 5500,00 (U$ 1100~) and the RTX 4070 Ti R$ 5000,00 (U$ 1000~), so nobody will buy any high end AMD cards here.
Yeah I was doing some comparisons in Remnant 2 between DLSS, FSR and XeSS. Also using a 1440p monitor. It basically confirmed that when I upgrade my GPU I'm going to be sticking with Nvidia. I do wonder about using XeSS on an AMD card though.
@@yellowflash511 Sponsorship doesn't mean anything for upscaling quality; it's the same driver. Starfield is an AMD-sponsored title that launched with only FSR, and people were still modding DLSS into the game, and it produced better image quality, which eventually forced the devs to officially add DLSS support.
The problem isn't with games that are made to run within 8GB of VRAM. The problem is with broken new games ported from consoles. If you have an 8GB VRAM card, you're going to have to wait 6 months or a year to be able to play the game, because the game simply breaks. So all your friends with 12GB and 16GB cards have already played and finished the game by the time a patch with a fix for 8GB cards gets released. That's the problem. That's why the whole community is demanding more VRAM. Also, if there's a mod for a game that lets you use high quality textures, your game will look amazing with 10/12/14GB of textures compared to textures that have to fit on an 8GB card. And textures don't impact performance in a powerful way. So... your AMD card will deliver a much better experience, for the same price.
one trash GPU from Nvidia versus one trash GPU from AMD.. what a duel.. I remember getting a Radeon RX 480 in late 2016 for something over 200 euros, what a great card... now we have this
@@TotoyMolangMarikina yeah, at least between the two trash cards the green one is more general-purpose, not just for gaming. Even though I agree the 4060 and 7600 are both bad, at least the 4060 isn't as trash as what Radeon offers
Have you tried turning the FSR sharpening all the way down to try to eliminate some of the "sizzle"? I've noticed it helping image quality in my opinion, but I would be curious on how it would compare to DLSS.
Daniel, AMD is already making killer margins on the RX 7600! The 7600 is about 15% smaller than the 6600, which AMD hasn't minded selling for $170-180; Companies are obviously making money at that price... The 7600 needs to be around $240 or less, and the 7600 XT definitely needs to be below $300! $230 and $290 would be exciting
The 4060 has an even 30% smaller die and uses only 8GB of VRAM. The memory alone costs at least $30, but Nvidia makes a huge margin on it, charging $70 extra for it with the 4060 Ti 16GB. By that logic the 4060 should cost only $190, not $300. And honestly, no 8GB card should cost more than $200.
@@JackJohnson-br4qr I agree, except that the nVidia cards are manufactured on a much more expensive node (4nm), and AMD also gets better pricing from TSMC because of their loyalty... compared to nVidia, who jumped ship to Samsung when the opportunity presented itself. Come to think of it, that is also a difference between the 6600 (7nm) and the 7600 (5nm), so maybe AMD can't afford to sell them below $200. Also, nVidia no longer sees themselves as a hardware company (but rather a software company), and while I think their software adds value to most of their lineup, when you get down to this segment the software provides much less value, because the hardware isn't powerful enough to really use most of it. I also agree that if you're buying an 8GB graphics card, you ideally should just get the cheapest one you can get your hands on, or save up for a better card... better than the 7600 XT!
Just got a 7900xtx. Admittedly so I wouldn't have to use upscaling. But one has to wonder how many times they gonna hear Daniel say DLSS looks better than FSR before they actually do something about it? Coming from Nvidia and then Intel it's sorta shocking to see AMD has the slowest performance update response. Intel should be in charge.
12:03 and 30:00 are perfect examples of why you can't blindly cite _allocated_ memory vs _needed_ memory when fighting the vram boogieman. Many apps will gobble up far more vram if it's offered while not necessarily _needing_ it. It would be nice if more games offered up some kind of gauge of how much vram is actually needed as you tinker with settings.
Hey Daniel, love the videos... I was wondering when or if you’ll still be making your, “best gpu’s to buy” videos. I find those videos particularly helpful for choosing between various gpu options at a given price point.. Thanks for all you do, and keep up the great work!
I would definitely pay 30 more for the additional vram. Does this card benefit significantly from having 16GB rather than 12GB? Probably not very much. However, having more than 8 is very useful, and likely will become increasingly more useful over the next few years. Keep in mind also that some game engines will just stop loading textures once they run out of vram (using much lower resolution textures instead).
The 7600 XT's only issue is the 6700 XT, which is better than the 7600 at 1440p. It seems the 7600 XT beats the 4060 at 1440p but loses at 1080p. If AMD bumps it down to $300, it would be a great price. Thanks for the review, keep up the good work 👍👍
The 4060 does Cyberpunk path tracing at 1080p DLSS Quality + FG at 80fps; I don't know why Daniel skipped that particular result. Great for a $290 product pulling 110W. On the other hand, I wouldn't touch a $330, 180W card with zero RT capabilities with a ten-foot pole.
@@Frozoken hey genius, even the non-XT version is superior and cheaper, like $253.80, in case you don't understand the extra VRAM's purpose and only count the fps in that silly benchmark video.
And the additional cost of upgrading the PSU again? Naaah, no thanks. Think about your power supply before jumping on the card, because you might spend more than you calculated.
Or 6700xt lol That's exactly what I did - the price of the 6700xt (12GB) has come down so much it was actually cheaper than most (almost all!) of the 7600xt (16GB) in my region. (I got it for £330) Which I guess makes sense?? .. because of the extra 4GB of VRAM... But I don't think stock of 6700xt is gonna last very long - I just checked and pricing for it is shooting way back up into £400-£500 area, which is nuts!! If I was looking to buy now I'd actually get a 6800 (16GB) that's on offer on Amazon atm - XFX SWFT319 - RX6800 for £376
The 16GB on the 7600 XT probably won't come into play for another year or year and a half, when games really start eating up VRAM. Then you'll see the 7600 XT start pulling ahead, just like we saw with the 4060 Ti 16GB model.
So what GPU do you have? It's interesting to me that when the Nvidia 3000 series came out, no Nvidia user cared about the higher power consumption vs the AMD 6000 series. But now it's a first world problem.
@@JackJohnson-br4qr Lol, I heard about RTX 3000 power issues constantly, and the other reason it was less one-sided is that the 6800 XT through 6950 XT were also pretty awful too.
@@JackJohnson-br4qr Yes, significant power-draw differences with extremely high energy prices in a cost-of-living crisis are totally a first-world problem. Especially for people only able to afford these jokes of a card. Totally not you coping because you like AMD
The current entry-level GPU market (~$300 USD) is fairly garbage tbh for the vast majority of people 😢. However, as an ITX builder of well over a decade now, I'm used to paying a small premium for SFF components, and I just scored myself a NEW Gigabyte Low Profile RTX 4060 for $295 USD 🥰🤩🥳 (RRP/MSRP) plus taxes here in Australia 🇦🇺, where it retails for $330 USD. This GPU is going to be perfect for a SFF HTPC build, allowing for adequate gaming needs and home theatre uses like video playback in an ultra-compact form factor 💪👍.
This is good advice. So silly, because it's being used allegorically, but the YouTube algorithm doesn't know the difference, right? At least not yet, maybe when AI gets insanely better lol. He's right, Daniel: probably best not to use the "un-alive myself" word
thanks for the great video .... I think DLSS on quality at 1080p is good for small screens, like laptops for example... I am planning to buy a 4060 laptop and I know I will use it ... what do you think?
hello, im wondering if the intel i7-8700K cpu would be able to handle this graphics card at 1080p gaming? I know its an old cpu but i really am at a tight budget right now and want to upgrade my gpu but not my cpu 🙏🏽😭
I still can't believe Fraud Howard called Starfield a "next gen" PC game. It looks and runs like ass. Avatar: Frontiers of Pandora makes Starfield look like a last-gen game.
Personally, I feel the use case for these low-end AMD cards is people pressing the auto-overclock GPU and VRAM options. I'd be curious to see how that compares and how stable it is before I try upscaling
For me the most important thing is TDP, since I live off-grid, and the 4060 Ti 16GB has no competition in power consumption/performance. Perfect card so far, after my old 1050 Ti
I never trust those deltas with regards to power for AMD cards. With my 6950xt something like starfield with high settings with frame gen is pulling 270w total system power from the wall. If I activate the delta it shoots up to 470w total system draw.
I'm looking to build my first PC, currently I'm looking at the Ryzen 5 7600 with either the RX 7600 XT or possibly the RX 7700 XT. Is the 7600 XT decent enough or should I consider the 7700 XT instead? I don't plan on playing high-end games, but I'd like to play games like Forza. I'm playing at 1080p for the time being, but may switch to 1440p later on.
@@dallasfrost1996 It is not necessary for 1080p, but it is more future-proof, certainly with the 192-bit memory bus and 12GB. It is a 1440p card after all and could last maybe even 5 years without an upgrade. The 7600 XT has the 16GB of VRAM, but the GPU is just too slow for 1440p, so those 16GB will be wasted on a GPU that can't run 1440p at decent fps. If you plan to switch to 1440p, then go 7700 XT, or even 7800 XT or 7900 GRE if money is not a big issue. If you are on a tight budget, then the 7600 with 8GB can work for 1080p, though maybe not AAA games on ultra, just because of VRAM. The 7600 XT 16GB is really hard to recommend because its price is too close to the 7700 XT, and that is a much better card. I am in a similar dilemma: I have a 2060 6GB and will pick either the cheapest option or go for the 7900 GRE, just because there aren't many good choices in between except the 7700 XT. Nvidia prices are just ludicrous for what they sell atm.
@@dallasfrost1996 I wrote a longer reply but somehow it is gone.... The 7700 XT is a 1440p card, so for 1080p even the 7600 with 8GB is enough, except for some AAA titles at ultra settings. But the more future-proof card is the 7700 XT if you switch to 1440p. In that case I would even recommend the 7900 GRE if money is not tight, because that will last you at least 5 years of great 1440p gaming.
The 4060 Ti 16GB is the much better card, and you probably know that too. There are models at $420 nowadays; if you can get a discount and maybe fetch it for $380 or $400, there's no debate.
Hi, I am planning to buy a GPU. The price of the RX 7600 XT 16GB and the 4060 Ti 8GB is the same in my area. Which one should I go for? Please advise. Thank you in advance
Testing these GPUs beyond 1080p is simply unjust considering their traditional raster performance, while RT without FG, RR and super scaling puts these GPUs at 720p at best.
Why does the 7600 XT in this video run at only 2230MHz when it's supposed to be ~2450? And almost all games are a stuttery mess. AMD rewrote their drivers for DX11 (and DX12 is affected too, if I remember right) and now it behaves differently; it compiles something that causes stutters. I saw in a reddit post that you can disable the dxnavi feature to return to the old behaviour.
honestly, for $100 less you might as well go with the 7600 XT. Love my ASUS Dual card; I came from a 1070 Ti. Perfect for 1080p 240Hz: CS2, R6, Valorant, OW2, Apex.
Instead of making a new SKU with a higher CU count like the 6700XT was to the 6700, AMD just pushed the power budget higher, making these new GPUs even less efficient. I think either they just like to shoot themselves in the foot, or they didn't want to flood the market with a bunch of new cards when the 6000 cards were still abundant. I think the 7600XT was just a way to not let Nvidia get all the attention with their super refresh
Please set tessellation to 32x on the AMD card, since that's what NVIDIA uses; this is why NVIDIA hides (locks) the tessellation setting from the user.
I remember how many reviewers made the 4060 look like trash when it came out. I was eager to get an AMD GPU for the pure rasterization so many people talked about. Now I have the money and I will get the 4060, especially for the temperatures and the power efficiency. Oh, and the way it handles VRAM. I have an AMD CPU, the 5700X, and I really wanted an AMD GPU, but the 7600 is a power-hungry oven. Thank you, Daniel, for being one of the only honest reviewers on YouTube! Much love from Romania!
I don't know what's wrong with the price of 7600XT in Canada. The cheapest 7600 is about CAD$360, but the cheapest 7600XT is over CAD$450? Well yes 7700XT goes up to $600, but still~
There's usually a more drastic difference in allocation than in usage, since 7600 XT has so much more VRAM to allocate. Also, Nvidia uses more aggressive memory compression algorithms, which does lead to less VRAM usage at the cost of higher CPU and system memory usage.
Are you sure the VRAM difference in a game like RE4 is really because of efficiency and not the hard capacity limit of the mere 8GB of VRAM on the 4060? Maybe you forgot the super low-res texture pop-in when the game first launched, and how turning on ray tracing would crash the game on some 8GB or even 12GB cards. Sure, Nvidia did say that some texture compression technology will be available in the future. But for now, it is just Capcom fixing their game and allowing more aggressive memory swapping between VRAM and DRAM. That's why it used more system memory on the 4060 than on the 7600 XT. The frequent memory swapping causes stuttering, and you can see all the frame time spikes at 29:50.
@@lifemocker85 I have both a 6600 and a 4060, and like both of them, so no fanboyism here. But DLSS does look better; FSR even on quality tends to add a grainy afterimage effect on moving parts, and looks worse on small linear elements like hair. DLSS is also not perfect, but has fewer artifacts in general.
Maybe if it comes down to 250 it would be a good buy. I grabbed an RX 7600 8GB for 250 in August, since in my country the 7600 is around 70 to 100 euros less than the 4060, so maybe in another country the 7600 XT is cheaper. Overall, if someone owns a card that can play most of the games they want, don't upgrade: wait.
AMD being dumb again. All they needed to do was sell the RX 7700 XT for $380 and the RX 7800 XT for $479 from the start. The RX 7600 XT should never have been made, or should have sold for $290 max
The 16GB versus 8GB VRAM advantage will just grow over the years. Think about next year when the PS5 Pro is released, or waking up in 2026 or 2027 with only 8GB of VRAM, the same 8GB that some had with the RX 580 8GB back in 2017.
6700 XT wins, except in energy efficiency. At this level of GPU compute power, having more than 12GB of vram is not likely to be useful even a couple of years from now.
@@jorge69696 At either RT settings or 4K, neither of those GPUs is even close to capable just on raw performance, so VRAM doesn't matter.
As noted in a footnote at the start of the benchmarks, my capture card does not have VRR enabled, and of course I don't enable vsync when benchmarking, as that would limit performance. So this capture method allows screen tearing. This would not be happening on a variable refresh rate display, so it's not relevant.
I bought a 4060 because I needed a GPU after my 1070 broke, and I was really disappointed by the performance increase. I figured after 3 generations it would have to be way better, but I was very wrong
That's the thing, it's not 3 generations. It's more like 2.5. And you went from a x70 series card to a x50 series card pretending to be a x60 series card.
Also, 16GB of VRAM leads to more power usage, so in this case I'd say the 4060 is the better choice, since the extra VRAM is kinda useless in games when the core can't keep up... And in that case the jump to the RTX 4070 should be considered the most logical option, although that pricing is just too much for my taste.
I went with a 6750 XT. I would argue that the 6000 series is still good, especially the 6750 XT and 6800. The 6750 XT is £300, so the same price as the 7600 XT but a lot faster.
Great comparison and review. Thanks for putting mw3 in there. That’s the only game I play consistently and a lot of the people I know play as well so we always want to know the performance on mw3. I just built a budget rig for a video and ended up picking up a gigabyte 4060 ti for 340$ on Amazon when they dropped down. I had strongly considered the 7600xt but figured having an Nvidia card would be be more attractive to the audience. I personally would have chosen the 7600xt if I was keeping it myself.
I'll stick with my RX 6600; not enough of the newer games I play use more than 8GB of VRAM at 1080p High / some 1440p. I hate the whole upscaling and frame gen stuff. I miss when we pushed for the highest frames with raw power; now we're reliant on upscalers and software to improve our fps instead of actually getting a more powerful GPU next gen. When I saw the 4060 target 1080p with frame gen and DLSS 3... and trade blows with a 3060, I kinda just stopped caring about newer stuff. I will say I'm happy that AMD and NVIDIA made these amazing features, but it seems like they should've been targeted at older-gen cards to breathe life into them (NVIDIA mostly... AMD's FSR runs on every GPU), not factored in when making a brand new GPU. That just gives us sub-par new GPUs with a short lifespan and less VRAM than we need. And, I don't know about others, but I want a native 1080p+ High experience, not a DLSS, frame-gen "1080/1440/4K" res. Yeah, it looks nice, but some people just don't want to use it. If you need better performance... buy a better GPU
I play MW3, BF2042, Halo Infinite, Forza, and some other games. A few get into the 7GB+ range and sometimes hit 8GB, but the thing is, most of the time you'll hit a GPU limit before a VRAM limit with a lot of cards. Look at the RTX 4060 Ti 8GB vs 16GB: performance-wise they're the same. Later on, as newer games come out, sure, it'll be better to have 16GB of VRAM, but the GPU itself will also be the limiting factor. The 8GB version would be slightly worse, but you could solve that by lowering settings. @@megadeth8592
I'm just gonna buy the highest-end card whenever AMD launches the 8000 series GPUs and just rock that until a year later when it can't run anything at low... jk, but newer stuff isn't lasting as long as previous gens and is costing more and more. But that's how it goes. @@megadeth8592
to be fair, neither of these cards needs 16 gigs; 10-12 gigs with a wider memory bus and a bit lower price would have been top notch, and no one would have complained about it.
Comparing GPUs is a mess nowadays! Neither of these cards will run AAA games very well in 2 years anyway, as they're mostly esports cards, and power usage is way different between the two; that alone can cost $20 more a year on the XT.
because they're chiplets. GDDR6X is just higher-clocked GDDR6, and AMD gives you more total memory bandwidth across price competitors, so it isn't that. Chiplets, on the other hand, cause tons of less immediately obvious issues. Apparently RDNA 3 is also nearly unusable in VR for the same reason?
I actually appreciate this video because I just bought my 1st gaming PC last month and it came with a 4060. Using a 1440p monitor and it’s been working well so far. I wanted to download Starfield on Gamepass just to test it out until I saw it was almost 150gb. Yeah… no lol
Just get the 4090 and stack (2) 7800X3Ds on top of each other and play at 720p and get 1 billion frames in every game, no need for either of these....😊 I think the better choice will be Nvidia overall, but honestly, playing with either of these cards and having to lower settings at 1080/1440 is just sad for a $300 GPU.
I miss the times when a GPU was forced to be made good and powerful because they didn't have all the fancy upscaling of nowadays; the GTX 1080 Ti was the last of that kind.
Alan Wake 2 looks like a blurry mess at native 1080p; the only way is to use DLDSR 1440p and DLSS balanced or performance, for way better image quality and similar performance to native 1080p. Someone needs to tell those with 1080p screens that native 1080p is dead in many new games.
When AMD released the 6500xt , Nvidia released the 1630. When Nvidia released the 3060 8gb then followed up by 4060 ti , then AMD released this 7600xt, and r5 5700, then Nvidia just decided to counter them with the 3050 6gb .🤦🤦🤦
Both have low memory bandwidth. It's why both are a mess in newer games with per-pixel rendering and heavy asset streaming. Having extra VRAM can help, but it's not nearly as important in these newer games as having more memory bandwidth. It's why the 3080's lead over the 6800 increases as you increase resolution. It's why the 3080 loses to the 6800 at 1080p but beats it and the 7800 XT at 4K. It's why the 3060 Ti's lead over the 7600 XT increases as you increase resolution. It's why the 6800's lead over the 3070 Ti decreases as you increase resolution. A ton of hype went into VRAM amount because 3 games used modern assets but still supported HDDs rather than requiring SSDs. They didn't commit to asset streaming and per-pixel rendering like most games since have. I wish people would get off that narrative and actually show that memory bandwidth is what matters. Even the old tests trying to prove VRAM amount mattered always had the card with more VRAM also having higher bandwidth, like the 6800 vs the 3070. Not going to mention the channels (it wasn't this one), but I feel they had a narrative to push and knew that using a 6800/6800 XT vs a 3080 wouldn't show what they wanted.
This! I overclocked the GDDR5 memory on my GTX 1650 Laptop GPU and went from 128.1GB/s to 158GB/s of memory bandwidth. The average net performance gain was around 8%, with the best cases at 12-13%. This is also the reason the RTX 4050 Laptop GPU can't beat the RTX 3060 Laptop GPU without DLSS: the 4050 Laptop may have faster GDDR6, but it only has 3 memory modules, resulting in a 96-bit memory bus and 192GB/s, whereas the 3060 Laptop has slower GDDR6 but 6 memory modules, resulting in a 192-bit memory bus and 336GB/s. The only reason the RTX 4050 Laptop stays quite close to the RTX 3060 Laptop (around 10% behind) is its larger cache (12MB of L2 for the 4050 Laptop vs. 3MB of L2 for the 3060 Laptop). I overclocked the memory on my current RTX 3060 Laptop to a final bandwidth of 378GB/s, catching up to the desktop RTX 3060 12GB.
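The bandwidth figures in this thread all follow from the standard formula: peak bandwidth (GB/s) = bus width in bits × effective data rate in Gbps ÷ 8. A minimal sketch checking the laptop-GPU numbers above (the 16 Gbps and 14 Gbps data rates are the commonly quoted GDDR6 speeds for these parts, assumed here):

```python
def mem_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width (bits) x data rate (Gbps) / 8."""
    return bus_width_bits * data_rate_gbps / 8

# RTX 4050 Laptop: 3 x 32-bit GDDR6 modules (96-bit bus) at an assumed 16 Gbps
print(mem_bandwidth_gbps(96, 16))   # 192.0 GB/s, matching the figure above

# RTX 3060 Laptop: 6 x 32-bit GDDR6 modules (192-bit bus) at an assumed 14 Gbps
print(mem_bandwidth_gbps(192, 14))  # 336.0 GB/s, matching the figure above
```

This also shows why a memory overclock scales bandwidth linearly: raising the effective data rate is the only term that changes, since the bus width is fixed in hardware.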
No lol, the 3080's lead over the 6800 XT increases with resolution for two main reasons. One, the 6800 XT is a weaker GPU, but the main reason is that lots of cache enables much higher throughput at lower CU/core utilization, such as when CPU-bottlenecked at lower resolutions. BTW, that's literally all the so-called "driver overhead" advantage really was, and also why you stopped hearing about it with the 40 series: Nvidia added lots of cache too, and AMD's advantage disappeared. That being said, you're much more correct about these cards, but still somewhat misguided. Compute-wise, the 4060 is about 25% stronger than the 3060 when not memory-limited. The VRAM bus becomes an issue because these cards lost over a quarter of their memory bandwidth while the cache now has to feed a significantly stronger GPU, leaving the cache playing a game of mitigation rather than giving a performance boost. We saw with the 4090, whose memory bandwidth was kept roughly equal to the 3090 Ti's, that the cache has been ridiculously effective regardless.
I don't think the 3060 Ti's lead over the 7600 XT increasing with resolution can be explained as driver overhead, not when the CPU used for testing is a 13900K or 7800X3D. The reason its lead increases is the higher memory bandwidth. The same thing happens with the 3080 versus the 4070: the 3080's lead also increases with resolution, because it has more memory bandwidth. The 4070 has more VRAM and more cache, yet the 3080's lead grows as resolution increases. If cache or VRAM amount were more important than memory bandwidth, the opposite would happen.
@@Sp3cialk304 You missed my entire point. When a card is not hilariously under-specced like I just outlined the 4060 and 7600 XT both are (while the 6800 is not), bandwidth isn't very important, as the 4070 Ti Super proved. Do you get some benefit? Sure, but that card also conveniently proved that cache matters way more, hence why it underperformed expectations by so much. The 7600 XT should absolutely have a 192-bit bus or a lot more cache, because this time the cache got zero increase in capacity and memory bandwidth is barely better. Also, the scaling improvement you're seeing at higher resolutions is exactly why treating memory bandwidth as the be-all and end-all is stupid: the card barely benefits from its higher bandwidth until higher resolutions, unlike cache, which improves bandwidth but especially access times (and hence IPC) a ton regardless.
@@Frozoken Cache doesn't seem to help the 4070 when compared to the 3080. It looks a lot like the 3080's higher bandwidth helps more than the 4070's extra cache. Same thing with the 3060 Ti and 7600 XT, and with the 4070 Super and 3080 Ti. In every case, the card with higher bandwidth improves its lead as resolution increases, while the card with the extra cache falls further behind. It's pretty obvious which matters more.
lol, let's check out these new $300 GPUs. Start off with two games they can't even run at 1080p, lol. PC hardware is in a terrible spot. You have to spend $400-plus to get a decent gaming experience.
In India the price difference is around 6,000 INR, so here the RTX 4060 is cheaper than the RX 7600 XT. So I guess I would go with the RTX 4060. There isn't much difference, TBH.
The 7600 XT is a stupid card to buy at its price, and the 4060 Ti isn't much better!! Overall, both of these GPUs are good only to avoid while waiting for bigger price drops. The only new GPUs worth considering right now are the 6650 XT, maybe the 7600, and the Arc A580-A750.
Nvidia stays close to AMD even with 8GB of graphics memory. Imagine this card with 12GB: it would be overkill for 1080p gaming and entry-level for 1440p. The 4060 is the better pick for 1080p in this case.