It's a real shame that Daniel is genuinely doing free financial-related math for the lowest strata of human society deserving of all the persecution going their way: gamers. The guy is super selfless, but he's redeeming the irredeemable. P.S. I'm building a $4k computer, so don't @ me in disagreement
@@ilyadonskikh1868 no one cares about your opinion lmao, the game industry is way too big for you, even if you tried you wouldn't understand anything about games, maybe that's the reason you're a troll
The skeptic in me noticed that Nvidia did not differentiate between the 12GB and 8GB versions in any of their graphs. Comparing the 2060 numbers to the '3060' numbers, it kinda looks like they are using the 8GB 3060. IF that is the case, this is going to be another addition to the 40 series' trend of disappointment.
It will have the same performance (if the only difference lies in the size of VRAM) in games that do not rely on or require large VRAM. The difference is only felt in games that demand large VRAM.
@@bluefinder4052 VRAM is not the only difference: it's a 128-bit bus vs a 192-bit bus. Much lower memory bandwidth, resulting in much less performance, even 20% in some games.
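For anyone curious where the bandwidth gap comes from, it falls straight out of bus width times memory speed. A quick sketch, assuming both 3060 variants use the commonly quoted 15 Gbps GDDR6 (treat that figure as an assumption):

```python
# Memory bandwidth (GB/s) = bus width in bytes * effective data rate (Gbps).
# Assumed spec: 15 Gbps GDDR6 on both 3060 variants.
def bandwidth_gbps(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbps(192, 15))  # 360.0 GB/s -> 192-bit 3060 12GB
print(bandwidth_gbps(128, 15))  # 240.0 GB/s -> 128-bit 3060 8GB
```

That's a third of the bandwidth gone, which lines up with the "even 20% in some games" claim for bandwidth-heavy titles.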
Hi Daniel! You calculated the expected 4060 performance assuming that "1.2x" is the relative performance to the 3060 12GB card, while it might be "1.2x" performance of the noticeably slower and newer 3060 8GB variant (which is ~0.85x of the 3060 12GB).
True. Since they list the specs vs the 12GB model in the previous slide I gave them the benefit of the doubt, but perhaps we will be disappointed and find out otherwise.
Your question is already answered in a YT video, or you could answer it yourself by counting pixels or measuring the length of the bars on NVIDIA's 4060 press release. Some of us already know the official answer using Nvidia's own data.
You're an awesome uploader for gamers/price-conscious buyers (as we all should be, regardless of budget). I jumped on a 6800XT in October BEFORE the new announcements because I actually felt confident in my purchase thanks to your videos. Spent $300 out of pocket (MSRP was 550) after selling my 2080 Super, and couldn't be happier. Really nice to see an uploader focused on good content, with very in-depth work across numerous settings/GPUs tested. You're awesome dude.
I did similar. Got $200 for my birthday from family and sold my old 2070 super for about 150. Just about broke even on my 6700xt and am very happy with the upgrade. Should be set for quite a few years as I only play at 1080p most of the time. My CPU is getting a little long in the tooth though.
I am definitely with both of you. I was waiting for a 7800xt for my 5950x build, but I don't know if they're even going to make one. I'm not really a heavy gamer, so I bought a 6750xt for my brand-new 1440p monitor. Only $320; it should last me at least a couple of years. Hopefully next gen will be worth it.
It can't without being a different die. The bus width is only compatible with 8 or 16GB. Unless we see a return of 3GB modules, but even when those were used they were rare.
@@Hugs288 Both 7600 and 7600xt are supposed to be Navi 33. It won't have more ram unless it's doubled to 16. Navi 32 is supposed to only be the 7700/xt. If they lower 32 to 7600xt there will be little variance in that and the 7700/xt cause there's no way 31 will cover everything from 7700 and up.
Only for gaming. I would still advise the 3060 12GB for the same performance, and much better performance for anything outside gaming like AI-related stuff, Blender, or powering a tablet.
@@__-fi6xg You do know a lot of gamers will not use half those features, right? I had an RTX 2060S which was only used for gaming. Now I've been using an RX 6800 for 7 months and it's been a much better card overall. I only "game", so the 6700xt is the better card for gamers over the 3060ti.
@@iancurrie8844 Yeah, but does it even matter for gaming? Imagine this though: you are creating wonderful art with Stable Diffusion. At the 6700xt's speed, it's 1 minute per 512x512 image; on the 3060 12GB, 5 seconds tops for the same image. Now THAT'S a difference that matters if you value your time. Oh, and you can't use half of the functions with an AMD card, like making your own models, because training would take you several days instead of 4 hours.
I love the way the video is structured and divided with timestamps, the commentary is extremely helpful and you answer many of the questions on our minds. Thank you, Daniel.
Why is no one talking about the temperature? The 3060 is running between 55-60°C while the 7600 is between 75-80°C. That means the 3060 runs 20°C cooler! Impressive, and very important if you want a silent PC.
By default an Nvidia card typically prefers to run under 70°C and will start cranking up the fan when it reaches 70°C, while AMD cards since RDNA2 (I think) have more headroom, up to 85°C. This means the 3060 is noisier than the AMD card, since the AMD card can use a more relaxed fan curve while the Nvidia card has to use a relatively more aggressive one.
I found it interesting in der8auer's video tour of card manufacturing yesterday that they still appeared to be putting together new RX 6700 parts. So I guess these cards won't be disappearing in the short term. The 6700 is around 290 and the XT version is about 330 right now. The vanilla card is 160-bit with 10GB and the XT is 192-bit with 12GB. Both are also x16 PCIe 4.0.
Prices keep dropping! Just bought a new Asus TUF Gaming 3060 12GB OC in Taiwan for NTD9000 (US$300). Saw MSI 3060 TI 8GBs for NTD10900 (around US$350). Amazon prices seem to be in the same range.
Thank you Daniel for this comparison, I just bought the 3060 12gb and I am so happy with it, I use it more for video editing. The 12gb is something that I really needed
I totally understood what you meant about the different percentages. To be honest I don't know if you calculate the percentages manually and then write them into the video, but IF you do everything manually and not with software, I would change the stats a little bit: say in a particular game the 7600 is a lot faster than the 3060. I would put the 3060 as the 100% baseline and the 7600 at 1xx%, meaning the card is xx% faster than the 3060, and vice versa if the 3060 is faster. Not sure if I was able to explain myself, though. Other than that, good video! ^^
You mean, you want him to put the weaker card always as the 100% baseline, right?! The thing is, depending on quality settings, resolution or if RT is enabled, the weaker card might keep swapping with every change.
@@MaxIronsThird Yes, that's true, and it would totally change depending on the game, settings and RT. I was just suggesting it because it would be easier to read. As he explained during the video, having the slower card as the 100% baseline would always mean: "Okay, card X is (for example) 20% faster than card Y, the baseline." Whereas, as he said, it's not reversible, so the percentages are different if the higher-tier card is the 100% baseline. Obviously, if it's time-consuming he can just leave it as is, and we can work out the proportion for a specific game ourselves, or find other formats like a recap graph.
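The asymmetry being discussed is just division with a different denominator. A quick sketch with hypothetical FPS numbers:

```python
# If card X does 120 fps and card Y does 100 fps, then with Y as the 100%
# baseline X is +20%, but with X as the baseline Y is only about -16.7%.
def percent_faster(a, b):
    """How much faster a is than b, with b as the 100% baseline."""
    return (a / b - 1) * 100

x, y = 120.0, 100.0  # hypothetical FPS numbers
print(percent_faster(x, y))  # 20.0   -> X is 20% faster than Y
print(percent_faster(y, x))  # -16.67 -> Y is only ~16.7% slower than X
```

So "20% faster" and "20% slower" are not inverses of each other, which is why the percentages flip around depending on which card is the baseline.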
The reason people want the 12GB RTX 3060 is that Nvidia has great productivity performance, and the 3060 pairs that with a 12GB VRAM buffer, which helps it there a lot. So a lot of people bought the 3060. It also has what we expect midrange GPUs to have by now: 12GB of VRAM.
It depends on the user. 90% of the time here, I would choose the 7600 without hesitation (the exception being RE 4). I don't see myself playing any lower than around 85 fps, no matter the title or genre, and almost every time we were above that, the 7600 did better. I would agree 8GB isn't enough on the 3060Ti (even though I have it and don't run into problems for my preferences), but the 3060 seems a lot more niche. Again, it's very user-dependent at this tier of cards, but a 4060Ti with 8GB is ridiculous.
@@Whales_ I'm with ya mate. But for a gpu that's barely 15% faster than the 3060, uses about as much power, I'd have expected a significantly lower price, or at least 10gb of vram at its current price.
"Productivity performance" - no one that *needs* productivity performance would ever get a xx60 card. That's just nonsensical. If I absolutely need CUDA/Optix/nVidia exclusive stuff for work, I'd get a 4090.
@@galandilvogler8577 Not everyone can afford an RTX 4090, especially outside the US. Neither can they afford the other more expensive GPUs. Also, there are a lot of different productivity applications, and the RTX 3060 is decently suited to them. It's really helped by its relatively large VRAM; after all, it's the cheapest 12GB Nvidia GPU. Useful for folks who need that VRAM.
The 7600 is more silicon-efficient, that same performance and price is achieved from ~30% less transistors and ~48% less die space. That's not flowing through to the end consumer in any way though. And yeah seeing the power consumption figures the same in these side-by-sides surprised me. 2023, the year of meh, continues.
@@greebj That ain't the worst part. This RX 7600 was originally meant to be the RX 7600 XT; that's why it shares the RX 6600 XT's specs, while the original RX 7600 shared the RX 6600's specs with 1792 cores. So compare the RX 7600 to the RX 6600 XT and you'll see this is clearly just meant to be a low-end GPU, barely faster than the RX 6600 XT, maybe 10% at most. And the RX 7600 with 1792 cores would've been around 10% faster than the RX 6600 itself. So I'm assuming AMD is just upselling us low-end GPUs, like Nvidia is doing with the "RTX 4060" that uses an AD107 die, something reserved for the xx50 class.
Surprising to see the lowly 3060 still hanging in there. I've often thought it only got 12GB because Nvidia planned it as a 6GB card but found that was just too limited and doubled the VRAM. If that's true, it just shows even more how brainless it was to design new cards above a 4050 with a bus that can only use 8 or 16GB. Unless Micron suddenly decides to make 3GB GDDR6 modules, this is going to be a rough gen for the low end. If you're going to be bound by 8GB anyway, you may as well get a used 3070 while prices are dropping quickly as people want to get off 8GB cards.
It was given more VRAM only because of GPU mining; the idea being that more VRAM would mine more, and gamers just got the side effect. Look at the 3070 for a reference on VRAM struggles. That card was mostly for gamers who owned a 1080 Ti, Nvidia's biggest regret, since it was given 11GB of VRAM and people didn't need to upgrade. Jensen Huang even made a comment saying "my Pascal friends, it's safe to upgrade now," and those same people mostly bought the 3070 since prices got so inflated. And 3070 buyers are now starting to get crippled by their VRAM at higher resolutions.
@@azellcreegis8536 Vram has zero effect on hashrate. Otherwise the 3060 would have been a better miner than the 3070, which it wasn't. Vram only mattered if you didn't have _enough,_ and even if the 3060 had only had 6gb instead of 12, that's still plenty for the Eth algorithm everyone was mining. (You only needed 4, which is why people were still mining on old 4gb GCN cards as well.) Let's also not forget the first 3060s were LHR models, and it took far longer than anyone expected for that to be defeated, meaning Nvidia actually put some real effort into enforcing it. Now, that wasn't Nvidia pushing back on mining of course; it was most likely to incentivize miners to stay at the top of the stack, where Nvidia was selling cards at insane prices.

GPUs and CPUs are designed years in advance of actual production. To think a GPU die was made with mining only in mind is naive; otherwise you'd see every GPU in the stack get the vram treatment. Every GPU has a certain number of bus connections that's hard-designed into the silicon die itself and can't be increased. You can stack modules on a connection like the 3090 does vs the 3080ti to double capacity, but you can't just add an extra 2gb module to a 3070 to make it a 10gb card. The "plugs" are already full.

It's FAR more likely that I'm correct that they started testing 3060s at 6gb and realized a lot of then-current games suffered the same way we're seeing 8gb be a limitation today. Since you can't redesign the bus width quickly to add one more GDDR6 module, they simply doubled the memory, and doubling 6gb for the 3060 is obviously cheaper than doubling the 8gb used on the 3060ti and 3070. Where Nvidia failed was thinking 6gb wasn't enough then, but that 8gb would be enough now. 8GB is fine for a 50 class, but the 60 class and up should have been _designed_ to be 10gb+, since you can't adjust that at the last minute. It's either double it or nothing.
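The "double it or nothing" point above is just channel math: each GDDR6 module hangs off a 32-bit slice of the bus, and modules ship in 1GB or 2GB densities (3GB has been talked about but never shipped on these cards). A sketch of the possible capacities:

```python
# Possible VRAM capacities for a given bus width, assuming one module per
# 32-bit channel and only 1GB or 2GB GDDR6 densities (no clamshell configs).
def vram_options(bus_width_bits, module_sizes_gb=(1, 2)):
    channels = bus_width_bits // 32  # one GDDR6 module per 32-bit channel
    return [channels * size for size in module_sizes_gb]

print(vram_options(192))  # [6, 12] -> the 3060's only choices were 6GB or 12GB
print(vram_options(128))  # [4, 8]  -> an 8GB 128-bit card can only double to 16GB
```

Which is exactly why a 3070-style 256-bit card can be 8GB or 16GB but never 10GB or 12GB without a different die.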
Until Samsung and Micron decide to make 3gb GDDR6 modules, this isn't going to change.
You can't really say the rx6650xt's will stay at that price. Their supply will dry up, and thus the price will increase. So soon enough the rx7600 will be your only option below $300. Because the rtx 4050 will not exceed the rtx 3060's performance and will have just 6gb of vram. Likely gonna cost $250. And against that at $250, rx7600 might do alright.
I think the 3060 is still pretty good if you can get it for $250. I've used fsr and dlss and dlss is much better hands down. Plus I have a 1440p monitor
@ZeroTurn I don't get very good normal, non-upscaled fps. In fact, sometimes it's the same as my overclocked 1660 Super. But with DLSS it works and I can't tell the difference. With a 400W HP PSU I can't upgrade much more anyway.
All AMD cards are hampered by the fact that they do not have FSR 3 and AMD HYPR-RX right now. Also they need to release all of their new lineup. I am waiting for the 7800 XT to drop.
There's a good chance the 7800XT is not coming. I think there will be a 7800 non-XT that's slightly faster than the 6800 non-XT. We will see. The Navi 32 die only has 60 CUs, which is the same amount as the 6800. The N31 die is probably too expensive to make to sell at $600 and still turn a good profit.
Hello Daniel I just wanted to say that I love your videos and I can't believe I've been watching you for about one year and a half. Time flies. I just noticed that you have 126k subscribers, I remember when you had 30k or even less. Keep up the hard work and with that kind of growing rate you're going to soon be at 200k!
Don't forget the 30 series is only going to last as long as the current stock. I'm not gonna be a hypocrite and say I have AMD, because I have a 3090Ti that I got new in the UK on sale for 900, but I will say bad drivers and software support is a myth. The system I use at work has a 6950XT; I downloaded a few things like Adrenalin and it's got way more support than Nvidia's software, and workflow seems to be on par with what I have at home. Have a good day sir
Also: All the well wishes to you and your family. Always brightens my day to see a new video from you here. Even for the products I would never buy. I am generally interested in new PC tech stuff and your channel is absolutely awesome and on top of everything every single day. You love to see it.
Doesn't make sense. The 3060 may have 12GB, but as you can see here it doesn't really have the power to use them. When the resolution rises and you involve ray tracing etc. that might need the 12GB, the wins the 3060 gets are useless because it's still below 30fps. Whenever it's something reasonable for this card's power, the 7600 wins. So you are paying more for nothing (the 3060 is still a little more expensive) if you buy the 3060. You are gonna play better with a 7600.
@@SIPEROTH but does the 7600 have DLSS? Does it offer playable RT performance in most games that support RT? Does it have Reflex? Does it have CUDA? 12GB of VRAM doesn't require a GPU to be powerful in order to utilise it, textures have close to 0 impact on performance but use the most VRAM - if you have the VRAM required then you can use hi-res textures even on a slow card, and nothing has more impact on visuals than textures. 3060 has some extra features that just cannot be omitted because they have been proven to work well. If the company spent time and money developing those features and delivered the best solution on the market you can be sure as hell it won't be free.
These cards are only about the performance of a gtx 1080 (non ti). The budget market has really fallen off over the past 5 years. I think yall should still buy used if you're looking to spend around 300 bucks. Might be able to snag a rtx 3080 or a 6800 xt for that price with literally double the frame rates.
Many people blame the 4000 series for low performance, but these are actually perfect cards for notebooks. The 4060Ti at 30 watts performs better than the 3060Ti at 60 watts. That's insane.
Honest question: would you wait for the RTX 4060 to come out to see if the price of the RTX 3060 drops even more? If the retail price of the RTX 4060 is $300 flat, wouldn't that push Nvidia to lower the price of the RTX 3060 to $250? That might make the RTX 3060 an even bigger value proposition and worth the wait until July. For me that would make it that much more appealing.
Honestly, I was kind of expecting (unfairly, apparently) that the 7600 could be both my dream card and my late-game card for a while. I'm genuinely a bit disappointed with the 8GB of VRAM in it, though I should have expected it. I still need to do more research on whether it's worth it over the 3060 12GB, but this, overall, is very helpful with that. It's a convenient video, as I have been trying to decide if my next GPU should be a 3060 12GB.
It's like when the ketchup bottle is 18oz and says "50% more" than the 12oz, and people complain that it's only "33% larger." Relative to the 12oz size, it's 150%. It really is a commonly misunderstood concept, and honestly it probably wouldn't hurt to keep reminding people.
Or when I said I moved to a job that paid 50% more, then moved back (because it was a Nortel company) for a 33% pay cut. "At least you got a pay rise out of it!!!" This was in a supposedly intelligent and numerate workplace, and they didn't understand percentages. 100%→150%, and 33% of 150 is 50, so 150%→100%. But these supposed mathematical geniuses went "50−33=17%, a pay rise!!!!"
Do you think the extra 4GB of VRAM on the 3060 would be a tangible benefit specifically for VR? or is 8GB still plenty enough that it shouldn't make that much difference?
VR is always extra demanding, better to pick up more power (always more power, as Vergil would say). However, if VR or high resolutions are not really the target, I guess 8gb can be enough for midrange users. I am also confused with all this nonsense, because I want to move on to 1440p + super sampling, but who knows if 8gb will be enough
As a request and feedback at the same time, can you make a video focusing on the 1440p experience? I guess the PC market is finally transitioning to the current gen, from the outside it seems the PS5 and Xbox are designed to run games at 1440p with super sampling (ideally 60fps), 4k super sampling (pretty much always 30fps), so it would be good for testers to "forget" the 1080p for a while and start focusing on the console experience. Can you please do that? Try to simulate the PS5 experience the best you can, a lot of PC users are willing to upgrade, but if the GPU is not good enough to provide the console experience, then... really, what is even the point? As a GTX 1060 owner, I am sick and tired of 1080p, honestly, time to move on
Really wondering where the RX 7700 and 7800 series cards are... There's like a $500 gap in the RDNA3 stack they've yet to fill, and we're six months out from the RX 7900 XTX's release.
AMD is probably waiting for the previous generation to sell out, since new-gen cards would be a huge threat to selling through those older cards. I think that's the case here.
@@werwito6723 That doesn't make sense unless they plan on releasing them in 2024. der8auer made a video about a GPU factory and they were still making 6700s.
It's because the 7800/7800xt is like the 7600: almost zero improvement over the last gen. Imagine they give us a 7800 and it's like 10% more powerful for a 15-20% price increase. No one would buy that, so they wait till last gen is sold out so people won't have any other option but to buy the new "improved" 7xxx series, plus they get all the money from the 6xxx series. I would like to buy a card as well, but the 6700xt's 12GB is just not future-proof, and the 6800's price is just too big a jump from the 6700xt for me.
@@951001ify I can afford an RX 6800 but yeah it's a noticeable price increase from 6700 XT. But I agree with what you said about the 7700-7800 as it seems they are playing the waiting game on it and so are we.
Only if the 7600 was a 6600 replacement. It's a 6500XT replacement. Still better than Nvidia, where the 4060s need to be replacing the 3050s, the 4070s replacing the 3060s, and the 4080 replacing the 3070.
The 7600 is still a 2023 card with more advanced features, hardware acceleration, etc. VRAM isn't a factor unless you're playing at higher resolutions, which most aren't doing anyway. These are budget 1080p cards; if VRAM is a factor, just turn down the settings a little. And no one I know even uses ray tracing when gaming on PC, or new consoles for that matter. The 3060 12GB isn't a bad card, but the 7600 (without all the ray tracing/DLSS crap) still outperforms it overall. Go watch other comparisons. There's more to it than just FPS.
Nvidia and their strategy of skimping on VRAM has caused quite a lot of chaos. I mean, if the RTX 3060Ti and RTX 3070 had 12GB of VRAM and the RTX 3080 had 16GB, those would be such good GPUs today and for a couple of years to come. But yeah, a lot fewer people would then be enticed to buy cards like the RTX 4060Ti 12GB and the 4070/Ti. I wonder if AMD can bring, say, an RX 7700XT with 16GB of VRAM, and whether we are even getting an RX 7800XT at all, as it perhaps should have been what is now called the RX 7900XT.
While it's easy enough to add more VRAM, Nvidia doesn't want people to hold on to their cards for so long. More VRAM means better longevity, and needing more of it is one of the main reasons people end up upgrading, even the stubborn ones who are fine with low settings and fps. So yeah, these cards should have had more VRAM, but that doesn't benefit Nvidia's business in the long run; that's why you have to pay a premium for it. Remember, Jensen said: the more you buy, the more you save!
It's funny how in the US AMD is always a good deal, while in other countries it isn't always so cut and dried. Here in Brazil, the 3060 is actually a good deal since Nvidia prices got greatly slashed here. AMD, on the other hand, has sold out most of its 6000-series budget/midrange cards. Here the 7600 is also a good deal, because there are barely any 6700xt/6650xt/6600 left, so it is fulfilling its role, as it's priced pretty much exactly the same as a 3060. So what do you prefer in this situation? Both are very expensive by US pricing standards though lmao
Bought a B-stock EVGA RTX3060 12GB for $250. Running games well and it's great to have extra breathing room for VRAM hungry tasks like Stable Diffusion.
If anyone is waiting for the 4060, be aware of the PCIe 4.0 x8 bus interface, as it will probably not work well on Gen 2-3 motherboards... which could actually affect the majority of the audience looking for a GPU upgrade in this price segment. Might be OK, but if you check the 4060Ti on PCIe 3.0, it pretty much performs like a 3060Ti. Let's hope Daniel will cover it if he decides to do some benchmarks with the card.
@@kennethpereyda5707 Nice catch, I didn't know this. I'm not too interested in these cards, but it's a bit weird that 99% of the YouTubers tend to miss this aspect; definitely worth mentioning.
I've checked some benchmarks to get some extra perspective. It seems like it's not affecting every game that heavily, but when it does, the 40-series card loses almost all of the tiny bit of extra performance it actually offers over the Ampere one. I'd much rather trust der8auer with this than some other random YouTubers.
Great video Dan. Recent price cuts to the 3060 12GB make it finally a decent deal! They will clear out eventually as this is clearance pricing thanks to AMD heh. Here in Canada the 3060 12GB is $410 and up still where the 7600 is $360. I still would likely steer a friend to the 3060 12GB when they are that close though for the extra 4GB VRAM. Back in the USA, the 7600 might have to come down to $249 unless AMD wants to wait months for the 3060 12GB stock to dry up!
Nah, it's not a "good deal"; in fact, it's embarrassing: a 10% discount on MSRP after it's been replaced 2-3 years later. When you put the math in a spreadsheet, the value-normalized price for the 3060 vs the 4070 is $298, based on 3DMark Time Spy. It gets worse for games (DX11, DX12 and Vulkan), where the 4070 at $600 is better value for the money than the 3060 12GB at $300. Not by much, but it's there, in the margin of 10%, and you get more features (AV1 encoding, DLSS3) and much better power efficiency. The 3060 12GB goes from time to time for $/€200 (saw a few this week, sold within 30 min), but even then it's 15% worse price/perf than a used 3070 8GB, which is a lot faster... The only use case for the 3060 12GB is video editing and AI, not gaming at anything higher than 1080p.

Nvidia is still cheaper than AMD when you put everything in perspective for the EU, though in the US, AMD is a lot cheaper. There are still scammy new and used prices for the 6800 (€540 new and €400 used, with the XT at +€60), while in the US you could buy the 6800 for $400... whereas in the EU you can buy Nvidia at exactly MSRP, the $600 4070 for €600 incl. VAT. I've told people before why Nvidia has more market share than AMD: they are everywhere in supply AND cheaper than AMD most of the time (new or used), whereas AMD ALWAYS has low stock and/or is NEVER resupplying GPUs (CPUs are fine). Not only in the EU, but also in SEA; I've been there and seen it myself over the decades. The Radeon group has been a disaster for many years, and don't confuse that with the CPU group; they are not one and the same, despite operating under the same corporate name.

The low-mid range these last years has much worse price/perf than the mid-high end, for both Nvidia and AMD, new and used, than ever before. If you don't need to upgrade, wait for 3nm and skip everything GPU-related at the moment, because second-hand prices are still insane too, as greedy as Mr. Leather Jacket himself. That's the hypocrisy of sheeple vs how used prices ought to be, and still are in other markets: at least a 50% discount vs MSRP after 2 years for deprecated technology, and less than 20% of MSRP for obsolete tech. People are heavily conditioned already.
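To make the "normalized price" idea above concrete: scale one card's price by the benchmark-score ratio to see what the other would need to cost for equal value. The scores below are hypothetical placeholders, not the figures the commenter used:

```python
# Value-normalized price: what card B "should" cost to match card A's
# price/performance. perf values are hypothetical benchmark scores.
def normalized_price(price_a, perf_a, perf_b):
    return price_a * (perf_b / perf_a)

# e.g. a $600 card scoring 18000 vs a slower card scoring 9000:
print(normalized_price(600, 18000, 9000))  # 300.0 -> equal value at $300
```

If the slower card sells above its normalized price, the faster card is the better deal per dollar, which is the comparison the spreadsheet is doing.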
@@TheCoolLama That's why I built my last customer's system with a 4070. He couldn't afford a 7900XT/4070Ti-class card, and the 4060Ti/3060Ti/3060/7600 were not going to be robust enough to get him into 1440p land into the future.
@@F1neW1ne That's a solid choice and I'm pretty sure (s)he will be very happy. I think they will sell out pretty fast before the next batch arrives. I just noticed all €600 cards are sold out here in the Netherlands and got the most 5-star reviews. Better value than the previous up to mid-high gen perf/price and perf/W with more features and having the latest and greatest.
@@F1neW1ne Checked my spreadsheet again and fetched the latest prices, good pick mate. Amd 6800 here is €540 vs just sold out 4070 €600, choice is easy to make here. 4070 is faster, more efficient, no wild power spikes, less multi monitor power consumption (idle), more features and got > 60 fps for 1440p ultra in wh3 (dx11), metro exodus (dx12) and rdr2 (vulkan). It's even good for 1440p uw > 60 fps (except wh3 49.39fps 1% lows, which still is acceptable with average above 74).
@@TheCoolLama Well ya, the 4070 is roughly 3080/6800XT territory. I always buy the fastest for the money. Sometimes it's AMD, sometimes it's Nvidia. I play with enough cards to know it's simply "buy the fastest card you can for your target resolution," and the rest is just fanboyism, which I couldn't care less about. Outside of being more unhappy with Nvidia for recent and past attitudes. Not a fan of Jensen, actually.
Great video, thank you! It's almost ridiculous here in Germany: the cheapest RX7600 is roughly 290€, the cheapest RTX3060 12GB is 300€ with 50% more RAM, better upscaling and faster RT ... It's almost like AMD are trying to lose this. One good thing about this situation: I can finally stop rooting for the "underdog". They are just another shareholder driven, capitalist company.
3060 12gb and RX7600 are equal in price here in Poland, both starting at 300€. We rarely get an AMD GPU that's actually a bargain. AMD was always more future proof due to more VRAM, but even that is not the case between those two cards. In my opinion being able to use DLSS, having actually usable RT performance and 50% more of faster VRAM makes 3060 a better deal as of today, even though this GPU started its life as a pretty bad value product.
@@markhackett2302 But NVIDIA's NVENC outshines AMD's encoder, so I'm also considering that. FPS is not a big issue since I want to stream games such as Valorant.
@@UncannySense She has a 5600x and a 1660 Super 6GB, and she's easily at 1080p around 60 to 90fps, which is pretty good considering all that spaghetti code. Sometimes she hits around 130. I think a 7800 will be much, much better.
7600 vs 6650 XT: set the max clock to 2GHz, clock the 7600's memory down to match the 6650 XT's bandwidth exactly, and run some tests. See if AMD has any actual architectural advancements in a pure core-for-core test.
Laughs in "6900xtxh rog strix liquid cooled TOP edition" preused deal at reasonable price. I feel like skipping the next 10 gens and 50 years.... especially skipping ngreedia. Btw, nice video highlighting the 6000 series (6700xt 12GB)... this is the real elephant in the room for most users.
I'm looking at both cards and in australian dollars the 7600 is $419 and the 3060 12gb is $459. I don't really play a lot of demanding games, the most demanding games I like to play are resident evil 2 remake, gta v and rdr2 and I like to play my games at high 1080p. Could I get away with having 8gb of VRAM with the 7600 or should I get the 3060 12gb and spend the extra $40?
The comparison I've been waiting for. Both are shown to have compelling reasons to buy: the RX 7600 has better raster perf at reasonable settings, yet the 3060 has better RT, 12GB VRAM for better longevity, AND the standard feature wins of Nvidia's software suite. To me, price parity is where these cards should be sitting due to these differences, and I hope the 3060 12GB does stay in production at this 'budget' tier. An interesting thing to note is the power usage, with both cards floating in that 150-175W range.
Do you really care about RT in that tier? When a lot of the times you're not even going to be reaching past 30fps with RT turned on at native 1080p resolution in modern RT titles. Dlss or FSR hardly boosts fps and looks like utter shit a lot of times at that resolution target.
@@lakibadhikari7930 To me PC is all about experimentation, and the Nvidia suite offers that a lot more than AMD. If I just wanted pure optimized perf, I'd go with a console. Likewise, I'm personally interested in the Arc card, even if their value doesn't make much sense going forward atm.
@@AwSomeNESSS I'm also personally interested to see where those Arc cards go. But if you're looking at things objectively between the 3060 and the RX 7600, the 7600 should be the better buy because it also has the AV1 encoder. Personally, I wouldn't buy either of those. I'm waiting for a market reaction and a price drop on these cards.
@@lakibadhikari7930 The AV1 encoder isn't needed by most people; you can watch AV1 streams on just about anything. Nvidia's non-AV1 encoder is decent anyway if you want to do some casual streaming with it. AMD's isn't, so it makes more of a difference between the old and 7000-series cards when deciding what to buy. Here is how it really is:
7600 pros: 10% faster raster on average (there's a meta review that gives 11%), AV1 encoder.
3060 12GB pros: 12GB of VRAM, DLSS for better low-res upscaling, DLAA, faster RT (usable, I'd say, at 720p upscale), an NVENC encoder that's pretty good so missing AV1 isn't as big a deal as it would be on an AMD card, and better software support for apps or machine learning right now. Then there are the more niche things like VR support, Reflex, or that new weird Remix thing.
Power use is very close, basically a tie. I can't really see any reason to buy the 7600 if it's priced much above 200.
@@noway8662 I don't think you can compare NVENC to a hardware AV1 encoder. You can get vastly superior quality at the same bandwidth, or lower your bandwidth usage and still get decent quality. Now with the introduction of AV1 support on YouTube and upcoming support on Twitch, I think it makes more sense to buy into that capability if you plan on doing any streaming or recording. DLSS 720p RT is not a target I would personally ever aim for. I think traditional illumination techniques can still look very good (sometimes even more apt) and give an artistic vibe. Therefore, I would never consider a 3060 12GB a viable choice.
I have 2 gaming PCs, one with the 7600 and one with the RTX 3060, plus a Ryzen 5 2600 CPU and a Ryzen 5 7600 CPU. My question for you is: which GPU should I pair with the Ryzen 5 7600? I play Warzone and Fortnite and I stream on TikTok with TikTok Live Studio.
Geizhals: MSI Gaming X 6800 around 470€, Gigabyte Eagle 4070 around 630€, ASRock RX 6700 XT Phantom Gaming 400€. Some cooling variants are even 30€ cheaper. The 6800 and 4070 are very close in benchmarks; in some games the 4070 wins by 10%.
The fact that AMD hasn't figured out frame gen is a major problem. That's pretty much the big selling point this generation. I know some people don't value it, especially because in multiplayer the added latency really hampers the experience, but for single-player games it's actually the one saving grace of this generation that might provide some kind of longevity. The 7600 is just not powerful enough to justify its price if you don't have frame gen to boost performance where available. Until they figure that out, just buy the 6000 series or a used 5700 XT.
I agree, fair point. Considering how sketchy this PS5 generation is looking, with the quality of the ports not being good, who knows; better to have extra tools to stretch out the life expectancy of a GPU than wait and see if the devs are going to optimize stuff. This generation kind of reminds me of the PS3 generation, back when I migrated to PC: the (rare) ports were not good, and we had no idea if the hardware was "future proof" or not. The first years of any generation are usually rough, but this particular PS5 generation is looking ugly. Three years in and we finally had a big hitter; Baldur's Gate 3 looks like an all-time great using current tech (to some extent), but so far the PS5 generation hasn't presented anything impressive enough imo. Cross-gen is looking good, but current gen is looking mediocre. So really, if the current games aren't even that good, better to wait and see.
Am I the only one missing our beloved RX 6700 10GB in this comparison? These prices and results don't apply to the European market; it only takes a search on a price-comparison website. Buying an RX 7600 or an RTX 4060 Ti may make a lot of sense here. I also miss my deleted comment; what a pity to have done something wrong without knowing. Keep up the good work.
@@Alejandro.budget.gaming That's weird; this channel shows the 3060 is 35% faster than the 6700 in Cyberpunk at Full HD with RT Ultra. The 3060 with DLSS Quality = 72 fps average, the 6700 with FSR Quality = 38 fps; Nvidia has almost double the fps. The 3060 is faster in every RT game by a lot.
Nope. The card being released or tested should be the 100% baseline. In THIS SPECIFIC CASE you could argue the 3060 should be, but that's because the 4060 release info is based on the 3060 too, so if the 3060 were "100%" it could serve as the baseline twice. It doesn't take much brainpower to do the conversion: the AMD card is 14 points faster out of 86, so less than 20% but more than 15% faster. To rebase 86 up to about 100 you increase it by roughly 1/7th, so the 14-point gap also grows by about 1/7th, to 16. So those are two ways to do it in your head.
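The rebasing described above can be sketched in a few lines. The numbers (3060 at 86%, RX 7600 at 100%) are assumed from the chart being discussed; the function itself is just a generic percentage rescale.

```python
def rebase(value: float, new_baseline: float) -> float:
    """Rescale a chart percentage so that new_baseline becomes 100."""
    return value / new_baseline * 100

# Chart as published: RX 7600 = 100, RTX 3060 = 86.
# Rebase so the 3060 is the 100% reference instead:
rtx_3060 = rebase(86, new_baseline=86)    # 100.0
rx_7600 = rebase(100, new_baseline=86)    # ~116.3

# The 7600 comes out about 16% faster with the 3060 as baseline,
# matching the "14 out of 86" mental math (14 / 86 ≈ 0.163).
print(round(rx_7600, 1))
```

This is the same arithmetic either way: the ratio between the two cards never changes, only which one is pinned to 100.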
I appreciate that you're trying to give a balanced view of this, but come on, no one is actually going to enable RT on these entry-level cards, so it's kind of irrelevant how each performs with it on.
YouTubers: 30 fps is not playable
Microsoft: 30 fps is all players need
YouTubers: 8GB is not cutting it
Nvidia: 8GB is great in 2023
These companies hate us lol
Thanks for taking the time to explain the math... I know how to do it and I definitely know how NVIDIA would do it... but it's very valuable info these days given the world's lack of education.
At this level of GPU, ray tracing and productivity arguments are pointless. No one should purchase such a GPU looking to use those features. All that counts is straight rasterisation.