@@dycedargselderbrother5353 720p is simply out of the picture these days. Seriously, I'd rather not play a game in 720p at all. Unreal Engine games simply look more horrendous in 720p; this issue began to emerge around 2015. I mean, I can stomach 720p in old games with no issue. But after playing, say, DMC5, Gears 5, and Civ 6... I despise playing any modern game in 720p.
@@nickochioneantony9288 I think (I hope) that he meant the 900p and 720p in the context of handhelds, like Steam Deck and RoG Ally. 720p on an actual standalone monitor.... ugh.
@@GewelReal so? Since the image looks identical to native 1080p, what's the problem? What is this obsession with the amount of pixels? The final result is what actually matters. For example, if DLSS 4 can make 360p images look like 1080p and 720p images look like 4K, I will be extremely impressed and happy.
@@StudioKelpie1993 if only people didn't just blindly buy Nvidia anyway. AMD can literally bring the sun moon and stars and people would still buy Nvidia
Yeah, because 4060 = 4050, 4060 Ti = 4050 Ti, 4070 = 4060 Ti. They scammed people but got caught lying with the cancelled RTX 4080 12GB that became the RTX 4070. None of the current-gen GPUs are worth their MSRPs, except maybe the RX 6700 XT.
I just upgraded to a 4060 (and a whole new PC to boot). Coming from a Haswell-generation i7 that originally had a GTX 770, upgraded midlife to a GTX 1060. I've only been playing with my 12th gen i7 and 4060 for a few days, but I'm pleasantly impressed. Much faster than my old setup. I understand people being upset it's not a substantial upgrade from the 3060, but who upgrades their GPU every cycle? Coming from an older card it's fantastic.
Same. The card is overpriced (but so are all of them) and is lacking in VRAM (but again, so are most of the Nvidia ones), but it is still pretty decent for people willing to use upscaling, frame generation and sensible settings.
One thing I haven't come across with other youtube tech channels is testing this card on a PCI-e gen 3 motherboard so I'd definitely be interested in seeing how this card with an x8 connection fares against a 3060 with a x16 connector. See if it makes a difference.
I believe I saw TechYesCity make a video comparing the 4060 Ti's speed on PCIe gen 3. If I remember correctly, the performance hit was negligible; you might want to check it out. If he sees a performance drop on the new 4060 at PCIe gen 3, he may make a video about it too.
If you are a budget gamer and want a strong 1080p card, the RTX 4060 is actually a good card to choose. It has support for DLSS 3, including frame generation. The entire 40 series is overpriced for what you're getting, but the 4060 happens to be the one card that is actually VERY competitive in what it offers for a 1080p gamer, and let's be honest, that is most of us.
Happy that you’re getting companies sending over cards for review. Saves you the monetary fuss and provides everybody valuable information. Well deserved!
@@GewelReal The 1060 was on par with the 980...to call the 4060 a decent card when it can't even match a 3060 Ti is hilarious. Wake up and smell the roses.
The problem I have with the 4060 is that it's effectively a 1080p card offering frame rates that aren't required, while at the same time being somewhat handicapped at higher resolutions due to the VRAM limit - making both the 4060 and 4060 Ti somewhat pointless. The 2060S can play most of the same games at 1080p High/Ultra with perfectly playable frame rates. The only downside of the regular 2060 is the 6GB of VRAM, but for just a little more you get a good bump in performance and 8GB of VRAM with the 2060S model, which will play all the games at 1080p that the 4060/4060 Ti can for a fraction of the price, still above 60fps.
The 4000 series is like paying for one of those broken Early Access games, but this time with features that can't be used as intended due to the small amount of VRAM and the narrow bus that constantly chokes these cards. The mid-range 5000 series should be much better; I hope those GPUs get way more VRAM, plus RTX 4070/4070 Ti performance for below $400.
No, PCIe 2.0 vs 4.0 - e.g. test on the Xeon. I wonder if the 2060 can give higher performance in some games (like The Last of Us) thanks to its full 16 lanes, compared to the 4060 on old PCIe 2.0. Based on my tests, a 1650 Super paired with an i7-2600 can already utilize up to 70% of the PCIe 2.0 x16 bus while playing The Last of Us at roughly Medium settings with High textures (30 FPS average, 20 lows, 50 tops).
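A rough back-of-the-envelope for the lane math in this thread (the per-lane throughput figures are approximate post-encoding values, and the names here are just for illustration - not from the video):

```python
# Approximate usable per-lane throughput in GB/s, one direction,
# after encoding overhead (8b/10b for 2.0, 128b/130b for 3.0/4.0).
# Figures are rough, for illustration only.
PER_LANE_GBPS = {"2.0": 0.5, "3.0": 0.985, "4.0": 1.969}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Total one-direction bandwidth of a PCIe link in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

# The RTX 4060 is electrically x8, so on an old PCIe 2.0 board it only
# gets a 2.0 x8 link, while an x16 card like the 2060 gets 2.0 x16.
print(round(link_bandwidth("2.0", 8), 1))   # 4.0 GB/s  - 4060 on the old Xeon
print(round(link_bandwidth("2.0", 16), 1))  # 8.0 GB/s  - 2060 on the same board
print(round(link_bandwidth("4.0", 8), 1))   # 15.8 GB/s - 4060 on a modern board
```

So on a PCIe 2.0 platform an x16 card really does get roughly double the bus bandwidth of the x8 4060, which is the commenter's point about The Last of Us.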
Yeah, I am still running an EVGA 2060 KO. I only recently went to 1440p and it still does well; I just cap framerates in many games to stop the 2060 from running maxed out. Even though the 4060 Ti is not well received, I am probably going to get it over an AMD card for the lower wattage and heat generation in my small gaming room.
I can absolutely see your point. While I do not like the rtx 4060, it cannot be ignored that the efficiency gains and lower power requirements are quite attractive, especially for hotter regions of this planet.
Good video. Looks like a good card to me. If you're upgrading for the first time in a few years and all you want is modern games running at 60fps at 1080p, then it's all you need. I made the decision to get it after comparing the AMD equivalents; this one runs cooler and more efficiently at the cost of 10fps less than its equivalents, but it's still over 60, and that's all I need.
Exactly. Everyone losing their shit over how terrible this card is needs to take a step back and think how ridiculous they sound. For a casual 1080p gamer this is a no brainer. I got a new one off ebay today for £240. And I'll probably get £80 for my gtx 1070. So £160 for a card that will do me for 5 years and uses very little power is fine by me.
I want to see that frame gen on an old PC, but I'd also like to request, if possible, that you get a 6700/6700 XT and test it against the 4060 in the same old system.
The 1070 is pretty old at this point but because of the larger memory bus width the card scales linearly with resolution. The 4060 performs ok-ish at 1080p but will struggle with the newest titles even at just 1440p. Super disappointing generation.
@@AnimeUniverseDE Yeah, it is a disappointing generation. I can't wait for the 5000 series, because this generation focused more on power efficiency while the 3000 series was more about performance. Hopefully the 5000 series will combine both.
Yes, please.... would like to see older PC v newest PC comparison for the RTX 4060 especially PCIe 3.0 v PCIe4.0 comparison.... as my PC uses the older standard. Also interested in GTX 1070ti (my video card) v the RTX 4060, is the upgrade worth it? In light of vram limitations on both cards. I am trying to decide best upgrade path, but so far I haven't found the best one. I like Nvidia cards, but the low vram has me wondering what to do.
@@AuDiGo6 You are most likely correct. In fact, I believe the RX 6700 is probably the best bang-for-the-buck choice. But there are other considerations, most important of which are my own biases. I don't want last-gen cards and I prefer NVIDIA... so I am frozen on what I already own: the GTX 1070 Ti. What I would like to see is a head-to-head comparison of the 1070 Ti vs the 4060, so I can judge the value of the upgrade (to me).
Just upgraded from a 1660-Ti to a 4060-Ti, couldn't be happier. FPS is a little higher than your reviews with the Ti version. If you do 1080p/1440p gaming it's a perfect budget card, but get the 16GB version if you want to do 4K (or a 4070).
Correction: if I may, you seem to be confusing "DLSS" with "Frame Generation"... not a huge deal, but I'd like to clarify it for you, friend.

DLSS is Deep Learning Super Sampling (NVIDIA), a suite of functions - one of which has the video card render the game at a lower resolution than the setting chosen in the game. For example, if you set a game to 4K (2160p) and then [say, to improve performance because you have to] configure DLSS to run at "Ultra Performance", you are actually telling the video card to render the game at only 720p... and then it 'stretches' the image up to match your monitor's resolution (or the resolution you've chosen in the game's options). [This is the upscaling part.]

Frame Generation is how the video card 'fakes' frames in between other frames, 'guessing' at how the next shown frame will look based on the differences between two analyzed frames. For example, if two frames look almost exactly the same except for a tumbleweed that rolled a bit in one direction, the Frame Generation code will analyze the differences - in this case, find that only a few pixels changed for the tumbleweed - and use that information to 'fake' a frame in between those two frames instead of actually rendering it. ["A.I." here is just programming: code the software runs to analyze and then predict the in-between frame.]

Frame Generation (AMD has this too, called Fluid Motion Frames) actually HURTS responsiveness - the extra calculation to 'fake' the in-between frames takes time, and it adds LATENCY, i.e. perceived delay between human input (moving the mouse) and the result on the screen. Those "A.I. generated frames" are not actually responding to the player's input; they are more like static paintings on the screen, so they cannot be moved or changed by whatever the player does. [NVIDIA has "Reflex" to try to combat this latency; AMD has "Anti-Lag" to do the same for Fluid Motion Frames.]

To be fair, AMD has the same overall set of functions as DLSS - AMD calls them "FidelityFX". For example, NVIDIA has upscaling as part of DLSS (the video card renders only a small version of the frame, then stretches it up to the desired resolution); AMD's equivalent is called "Super Resolution". [In both cases an anti-aliasing effect also occurs during upscaling, which helps 'get rid of the jaggies' at the edges of rendered shapes.]

I'll stop there, sorry lol... I'm just trying to help clear things up a bit, as it sounds like you are saying "Frame Generation" as though it improves responsiveness, but it is the upscaling part of DLSS (rendering the game at a smaller size, then stretching it up to fit the screen) that actually improves performance - and the more aggressive the performance setting you choose, the smaller the game is actually rendered before being stretched out. HTH mate
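The in-betweening idea this commenter describes can be sketched in a few lines. This is only a conceptual toy - real frame generation uses motion vectors and optical flow hardware, not a plain pixel blend, and the function name here is made up for illustration:

```python
import numpy as np

def interpolate_frame(frame_a: np.ndarray, frame_b: np.ndarray, t: float = 0.5) -> np.ndarray:
    """Synthesize a 'fake' in-between frame by linearly blending two
    rendered frames; t=0.5 gives the midpoint. A crude stand-in for
    what frame generation does with motion estimation."""
    return (1.0 - t) * frame_a + t * frame_b

# Two tiny grayscale "frames" where one pixel brightened between renders
a = np.array([[0.0, 0.0], [0.0, 100.0]])
b = np.array([[0.0, 0.0], [0.0, 200.0]])

mid = interpolate_frame(a, b)
print(mid[1, 1])  # 150.0 - the generated frame sits between the two real ones
```

Note that the generated frame is derived purely from the two already-rendered frames: no player input reaches it, which is exactly why frame generation adds latency rather than responsiveness.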
I pretty much enjoy reading the comments here: how people are looking at the positive rather than the negative, being optimistic rather than pessimistic. I hadn't upgraded my PC in a decade (a Core 2 Duo with no GPU). I had no experience with PCs until September 2023, when I began to study more.
I'd like to see the 2060 Super and 1080 Ti, with drivers released before the next generation, in games released before the 20 series launch, vs the 4060, 7600 and 3060 Ti.
Awesome video as always, congratulations. I would like to suggest adding Street Fighter 6 to the benchmark of the graphics cards. The game is new and it's kinda demanding to run the World Tour mode. You can use the Demo, it has the World Tour. Cheers
Thanks for the review. Other than frame gen and power consumption, which are important for some people, $300 is certainly too much for an 8GB GPU in 2023. If someone is fine with an 8GB card and wants Nvidia, the used 3060 Ti is the deal right now IMO. I am seeing them locally for 250 or even 240. It's a faster card, same VRAM, has more bandwidth, and still has NVENC, DLSS and RT if anyone wants to mess with that.
The 2060 was one of my favorite GPUs! I got one in 2020 right before the GPU market crashed, and used it until 2022. It treated me well at 1080p and it's great to see it holding up well these days. I ended up upgrading to the 4070 Ti, but I gave my 2060 to my cousin who plays esports titles. So it's still chugging along no problem.
But his settings are carefully masking the ugly face of the card. I'd say these aren't optimized settings so much as settings to mask Nvidia's greed.
I've just bought the same specification as you've used here for $700 (in my country we have 30% tax on everything), so this clears up a lot of the shrouded mystery. Thank you!
@NguyenVu-dk3pv I believe so. In some games it might bottleneck at 118-120fps at 1440p because of the CPU, but I mainly got the PC to play the Battlefield games, BattleBit, the Hitman series and more. I'll do some proper testing and reply here to clarify. There are quite a few reviews of the Harry Potter game with the same spec that got around 89-110fps with the 4060, and 120-142 with a better CPU and memory. I believe you should pair it with a much better CPU if you can; however, everything from AMD is sold out in my country.
I know that the 4060 and 4060 Ti are a rip-off for someone who wants to upgrade from a 20 or 30 series, but for someone with an earlier gen like the 10 series I think it's a more suitable upgrade. That VRAM limit is frustrating though.
I had to replace my dying 5700 XT, and after being sick of stability issues I really didn't want another AMD card, which left the 4060 as the only thing in my price range. While the performance is meh and definitely not an improvement over the 5700 XT, the one thing that has impressed me about this card is that it's so power efficient and quiet. Without a graphics card sounding like a jet engine during takeoff, I can game without headphones on, which is a massive quality-of-life improvement. I wish the performance and value were more of an upgrade from a card that cost me the same in 2019, but I will take quiet running and not crashing randomly over more frames.
I don't understand why people shit so much on the 4060. Of course, if you have a strong 30 series card you won't be buying a budget 40 series, but if you want a good upgrade from an older GPU and don't wanna spend too much, this is incredible... especially if you don't want to upgrade your whole PC. It's power efficient and has access to newer software and drivers. Just because it isn't a HUGE improvement over the 30 series doesn't mean it's bad.
Finally, someone who uses his brain. I'm going to upgrade from an i5 7400, 1050 Ti and 16GB RAM to an i5 12400F, RTX 4060 and 32GB RAM. It's gonna do just fine for what I need.
Nvidia is not quite telling the truth about power consumption. Igorslab has tested the card. The true power consumption is about 130 watts on average. The idle power consumption is also very high. This seems to be a driver problem.
@@tyre1337 3:27 90%, 5:45 82% 6:07 66% 7:14 80% 9:23 80% 13:30 83% 12:02 41% and this are only the figures on the recording footage, they might have dropped even lower in the respective benchmark run. 4.0GHz ain't gonna cut it in 2023.
@@samserious1337 esports and old games will have low gpu utilization regardless of cpu used, at HUNDREDS of frames per second you're more likely hitting game engine limitations than cpu limitations
For me, this card is the best choice. Any alternative, whether it's AMD or older Nvidia, is either worse or more expensive here. 1080p is great for me, even though there are games I prefer to play at 1440p, like Hell Let Loose, but the card is strong enough to handle those even at 1440p.
What's the advice if you're looking to go from the 2060 Super to the 4060? I also need to upgrade my CPU. Is it best to upgrade the CPU first, then the GPU, to avoid a bottleneck?
I thought I liked these videos you make because I liked computer hardware more than I really do. I realized the real reason is that I like you. You make me feel comfortable, Simon.
This is definitely more of a $200-225 card, especially when the RX 6700 series cards are giving it a run for its money and, in a lot of situations, exceeding it. Nvidia is looking for a premium for improved DLSS hardware, which is fine, except these upscaling technologies are being used as a crutch by developers at the expense of raw performance in a lot of cases. PC gaming is in a not-so-great space right now as a result, for sure.
My question is who exactly is this card intended for? Adding a card like this to an older Skylake/Kaby Lake era CPU and PCIe3 motherboard is a complete waste of time and money, frame generation or not and I wouldn't even consider this card for a newer build. I have a Lenovo m910t from 2017 with an i5-7500, 16gb DDR4 2400mhz ram and PCIe3 paired with a AMD 6750 XT, which is roughly equal to this card. In every single benchmark the potato i5 is the bottleneck with the graphics card rarely using more than 100-150w and never getting a chance to raise its voice. Adding an i7-7700 did nothing to address the RAM and CPU bottleneck inherent in older pre-built motherboards. On the flip side Steve I'm really happy that the manufacturers are starting to pay attention to your channel and the massive work you've put into it through the years. Without a doubt your channel is my favorite old hardware in modern times comparison and I genuinely look forward to new postings.
Palit is the biggest AIB on the planet by sales volume. Asia is just their primary market. I had a 980 Ti Super Jetstream and a 1080 GameRock and they were two of the best cards I've ever owned in terms of cool and quiet operation. As for "reputable", which AIBs do you consider "reputable" then? Asus seem to be the go-to brand for people easily influenced by a large marketing budget, despite the fact that they're a terrible company and many of their products are garbage.
Frame generation is a game changer for budget builds. That being said, this card is a travesty; it's a 4050 Ti at best and should be priced 30-40% cheaper.
@@BertieJasokie Free fps in exchange for more input latency and bad visuals is not free, haha. I tried it on my 4070. If you get 40fps or less without it, don't even turn it on. Horrible.
@@theplayerofus319 Dude, budget gamers play on a much lower tier of cards. Sacrificing visual fidelity and resolution for playability. The slight input lag and visual glitches are not gonna stop us from using it. DLSS/FSR/XESS has been a saviour for us stuck on older systems.
@@BertieJasokie i use dlss/fsr too and its great, but Frame Generation just downgrades the game for me in too many ways, just saying. Doesnt mean you now cant use it 😅
From the budget gamer's perspective, I was interested in and hoping to see how it performed without frame gen on PCIe 3.0 systems. After Nvidia cut the physical connection from x16 to x8, it would be interesting to see if there's a performance hit when running it at the equivalent of 4.0 x4 bandwidth (i.e. 3.0 x8), either by artificially limiting it in the newer system or by running it at full speed in a 3.0 system.

The Xeon may be ancient, but 10th gen was not all that long ago, and I'm sure there are plenty of those still around, along with Ryzen chips that don't support 4.0 either on the CPU itself or on the motherboard side. To me it feels really weird to cut bandwidth to your budget GPU, even if it doesn't need it on the newest platforms, when the prior gen isn't that far away. I wouldn't think "wow, that's old" if someone told me they were running an 8th gen chip, and certainly not a 10th gen (a group that includes me). More than 70 or 80 class cards, 60s and 50s are pretty likely to be paired with older platforms - and by older, in this case, I (apparently) mean 3 whole years.

It almost makes 11th gen better in retrospect, after it had been called a "waste of sand" for not being a consistent improvement over its predecessor, because this card (a waste of sand for the same reason) is going to get its full bandwidth starting at that generation (board permitting - I don't think every manufacturer added PCIe 4.0 support when/if their 400 series boards got 11th gen support).
@@baoquoc3710 10th gen chips don't support PCIe 4.0; the 400 series chipset can also support 11th gen with a BIOS update, and 11th gen does support PCIe 4.0...
Is the old system PCIe 3.0? This card has x8 lanes, just like AMD's more budget-oriented cards from last gen onward, so I'm wondering if that would be a bottleneck in an older system with PCIe 3.0, compared to an Nvidia card from the previous gen or the Intel cards, which have 16 lanes.
I heavily doubt such a midrange card is really going to have a big problem with PCIe 2.0 x16 / 3.0 x8, but it'd be nice to have a comparison for sure, although there should already be some good-quality ones out there.
For a build using used parts like an i7-8700 (non-K), would this card be a good pair? 1080p only. I was thinking of getting a 2070, but the power consumption is a bit high.
The purpose of framegen is in the end destroyed by the 8 GB VRAM it has. Enabling settings where framegen is worth it, will eventually slow the card down to a crawl because it needs more than 8 GB of VRAM. Especially when enabling RT such things will happen. Also, framegen itself also requires a bit of VRAM. So if you would get this card, just DON'T get the 8 GB version, but the bigger one.
My big problem with FG is the visual downgrade. In CP2077 it's unusable on my 4070. Everything looks so weird; bushes just fizzle all the time like hell.
Frame generation isn't a "get out of jail free card." It just makes your game look smooth. You're still gonna feel that low framerate if your real fps is low.
The price is not good, but I can't get over the power draw... or lack thereof. Also, it's a great card for people who don't wanna upgrade their PCs and have older systems.
Hey Random Gaming! Long time fan, really like the video. As I slowly get murdered by international entities of organized crime and people within the United States of America, I would just like to say thank you for the view of a great life. I could only imagine having a friend in real life like you. I find everything you make a video about great and can't wait until the next one. Really. Thanks. Please, when I, Jacob Joseph VanStraten, die soon, I just want you to please keep making videos. Thank you again. God bless you. I'm still waiting for the next one. I always will be. Goodbye my friend.
Bought a used 2070 Super for €150. AFAIK it's the same or better in performance than the 4060, so that might be a good used card if it can be snagged for cheap.