Massively incorrect. It's more like 15%. Look closely: the 4090 is so underutilized that it's auto-downclocking, sometimes all the way to 1000 MHz. 30-40% usage at 1000 MHz is roughly 10-15% usage at 2700 MHz.
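Back-of-the-envelope, that scaling works out like this (a quick Python sketch; the overlay numbers and the ~2700 MHz boost clock are from the comment above, so treat them as illustrative):

```python
# Rough clock-normalized GPU utilization: a GPU reporting 35% usage
# while downclocked to 1000 MHz is doing far less work than 35% at
# its ~2700 MHz boost clock. Numbers are illustrative, not measured.
reported_usage = 0.35      # the 30-40% shown in the overlay
current_clock  = 1000      # MHz, auto-downclocked
boost_clock    = 2700      # MHz, typical RTX 4090 boost

effective_usage = reported_usage * current_clock / boost_clock
print(f"Effective utilization at boost clock: {effective_usage:.0%}")  # ~13%
```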
This video is a perfect example of why you should never go by average FPS on a graph alone. Those frametimes are brutal, though that's expected from an 11-year-old quad core without hyperthreading, which can still make quite a bit of difference nowadays. That's one of the main reasons my old 4790K is still very capable in a lot of games, and even more so with a decent overclock and good RAM.
Yeah, frametime spikes were the main reason I finally upgraded from an i7 920 OC'd to 3.5 GHz (4 cores / 8 threads). The overclock greatly reduced the intensity of the stutter, but the lack of certain CPU instructions was still bogging newer games down. The CPU held on for a long time (2009-2017), and surprisingly so did the first-generation i7 Gigabyte motherboard it was installed on.
@@faultyservice Yup, when you're thread-limited, you have to cap your frame rate or it'll stutter like crazy. It doesn't show in the video because the recording is local, meaning the capture lags right along with the game and skips time the same way.
I find this showing up in my recommendations kind of funny, because I've been mentioning that my current PC uses an i5-3470 (it's an old hand-me-down). I never thought I'd see this specific CPU pop up in a video like this.
3rd Gen Intel Core still seems powerful enough for basic users, from my perspective. I remember using a desktop that had one (an i3, though I forget the exact SKU) and it was still good enough for my usual tasks: web browsing, writing documents, creating basic content, etc. That machine was a library computer at my high school.
Everyone is mind-blown but fails to account for the sandbagging Intel did, AND the fact that 11 years of advancement in tech SHOULD yield insane results. It's no shocker.
This 3470 seems to be slightly OC'd: it's running at 3.6 GHz, which is the boost frequency for only 1 or 2 active cores. With all 4 cores loaded, the boost should be 3.4 GHz. That's what the spec sheet says, and it's also how my old 3470 used to run demanding games: at 3.4 GHz.
The most serious thing is the frametime, which fluctuates a lot. That causes enormous stuttering problems, so even at 60 FPS you still don't enjoy the games because of the micro-stutters. I think this is the biggest problem with pairing an old build with a new video card. I'd like to know what you think and whether you have solutions. I tried to make the stutters less frequent by setting a frame limit per game with RivaTuner. Do you have better solutions to avoid stuttering? Thanks in advance.
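For what it's worth, a frame cap helps because it evens out pacing rather than raising performance. Here's a minimal sketch of the hybrid sleep/spin loop that limiters like RTSS are commonly described as using (an illustration of the technique, not RTSS's actual code):

```python
import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS  # ~16.7 ms budget per frame

def render_frame():
    """Stand-in for the game's real per-frame work."""
    pass

next_deadline = time.perf_counter()
for _ in range(600):  # run ~10 seconds at 60 FPS
    render_frame()
    next_deadline += FRAME_TIME
    # Sleep through most of the leftover budget, then busy-wait the
    # last ~2 ms: OS sleep granularity is coarse, and that short spin
    # is what produces the flat frametime line a limiter gives you.
    remaining = next_deadline - time.perf_counter()
    if remaining > 0.002:
        time.sleep(remaining - 0.002)
    while time.perf_counter() < next_deadline:
        pass
```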
I know the test is about CPU bottlenecking, but try the i5 3470 at 4K, and/or with upscaling on. No one is going to play at 1080p with a 4090. Soon I'll test an i5 3470 + 4070 myself, but I'd already bet the FPS per core/thread/cache will go up, since the GPU will carry the games.
i5 3470 + 4070 + 8 GB RAM in Doom Eternal = 75-90 FPS at 4K with everything but shadows/reflex on ultra, DLSS Quality, RT on. No RT and no V-sync/frame limiter = 190-200 FPS.
A CPU as weak as the old i5 was never meant to run 32 GB of RAM; it's hobbled before it starts. 12 GB is the max at stock speeds. Then you could use V-sync, which would definitely make even PUBG playable. I did a test with a stock i5 4570 about two months ago in PUBG with a GTX 1070 on low settings, and it holds 60 FPS for the most part, no problem, while recording the game on that same machine. I know Haswell has about 10% more IPC, but still.

Whatever way you look at it, gaming rigs were mostly running 8 GB eleven years ago, and the i5 here would do much better with that amount of memory than trying to read and write across 32 GB worth. Even 16 GB would slow that i5 down; I know because I've done the tests and run the benches. You'll be lucky if the i5 3470 is reading and writing 10 GB/s when it should be between 15 and 20 at stock with a sensible amount of RAM. On the edge like this, RAM makes a huge difference.

Eleven years ago people were happy to average 40 FPS, and some of the games here were doing that with no problem. I just loaded PUBG on the machine on my bench: a 10-year-old i5 4670K overclocked to 4.5 GHz with 16 GB of 1600 RAM and the GTX 1070 overclocked. 1080p ultra settings, V-sync on: 59 FPS average, 40-45 FPS 1% lows, and 18-25 FPS 0.1% lows. Perfectly playable. I've run overclocked machines daily since 2007. I ran this very CPU daily for three years at 4.3 GHz with 2400 MHz RAM, I've benched it at 4.75 GHz, and I daily it now at 4.5 with all the energy-saving settings on, as I'm better at overclocking Haswell these days. I put the memory at 1600 with loose timings to keep it fair against generic RAM.
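For reference on those bandwidth figures, the theoretical ceiling for dual-channel DDR3-1600 is easy to sanity-check (a quick sketch; the 15-20 GB/s the comment cites would be a measured copy figure, which always lands below peak):

```python
# Theoretical peak bandwidth for dual-channel DDR3-1600:
# 1600 MT/s x 8 bytes per 64-bit transfer x 2 channels = 25.6 GB/s.
# Measured copy bandwidth on Ivy Bridge typically comes in well under
# this, so 15-20 GB/s is plausible as a benchmark result rather than
# a spec number.
transfers_per_sec = 1600e6   # DDR3-1600
bytes_per_transfer = 8       # one 64-bit channel
channels = 2

peak = transfers_per_sec * bytes_per_transfer * channels / 1e9
print(f"Theoretical peak: {peak:.1f} GB/s")  # 25.6 GB/s
```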
You're dumb... any old i5 can run 32 GB just fine. RAM speed is what you need to care about, and that depends on the CPU/mobo capabilities (the CPU's IMC and the mobo chipset).
An i5 13400F and a 4090 together can only manage 50-odd FPS in Hogwarts? And what is that 3470 performance in PUBG? Back when everyone played PUBG, I ran it on a 3570K (non-OC) and a GTX 970 without any major issues. I've heard it's much better optimized now, so what's with those stutters?
The lag is all down to the powerful graphics card, which puts a lot of pressure on the processor, especially once the graphics settings have been reduced. I think the best card to pair with this processor and avoid the lag is a GTX 950.
@Waseem That's the whole point: to give both CPUs the best possible GPU if we want to measure their raw power. If you test those two CPUs with a GTX 950, the whole comparison doesn't even make sense.
@@PhantomNovelist I'm not speaking about the comparison or the video; I'm just explaining where the lag comes from. As you know, a GPU loaded at 100% doesn't cause freezes, unlike the processor. That was the point I was making, because some people don't know that the processor must be stronger than the GPU. 👍🏼
Hmm, yes. I bought an RTX 3060 and paired it with my old i5 2400. For the first test I launched Miles Morales, and it runs almost fine on the high preset (cars and pedestrian density on minimum) without RT. You can get 80 FPS indoors, around 60 while fighting in the city, and 45-55 while web-swinging.
This is kind of a shitty test. You've got a GPU that the older CPU has no chance of keeping up with. It would be more representative to run the test with a video card that the older CPU doesn't bottleneck.
If you are going to compare two CPUs that are 11 years apart, then you need some additional comparisons:
A. For modern games, the i5-3470 at low graphics settings vs. the i5-13400F at ultra settings. That way, we can see what visual "eye candy" has to be sacrificed on the older CPU to achieve higher FPS.
B. Representative games that would have been played 11-12 years ago (Elder Scrolls V, Batman: Arkham City, Deus Ex: Human Revolution, The Witcher 2, Borderlands 2, Dishonored, Mass Effect) to see how much performance has improved.
C. Benchmarks and games that are CPU-intensive (e.g. Ashes of the Singularity).
Should I go for the 13400F or the 7800X3D? I'm budget-oriented, so I'd be fine with the 13400F, but if I go 7800X3D there'll be no need to upgrade for at least another 10 years.
Test again with an AMD GPU. Nvidia's driver tends to have serious CPU overhead, since it puts a very high draw-call load on the CPU. A high-thread-count CPU handles this like it's nothing, while a lower-threaded CPU suffers a lot.
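A toy model of why driver overhead caps FPS on a weak CPU (the per-call cost and call count below are made-up illustrative figures, not measured Nvidia numbers):

```python
# If the driver burns a roughly fixed CPU cost per draw call, the
# CPU-side frame time sets an FPS ceiling that no GPU can lift.
draw_calls_per_frame = 5000      # illustrative
cpu_cost_per_call_us = 4.0       # illustrative; varies hugely by API/driver

cpu_frame_time_ms = draw_calls_per_frame * cpu_cost_per_call_us / 1000
fps_ceiling = 1000 / cpu_frame_time_ms
print(f"CPU-bound ceiling: {fps_ceiling:.0f} FPS")  # 50 FPS in this example
```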
While he's at it, he might as well test with 2133 MT/s or 2400 MT/s RAM and use the InSpectre tool; that would help the 3470 pick up some extra frames. I used to have a 3rd Gen Xeon, among a bunch of other LGA1155 processors, and the Spectre and Meltdown patches hit that series hard. You can usually pick up around a 5% improvement, sometimes less, sometimes more.
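If anyone wants to check whether those mitigation overrides are set without installing anything, here's a read-only sketch (assuming Windows and Python) that reads the documented registry values InSpectre toggles; it only reads, it changes nothing:

```python
import winreg

# The Microsoft-documented override values live here. Setting both
# FeatureSettingsOverride and FeatureSettingsOverrideMask to 3 disables
# the Spectre/Meltdown mitigations; if the values are absent, OS
# defaults apply (mitigations on).
KEY = r"SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY) as k:
    for name in ("FeatureSettingsOverride", "FeatureSettingsOverrideMask"):
        try:
            value, _ = winreg.QueryValueEx(k, name)
            print(f"{name} = {value}")
        except FileNotFoundError:
            print(f"{name} not set (OS defaults apply)")
```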
The fact that the 4090 is barely working (it rarely goes above 50% usage) just to push out 70 FPS shows how powerful it is, plus the temps never go above 60 degrees. People, let's not complain about the graphics card being overpriced.
The problem is the price per FPS is tremendous. The 4090 is way too powerful for desktop gaming anyway; on a 4K screen, sure, but most people will play at 1440p at most, and for that it's complete overkill nowadays.
I was feeling guilty for buying a new PC since my 3470 is still alive... then your video came along and slapped me in the face lmao. Maybe an upgrade to an R7 8700G won't be as pointless as I thought... lol
2:07 Weird things like this happen to me in Last Epoch, but when I play on an i5 12400F + 32 GB 3600 MHz Gear 1 + RTX 2070 Super + SSD at 1080p locked to 75 FPS, it shows 75 FPS all the time... is it a CPU issue or maybe the internet? PS: my CPU usage sits at 22%.
That was the skippiest 84 FPS I've ever seen in Forza Horizon 5, and the 1% lows were awful, but Horizon Zero Dawn was even worse. That literally looked like the lag you'd get back in the day trying to game online on a crappy internet connection. The Witcher, on the other hand, was destroying even the most powerful computer hardware in its original form when it first came out, and its remaster continues to show you how weak your system is. Running it on that poor i5 was literal murder. :)
@@chase7974 Well, I can tell you it does make a difference, because it's the GPU doing the work, not the CPU. Sure, you might not get all the performance out of the GPU, but I'd still get those 60 frames.
@@miguelolavarria6844 You've misunderstood. The resolution in and of itself has nothing to do with it. Your CPU gets taxed when framerates are high. Gaming at a higher resolution puts more work on the video card, making it harder to generate high framerates, and lower framerates mean less work for the CPU. You still won't reach 60 FPS, but the GPU usage will be a lot higher.
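That argument reduces to a simple bottleneck model: frame time is roughly whichever of the CPU or GPU stage is slower (a simplification that ignores pipelining, but it captures why resolution shifts the load). A toy sketch with illustrative numbers:

```python
# Toy bottleneck model: frame time ~= max(CPU time, GPU time).
# Raising resolution inflates GPU time only; once GPU time exceeds
# CPU time, the CPU sits partly idle and its apparent load drops.
cpu_ms = 20.0  # fixed CPU cost per frame (illustrative)

for label, gpu_ms in [("1080p", 8.0), ("4K", 30.0)]:
    frame_ms = max(cpu_ms, gpu_ms)
    fps = 1000 / frame_ms
    cpu_busy = cpu_ms / frame_ms
    print(f"{label}: {fps:.0f} FPS, CPU busy {cpu_busy:.0%}")
# 1080p: 50 FPS, CPU busy 100%   |   4K: 33 FPS, CPU busy 67%
```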
Perfect example that you guys don't know what you're doing. The 4090 is hard-bottlenecked by the old hardware; an older graphics card like a 1080 would let the old CPU perform way better and could still max out most of these games. What you're doing here says absolutely nothing about the performance you can get out of the older system's hardware.
My 6600K OC'd to 5.0 GHz is still a beast; I run all the new games at more than 40 FPS on full ultra settings... But my birthday is in a few days, so I decided to invest in a 13600K. Just to enjoy low settings at 800 FPS.
I was playing most of the new games at an average of 60-75 FPS, with some exceptions, on my old i5-7500 and GTX 1060 6GB at 1080p. The Skylake series is still not bad, but, much like the 4th gen i7s, it's not perfect when it comes to buttery smooth frametime latency and 1% lows.
A lot of the major stuttering in games wasn't necessarily due to the weak CPU, but more to chipset and platform limitations. PCIe 2.0 on the Z77 chipset is really going to cause serious bandwidth issues with modern games and high-res textures eating 8 GB+ of VRAM. The CPU is of course holding back the card, but I believe the severe hitching at ultra settings comes down more to bus bandwidth limits. The low thread count (4) would have contributed to hitching in some games as well; running out of threads in a multithreaded game is an FPS disaster, with everything held up until another thread is available.
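Rough numbers behind the bandwidth point, using the spec's per-lane rates (one caveat: Ivy Bridge CPUs like the 3470 actually drive the x16 GPU slot at PCIe 3.0; it's the Z77 chipset's own lanes that are stuck at 2.0):

```python
# Usable bandwidth for an x16 GPU slot by PCIe generation.
# Per-lane rates after encoding overhead:
# 2.0 = 500 MB/s, 3.0 ~= 985 MB/s, 4.0 ~= 1969 MB/s.
lanes = 16
per_lane_mbs = {"PCIe 2.0": 500, "PCIe 3.0": 985, "PCIe 4.0": 1969}

for gen, mbs in per_lane_mbs.items():
    print(f"{gen} x16: {lanes * mbs / 1000:.1f} GB/s")
# PCIe 2.0 x16: 8.0 GB/s. Streaming textures over that link while
# VRAM spills is where the hitching theory above comes from.
```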
Can I pair an RTX 3060 Ti / 3070 with an i3 10100F? I currently have this CPU with a GT 1030, and I don't want to upgrade to a mid-range GPU like the 16 series. A 3060 Ti / 3070 would be more future-proof, and I can upgrade the CPU later to get a few more FPS. Someone please advise.
If you intend to upgrade the CPU, then starting out with the 3060 Ti or 3070 would be fine. Not sure where you live, but sometimes I'll see a local seller let a 10700K go for about $150. It's worth checking the local ads first, because sometimes the deals there are way better than what you'd see on eBay.