5GHz? Looks like you completely lost the silicon lottery. Mine runs 24/7 at 5.2GHz with 1.375V VCore, on a custom water loop, so it never goes over 70C even under 100% load. From what I've heard, 5GHz was possible on every single chip, 5.1 on at least half of them, and 5.2 on maybe 10%. Anything higher was possible but only with unsafe voltages (mine can't even boot Windows at 5.3GHz with 1.4V VCore; it needs at least 1.425V to be somewhat stable, and I don't want to push it harder). And it's this 500MHz OC that really makes the difference. In some new games it's the line between under 60 and over 60 FPS; sometimes it pushes 1% lows from around 40 up to 60 FPS. It's more than a simple 10% boost.

Sadly, I had to update the BIOS to a newer version to fit a new set of RAM after one stick mysteriously died (I'd never had a RAM stick die after 2 years of normal use, no OV/OC, just running its rated XMP profile), and I moved from a 4x8GB to a 4x32GB kit; without the update, Windows wouldn't see the whole memory and could only access half of it. The update introduced a bug that affects OC: you have different multipliers for regular loads and AVX2 loads, and the board is supposed to apply the AVX2 multiplier only when an AVX2 load is present (like video encoding, etc.). My chip works fine at 5.1GHz under AVX2 but crashes at 5.2. The bug makes the board always apply the AVX multiplier, so I had to set it to 52 and just avoid those kinds of loads on my normal OC profile.
Hey bro, here in the Philippines the i3-14100F only costs 6000 pesos ($102 USD), but it's second-hand. A brand new i5-12400F costs 7000 pesos ($120 USD). Which is the better buy? And what's the best budget motherboard for each of them? Thanks a lot.
I would say you should go with the 12400F; it's a little more expensive but offers much better performance. As for the motherboard, you can go with an H610 in DDR4 or DDR5 flavor depending on your budget. An H610 board should be fine if you're constrained on budget.
I get around 90-110 FPS in that same spot on a 4090. The 9900K at 40% is nearly using 100% of its physical cores. It's 8C/16T, so half of the theoretical 100% is hyperthreading, which is not equivalent to the full power of 8 physical cores. For example, on my 8700K it was the equivalent of having 1.7 extra physical cores of performance, according to benchmarks that show how the CPU scales with HT on vs. off (CPU-Z has one). On my 13700K, hyperthreading scales better, so it's the equivalent of about 2.03 extra physical cores.
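For anyone curious how that "equivalent cores" figure falls out of an HT on/off run, here's a minimal sketch of the arithmetic. It assumes the benchmark reports a single multi-threaded score; the scores below are made-up illustrations, not my actual CPU-Z results:

```python
# Estimate how many "extra physical cores" hyperthreading is worth,
# from a multi-threaded benchmark run with HT on vs. HT off.
def ht_equivalent_cores(physical_cores: int, score_ht_on: float, score_ht_off: float) -> float:
    # Per-core score with HT disabled is the baseline for one physical core.
    per_core = score_ht_off / physical_cores
    # Express the HT-on surplus in units of one physical core.
    return (score_ht_on - score_ht_off) / per_core

# Hypothetical scores for a 6C/12T chip (8700K-class CPU):
print(ht_equivalent_cores(6, 4100, 3200))  # -> ~1.69 "extra cores"
```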
DDR4-3600 is pretty close to DDR5-6000 because the former has much lower latency. The differences here come from the improved single-threaded performance of the new architecture and the higher multi-threaded performance of the i9 due to its core/thread count.
RAM frequency on its own is a bullshit metric; you should always look at real RAM latency, which is basically the timings divided by the frequency. Higher-frequency RAM also has higher timings, so the real latency doesn't drop that much. If you have two kits at the same frequency but different timings, the lower-timing one will give better results. Sometimes a good, slightly tweaked DDR4-3600 kit can outpace cheap DDR5-6000 with sky-high timings.
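To put numbers on that, here's a quick sketch of the standard first-word latency math (latency in ns = CL × 2000 / transfer rate in MT/s; the kits below are just illustrative examples):

```python
# First-word CAS latency in nanoseconds:
# the clock runs at half the transfer rate, so
# latency_ns = CL / (MT/s / 2) * 1000 = CL * 2000 / MT/s
def cas_latency_ns(cl: int, mts: int) -> float:
    return cl * 2000 / mts

print(cas_latency_ns(16, 3600))  # DDR4-3600 CL16 -> ~8.9 ns
print(cas_latency_ns(14, 3600))  # tuned DDR4-3600 CL14 -> ~7.8 ns
print(cas_latency_ns(38, 6000))  # cheap DDR5-6000 CL38 -> ~12.7 ns
print(cas_latency_ns(30, 6000))  # good DDR5-6000 CL30 -> 10.0 ns
```

So a tuned DDR4-3600 kit can genuinely sit well below a loose DDR5-6000 kit on latency, even though the DDR5 kit wins on raw bandwidth.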
Overall, you could put it this way: overclocked to 5.0-5.2 GHz, the i9-9900K/KS is still an excellent option, and you can easily use it for another couple of years for gaming at max settings in Full HD; you just need good cooling.
The main thing is the memory. In the video it's 3600 MHz on Hynix chips (16-19-19-39) and completely untuned. Ring is probably 4300. My i7-8700K @ 4.9 GHz, ring at 4.6, with tuned 3600 MHz memory at 14-14-14-30 CR1 (it won't go higher because of my junk motherboard) is already faster than what's shown in the video. A 9900K at 5 GHz on the cores, 4.7-4.8 GHz on the ring, with tuned 4266+ MHz CL16 memory will be significantly better than this i3, and better even than an i5-13400.
But don't forget: yes, the 9900K is a good chip, but against it you have heavy thermals when overclocked and high power draw, and there's nothing left to upgrade to on that platform, whereas on 14th gen you can at least drop in an i5, which definitely beats the 9900K. The 9900K is a good chip, but on the used market it's still unjustifiably expensive.
It's a two-sided situation. First, nobody pairs a 14100 with a 4090; it'll more likely be PCIe slot fillers like a 3050/3060/4060, maybe a 4070 at most, and even that by accident. And at 1080p you'll constantly be GPU-bound, whether with the 9900K or the 14100. Second, hardly anyone pairs a 4090 with a 1080p monitor; 1440p at minimum. And as resolution goes up, dependence on the CPU drops sharply, so while we see the 9900K falling behind here, I'm sure at higher resolutions it would be roughly at parity in FPS. So in most real scenarios, unless you're chasing top-end performance at all costs, you can stay on a 9900K for probably another 5 years, right up until the next batch of wretched consoles comes out)
Thinking about the 14100F now: most people watching will assume it's paired with 6000MHz RAM and an RTX 4090 just to hold this FPS. But in reality the 14100F will mostly be paired with mid-range or budget parts, since that's the use it was actually made for.
The i3-14100 has higher IPC and uses higher-speed memory, but the i9-9900K still has twice the CPU cores. So in heavily threaded games the i9-9900K still outperforms it, and it will hold up even better in the future vs. the new i3-14100. The i9-9900K is IMO the better buy at the same price.
Something is very off about this: I get similar or even better results (in some games) with my i9-9900K at 4.8 GHz, 32GB of 3200 RAM, and an RX 6900 XT at 1440p (native res). In some cases (old single-core titles) where there's no CPU bottleneck, an i3-12100F can go hand in hand with my 9900K... but once a game uses more than 4 cores and the CPU is the bottleneck, the 9900K will smoke that peewee out of its existence. A 4-core CPU today is a bottleneck on wheels. Add to that the current debacle with 13th and 14th gen, and at this point the only reasonable upgrade from a 9900K might be AMD's 7800X3D. Other than that, not worth it.
Y'all, this is why you don't need to let mainstream media make you think you need the new platforms. You can grab a budget CPU right now if you choose and run a 4080 Super with no problem. Mainstream media has people thinking they need the greatest or the bottleneck will affect them. A 14400F or 13400F plays on a 4080 Super beautifully, just like a 10900K or 11900K would. Then again, the new modern budget CPUs are 10-core parts like the i5-13400F, so it does make a difference. Just don't fall for the whole over-spec CPU crap. When you have a high-end GPU you'll be gaming at high resolutions, and at those resolutions you'll notice a very small difference between flagship and budget in real-world gameplay. People kind of have it backwards, since budget CPUs are so powerful now.
For a new-generation CPU you'd also need a new motherboard and a new set of RAM. I have a good motherboard that lets me OC everything I need, provides stable voltages, etc. And I have 128GB of DDR4-3200 (yes, I know 3200 sounds bad, but I compensated by lowering the timings; it just doesn't want to work at 3600 even with looser timings), so moving to a new platform is out of the question right now.
@ThereWasNoFreeName Actually, I agree. Just like the 13400F does better on DDR4-3200 than on DDR5. People just don't get it. It all depends on the CPU. You don't need a 7800X3D and DDR5 to enjoy good performance, even on a 4080 Super.
@@reviewforthetube6485 I don't have the budget to upgrade anything but the graphics card this year. I'll be moving from a 3080 Ti to a 5090 when it hits the stores, and I have a feeling an OCed 9900K will be sufficient for a few more years. At 4K (I do have a 4K display) I'm already maxing out GPU load while the CPU chills, and in some games I use DLDSR for even better quality. All I need is a stable 60 FPS, and the 9900K delivers, but the 3080 Ti struggles in some games even after a mild overclock. Overclocking and tweaking your system gives a pretty significant boost: sometimes simply bumping the GPU clock by 100MHz, VRAM by 500MHz, the CPU by 300-400MHz, and dropping the RAM timings by 1-2 takes you from a stuttery 50-ish FPS to a stable 60.
My previous CPU was a 3930K OCed to 4.7GHz, and that thing was a power-hungry beast. For some reason it never showed a draw over 255W; under load it just always capped at 255W. Now that's what I call a CPU that EATS power... The 9900K is on a diet by comparison, even overclocked.
I run my 9900K with an RTX 4070 at 3440x1440. In all my games the CPU sits at ~30% load and eats 45-50W. It's been doing a great job since 2018 and still does! Very good CPU!!
The reason: the 9900K is on a DDR4 platform, the 14100 on a DDR5 platform. In some games best optimized for many cores and threads, such as Red Dead Redemption 2, the 14100 on the DDR5 platform is much worse than the 9900K. In some new games the performance of the two is almost the same.
Thank you for the wonderful video 👍🏻.. I want to buy a powerful laptop at a reasonable price for gaming, watching movies, and other work. What 2024 models do you recommend?
The i3 supports PCIe 4.0 and the i9 is PCIe 3.0, so on the i9 the GPU and NVMe drives all run at PCIe 3.0 speeds (RAM isn't on PCIe, so it's spared), simply because the core i9 isn't on the same plane of divine existence.
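For scale, here's a rough back-of-the-envelope on what that generation gap means for link bandwidth (a sketch using the spec per-lane rates and 128b/130b line encoding; real-world throughput is a bit lower):

```python
# Approximate usable PCIe bandwidth per direction, in GB/s.
# PCIe 3.0: 8 GT/s per lane; PCIe 4.0: 16 GT/s. Both use 128b/130b encoding.
def pcie_bandwidth_gbs(gt_per_s: float, lanes: int) -> float:
    encoding = 128 / 130            # payload fraction after line encoding
    return gt_per_s * encoding / 8 * lanes  # bits per lane -> bytes, times lanes

print(pcie_bandwidth_gbs(8, 16))   # PCIe 3.0 x16 -> ~15.75 GB/s
print(pcie_bandwidth_gbs(16, 16))  # PCIe 4.0 x16 -> ~31.51 GB/s
print(pcie_bandwidth_gbs(8, 4))    # PCIe 3.0 x4 NVMe -> ~3.94 GB/s
```

That said, published scaling tests suggest even a flagship GPU loses only a few percent of gaming performance dropping to 3.0 x16, so the joke is bigger than the real-world penalty.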
From what I can see, there are games that make better use of a higher number of cores and those that don't. BTW, this i3 uses DDR5 memory, which is a big improvement...
Andrei - I'm positive you've gotten this question before... but your onscreen stats... are they set up through Afterburner? I like the way they look, but I'm finding it difficult to get anything to look like that... :( What program are you using, if it's not Afterburner?
Has anyone ever confirmed this person is legit, given the number of hardware configurations he has? Just wondering before I use his channel's results for my decisions.
8 cores and 16 threads are still more useful than 4 cores and 8 threads. If it's just for gaming and slightly better FPS, then the latest-generation Intel is better. But for work and more, the i9-9900K is the better choice. I was using an i5-9400, and because I was reluctant to upgrade all the components, I only upgraded the CPU to an i9-9900K.
To make matters worse, this actually implicates the 10700K and 11700K as well, all thanks to Intel and their amazing generational leap from 9th gen to 11th gen.
Well yeah, Intel usually sucks at leaping when it's on the same platform; that's why they change platforms quite often, and platform leaps are usually huge, on average 30-35% better than the previous platform's latest generation.
I suppose. The 11700K isn't that much faster than a 10700K. The Rocket Lake 14nm backport blew chunks; when Steve from GN called the 11900K a waste of sand, he was being nice. For those who don't know, the 9900K and 10700K are pretty much the same thing... except the 10700K has a higher TDP and generally overclocks a bit better than the 9900K.
I have the 10700K. It used to be great a few years back, but I can't stand its bottlenecking nowadays with the high-end 40-series GPUs in CPU-intensive games, especially if you insist on a stable 144 or 165 FPS.
It's not fair to compare with 6000 RAM against 3600; much of the gain here comes from DDR5, not from CPU performance! Although in Cyberpunk and RDR the 9900K shows itself better where the overall CPU load is higher, which on the whole proves it's the more performant chip!)
I do understand your point, but who would buy the 14100F and stick with DDR4? It would really hurt the upgrade path to eventually get a flagship part and still be on DDR4. This test shows what you can get with an entry-level CPU and decent, not great, DDR5. Granted, the board used was not entry-level. I think this video shows how well the 9900K is holding up nearly 6 years after release.
@@davidandrew6855 Brother, it's an entry-level CPU. If you're on a budget and can't even buy an i5, you might as well get DDR4 and dedicate the money you would've spent on DDR5 to a better GPU. DDR4 is dirt cheap these days, especially on the used market. But that's beside the point: if you want to compare the performance of two CPUs, you should use the same DRAM where possible, to level the playing field and remove as many variables as you can.
@@davidandrew6855 And that's not just a "decent, not great" DDR5 kit. With a bit of tuning you can get some amazing performance out of it; I got 6200MHz CL30 out of a much worse kit.
@@chadfang2267 _"if you're on a budget and you cant even buy an i5 you might as well get ddr4 an dedicate that money you would've spent on ddr5 on a better gpu. ddr4 is dirt cheap these day. "_ I would usually agree with you, but the difference in price between the ram was $30 bucks. I'd much rather have the better RAM then a minimal GPU upgrade. There is money to be saved on the motherboard if it is DDR4, but to me it seems illogical to stick with DDR4 on a new system build, that I might want to upgrade over the years. Again, my thoughts feel free to disagree. As I said I see your point, but in this instance the newer CPU can handle better RAM, why not allow it to actually compete with the 6 year old flagship, vs holding back any performance? Granted we are talking an i3, but if there is more performance to be had on a new system build why not go for it?
@@chadfang2267 I understand the point of tuning memory; I've seen amazing things out of DDR5-5600 with CL28, and it was only about $108 for 32GB. I was just saying that what was used in the testing was standard DDR5 at CL38, not even CL30.
Now actually test the CPUs in CPU-bound scenarios instead of cranking everything to the highest settings. Cranking things to ultra is NOT how you test CPUs, you do the OPPOSITE, reduce everything to low. These channels never learn. It's baffling.
1080p is normal even today, like EX said, and the most used GPUs are the 60-series cards like the 1660, 2060, and 3060. People like us who spend over a thousand euros on PC parts are the minority, the top 1% of gaming PC users. But I understand what you mean: who would use a 4090 for Full HD? I'd like to see a comparison at 1080p, 1440p, and 4K to see how the CPU performs.
@@LilianaStar So esports gamers who play 1080p competitively in world tournaments on 4090s are stupid? Even though 1080p gives them maximum FPS and low latency, which effectively improves their gameplay and increases their chances of winning?