Tests with faster memory will come a little later; for now, this is the only one I have.

Games:
Silent Hill 2 - 0:09
Cyberpunk 2077 - 1:01
Starfield - 2:08
Forza Horizon 5 - 3:05
Star Wars Outlaws - 4:05
CS2 - 4:58
Microsoft Flight Simulator - 5:57
Hogwarts Legacy - 6:58
Ghost of Tsushima - 7:55
The Witcher 3 - 8:52
Red Dead Redemption 2 - 10:00

System: Windows 11
Core Ultra 9 285K - bit.ly/4fisOO6
MSI MAG Z890 TOMAHAWK - bit.ly/3NFmC7a
Core i9 14900K - bit.ly/3rTFhVy
ASUS ROG Strix Z790-E Gaming - bit.ly/3scEZpc
RAM: 32GB DDR5 6000MHz CL30 - bit.ly/4e3MqEG
CPU Cooler: MSI MAG CORELIQUID C360 - bit.ly/3mOVgiy
GeForce RTX 4090 24GB - bit.ly/3CSaMCj
SSD: 2x Samsung 970 EVO M.2 2280 1TB - bit.ly/2NmWeQe
Power Supply: Corsair HX Series HX1200 1200W - bit.ly/3EZWtNj
@@aashritmalik6931 Those power numbers are not telling the real story, because the CPU now pulls power in another way. To see what the real difference is, one should measure how much power is being pulled from the wall socket.
Degradation? It actually beats the 14900K in workloads. The real comparison for this CPU is the 9950X. Nobody said that Intel makes gaming CPUs; if you buy an Intel just for gaming, that's your fault 😂
I think my 12900K is breaking. Damn PC keeps crashing even on a fresh Windows install. Was gonna see if the new i9 was worth it, but apparently not. Maybe a 14900K will work.
@@i7-1260P I did memtest and the RAM modules were fine. Either the chipset or the memory controller on the CPU was starting to degrade. Spent $1000 on a Z790 and a 14900K because I got fed up with it. Everything is working now and I haven't had any more freezes or blue screens. It's a shame, I just built that PC in November 2021...
Reducing the lithography size and getting better performance in games are unrelated. However, reducing the lithography size to consume less power, that's where there's a connection.
@@zarkha_ If you give the 14900K less power so its gaming performance aligns with the Core Ultra 285K, there is barely any difference in power efficiency.
@@nossy232323 Well, actually there is, since you have to reduce the power of the i9-14900K to match the consumption of the Core Ultra 285K, so there is a difference in power efficiency, right? lol
@@zarkha_ No. It's like comparing the fuel consumption of two cars driving at different speeds, say a Suzuki and a BMW. The Suzuki gets better fuel consumption, but only because it's driving slower. If the BMW matches the Suzuki's speed and they both get the same fuel consumption, we don't say the Suzuki is more efficient just because the BMW had to slow down; we say they have the same efficiency at that speed. It's illogical to claim the BMW is less efficient if they're achieving the same efficiency at the same speed.
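To put rough numbers on that analogy: efficiency comparisons only mean something at a matched performance level. Here's a minimal Python sketch of the idea; all FPS and wattage figures are made-up placeholders, not measurements from this video.

```python
# Hypothetical numbers for illustration only - not measured results.

def perf_per_watt(fps: float, watts: float) -> float:
    """Frames per second delivered per watt of CPU package power."""
    return fps / watts

# Stock: the two CPUs run at different performance levels.
cpu_a_stock = perf_per_watt(fps=160, watts=150)  # faster but hungrier
cpu_b_stock = perf_per_watt(fps=150, watts=110)  # slower but frugal

# Matched: CPU A power-limited until it delivers CPU B's frame rate.
cpu_a_limited = perf_per_watt(fps=150, watts=112)

print(f"A stock:   {cpu_a_stock:.2f} fps/W")
print(f"B stock:   {cpu_b_stock:.2f} fps/W")
print(f"A limited: {cpu_a_limited:.2f} fps/W")  # ~= B stock: same efficiency at that speed
```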
Man, I was reading all the hate, but that difference is noticeable. A lot less power means better processor efficiency, and that's all I want when I buy technology. This could mean really good OC headroom. That's a 3nm process; only iPhones had that before. 45W less at the same performance is crazy, and it could mean a lot for mobile devices. Thanks for the comment, I didn't notice it at first.
Correct me if I'm wrong, but apparently this latest gen of Intel CPUs can use UP TO 50 watts via the 24-pin connector. I'd be surprised if HWiNFO isn't just pulling the CPU power from the EPS cable. Sneaky if you ask me.
@@rafatejera329 Adding on to Franki's comment, this isn't factoring in the whole power draw. Once you factor in the power coming through that cable as well, power usage is comparable to AMD's 9000-series chips and a significant drop from Intel's 14th gen. Yes, it draws less than last gen, but it's looking like nothing more than a power and temperature catch-up. As others are noting/hoping, it's a good 'reset', and hopefully this is a Zen 1 moment.
Basically almost all chips now, from phone to desktop to server, are made by TSMC 😅. And I can see x86 being on the way out; even 3nm x86 is nowhere close to Apple Silicon (ARM).
@@pcrepairshop6799 Your comparison is stupid. x86 can match ARM easily and also performs better. Apple Silicon chips don't have the magic of ARM, they have the magic of the Nuvia engineers.
@@syedawishah Ooow, so that's what this video was all about... heavy multi-threaded workloads... now it all makes sense. I thought it was to show gaming performance... silly me.
@@syedawishah Not true. Since the Windows patch, non-X3D chips can match the 14700K/14900K, and they're the same or faster in productivity. Zero reason to buy Intel.
@@Qelyn You're right, they don't need to, but AMD didn't know how bad the 285K was going to be when they made the 9800X3D. AMD always assumes that Intel is going to do what they claim, so they worked hard on improving the frequency of the X3D chips. They have already announced it and set a release date, so they can't back out now!
@@sirgwaine2695 Given Intel's bad performance, they may not release it. They can keep milking the 7800X3D for gaming and the 9000 series for productivity. I am sure they're gonna skip the 7 Nov date 😅
What's the point? They could simply reduce the core frequency of the i9-14900K by 300MHz, which would significantly reduce the voltage and, with it, the power consumption.
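For anyone wondering why a small frequency drop cuts power so much: dynamic CPU power scales roughly with C·V²·f, so lowering the clock and the voltage that comes with it compounds. A minimal sketch; the voltage/frequency points below are assumptions for illustration, not actual 14900K V/F curve values.

```python
# Rough dynamic-power model: P ~ C * V^2 * f (the capacitance term cancels in ratios).
# Voltage/frequency points are illustrative assumptions, not real 14900K specs.

def relative_power(v: float, f_ghz: float, v0: float, f0_ghz: float) -> float:
    """Power relative to a (v0, f0) baseline, assuming P ~ V^2 * f."""
    return (v / v0) ** 2 * (f_ghz / f0_ghz)

# Baseline 5.7 GHz @ 1.40 V vs. a 300 MHz drop that allows 1.30 V.
ratio = relative_power(v=1.30, f_ghz=5.4, v0=1.40, f0_ghz=5.7)
print(f"Power at the lower point: {ratio:.0%} of baseline")  # ~82%
```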
From the consumer perspective you are right, but for Intel it's a new architecture with a different design on a third-party node. Take it as their "Ryzen 1st gen" moment... although Ryzen was more exciting.
Actually, if these Core Ultra chips had hyper-threading they would be much better, but then they'd behave the same way as 14th gen. Intel said Core Ultra would be more power efficient than the previous gen; no performance boost is OK, but lol, you have to replace the socket too, so I'm literally disappointed.
Energy efficiency is still bad. The 7800X3D uses half the energy of the Ultra 9 in games and gets 20% more FPS on average in Hardware Unboxed's and Gamers Nexus's tests, which works out to roughly 2.4x the performance per watt...
X3D is a 9700X with some extra cache stacked on top of the 9700X's cache, and they call it 3D. Just like NVMe storage: for more capacity you need more NAND, so... stack more dies on top of an existing NAND chip and call it 3D NAND.
X3D only has more cache. You will only see the performance gains with a 4090 at 1080p low settings, and is that how you play games..... It is a scam, believe me, I know tech very well. Don't trust tech tubers.
The 7800X3D was a 7700X, and the 9800X3D is a 9700X. CPUs have what's called Level 1, Level 2, and Level 3 cache. Level 3 is the last cache; after a Level 3 miss the CPU has to go to RAM, which is why RAM speed matters. Like I said, the 9800X3D is a 9700X. Why? Because the 9700X has 32MB of Level 3 cache, AMD stacks another 64MB of cache on top of that for 96MB total, and now we've got the 9800X3D. So when does this cache matter? RTX 4090 + 1080p + lowest settings, to quickly push more frames... just a fucking scam.
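Whether or not you buy the "scam" framing, the reason extra L3 helps is exactly what's described above: every L3 miss falls through to RAM, which is much slower. A minimal sketch of the standard average-memory-access-time formula; the latencies and hit rates are assumed, illustrative values, not measured figures for these chips.

```python
# Average memory access time (AMAT) at the L3 -> RAM boundary.
# Latencies and hit rates are illustrative assumptions, not measured values.

def amat(l3_hit_rate: float, l3_latency_ns: float, ram_latency_ns: float) -> float:
    """Expected access time: hits served by L3, misses fall through to RAM."""
    return l3_hit_rate * l3_latency_ns + (1 - l3_hit_rate) * ram_latency_ns

base   = amat(l3_hit_rate=0.80, l3_latency_ns=10, ram_latency_ns=80)  # 32MB L3
bigger = amat(l3_hit_rate=0.92, l3_latency_ns=12, ram_latency_ns=80)  # 96MB L3, slightly slower hits

print(f"32MB L3: {base:.1f} ns average")    # 24.0 ns
print(f"96MB L3: {bigger:.1f} ns average")  # 17.4 ns - fewer trips to RAM
```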
@@alejoyugar If you clock the 14900K a bit lower and give it less power until you get the gaming performance of the Core Ultra 285K, I doubt there would be much difference in power consumption.
@@BlackWGame Just adjust the voltage in the BIOS with the CPU Lite Load option by reducing the mode number; if the voltage stays under 1.299V your CPU will be fine.
Please add Warhammer 40,000: Space Marine 2 to your tests! Thank you for a proper comparison, but in several games (Starfield, for example) you need to set 720p, because GPU load is 96%+ for both CPUs. Waiting for 1080p tests with a new 5090 card.
I really don't know what Intel was thinking with this one. They should have done a 14xx-series refresh and just added an 'X' or something to the name, so customers know it's the new and improved 14 series without the stability issues. That would make a lot more sense than releasing a far slower new range with a pointless new naming scheme.
@@rafa2657 A new socket does nothing for consumers though, it just means a new CPU requires an entire platform upgrade, unlike AM4 and AM5. So it is just another negative for customers
@@MaTtRoSiTy yeah but that's what I'm saying. Who is going to updrade your 12th 13th 14th for this 15th gen, considering they need to change their CPU. They should have done what you said and still keeped the old socket.
I'm hoping I should be okay, as I only built my PC in June, and soon after they released the microcode updates. I've installed the latest update and thankfully haven't had any stability issues.
What I don't like about the 285k is the P-Cores only have eight threads. I'm sure some of these thread-gobbling games will take a small performance hit with this processor. I don't see this processor aging very well. Also, the power figure is tricky because these processors like to suck down more power from the 24-pin connector, which throws off the reported power figures in the overlay.
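To illustrate why the overlay figures can mislead here: if a software sensor only reports what comes through the 8-pin EPS rail while the CPU also draws through the 24-pin ATX connector, the reported figure undercounts. A minimal sketch with hypothetical rail readings (the wattages are made up for illustration, not measurements):

```python
# Hypothetical rail readings in watts - illustration only, not measurements.
# Software overlays often report only the EPS (CPU 8-pin) rail.

rails = {
    "eps_8pin": 95.0,        # what the overlay typically reports as "CPU power"
    "atx_24pin_cpu": 40.0,   # extra CPU draw routed through the 24-pin connector
}

reported = rails["eps_8pin"]
actual = sum(rails.values())

print(f"Overlay shows:   {reported:.0f} W")
print(f"CPU really uses: {actual:.0f} W ({actual - reported:.0f} W unaccounted)")
```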
How the fuck can anyone love a tech company that's anti-consumer and doesn't want to progress its technology for the greater good, only for money? Don't forget the time Intel stagnated CPU performance because they had no competition, but hey guys, new architecture, new chipset you need to buy every time, even though it's only a 5% higher clock every single time. And I'm currently using Intel 13th gen; 14th gen was a nostalgic move from them, as well as all the f-ing problems that came from that.
@@ESKATEUK How is it a joke? It's a refresh of 13th gen, same as AMD with Ryzen 9000. 12th gen and 13th gen were a huge improvement. "Nothing released in years", average AMD fanboy.
@@XFXGX An average AMD fanboy who has never owned an AMD CPU or GPU in his life 😂 I've always owned Intel and Nvidia. I'm just not a biased fool like yourself.
@@ESKATEUK I'm not biased, I just look at the performance: product over brand, it's objective. Their 12th and 13th gen were big performance improvements, so it's not true that they haven't released anything good in a while; years ago they were stagnating, though. 15th gen isn't much, so far at least. Probably because they're nerfed in clock speed, 500MHz slower.
Out of curiosity, if it's possible, I'd like to see the performance of these new Core Ultra CPUs with the E-cores disabled. I don't think these CPUs will fare very well without being able to offload onto the E-cores (assuming games even do that). Not having hyper-threading was a huge mistake in my opinion; that's the main reason why many old CPUs are still viable today.
I wonder how they'd compare once the 285K has a 5.7GHz all-core overclock and matched ring bus clocks. Seems they're slowing the chip down. Need to see more OC comparisons.
Excellent move by Microsoft and Intel. Release patches for PCs so that all 13th-14th gen processors start to malfunction, and raise hype by scaring people. Now everyone will be afraid to buy 13th-14th gen and will buy only new CPUs, even though in fact these are the same 13th-14th gen, only with HT disabled and cut frequencies. What can I say, beautiful. The guys at Intel clearly understood that they wouldn't be able to outdo themselves. Bravo!
Thank you for this video! Could you do another test sometime with the 285K OC'd to 5.7GHz, with and without the E-cores, to see if there is any difference?
Fake channel with 500K followers. Unbelievable. Both CPU temperatures are under 50 degrees, really? In all of his videos... Where is the validation for his testing system?
Must say the 285K is an interesting one... I'm so curious what went wrong:
1. Software sending work to E-cores instead of P-cores?
2. The new tile design having bad latency?
3. The new architecture just being bad for gaming?
I guess it's a combination of two or even all three.
Intel really went "Fine, I'll do it myself" when it comes to burying itself in this whole Intel vs AMD thing. And all they had to do was not fuck up like 3 times in a row. Can't make this shit up, man...
Sooo... when we are limited by the GPU they are equal, but once we actually compare them in a less GPU-intensive gaming scenario, the 14900K is considerably faster.
The results are not that bad compared to other videos. It will be more interesting to see the performance of the 285K in brand-new games, when developers can maybe take advantage of the new architecture.
Hahaha are u okay bro? :D I was an Intel fanboy until this garbage; I own a 13700K. Who da fug will care about a 30W difference in gaming? That's nothing. I want and expect better performance from a newer CPU. The worst platform and CPU release in CPU history, I think.
This is their first version of chiplets, or in their terms, tiles. Yes, it is bad; I think they have a problem with the latency between the tiles. In all honesty, I expected a cost reduction due to the chiplet design, but the pricing isn't great. Unless you are using it for production work or you're a hardcore fanboy, there's no reason to get this Intel generation.
In every single test (except Starfield), the 14900K was like 3-5% faster and used 25% more power. I'm wondering if the 285K, when clocked up, might beat the 14900K, and Intel just wanted to sell the power efficiency.
Arrow Lake is a definite regression, but it's not 100% bad: the power consumption is much lower, and in real-world gaming the performance difference will be minimal. What Intel should do is lower the price immediately, so at least it will sell some units. You are still better off buying this than an unpredictable CPU like the 14900K, which degrades after a year. I am more excited to see the 9950X3D; it is alleged to have 100MB of V-Cache for each CCD, so it should have a total of 200MB. That would make it a gaming and productivity monster.
Normally, power efficiency is nice. That probably means better stability with cheaper parts. Though I can imagine companies just making newer motherboards as expensive as the last model while supporting even lower wattage.
I feel like in a year or two it will be good, since this new architecture is made from scratch. Windows needs to get patched for the CPU and motherboard drivers need to mature. Ryzen in 2017 was the same: they had to mature the BIOS and get better driver support on Windows. How the tables have turned.
If the power draw is a selling point for you, just get Ryzen; they're still nearly 40-100% more efficient in gaming than this new 285K, and faster. Intel is a joke.
@@Definedd I even use a negative PBO offset of 20, and now my Ryzen 5 5500 uses 41W max under load and only 12-15W in gaming. That's while still keeping the same default clock speed of 4.25GHz.
@@iikatinggangsengii2471 I remember when the 12900K was barely any better than the 11900K, which was a massive letdown in the first place (in some scenarios, the 11900K was worse than the 10900K, mainly due to being 8/16 instead of the 10900K's 10/20). Then the 13900K came out and it blew the 12900K out of the water.
Had 4x14900K and 1x14900KS. 5 different motherboards, many sets of RAM, custom loop, Noctua and various AIOs. Spent more time tuning the systems and trying to bring the temps down _at stock_, tuning RAM etc than actually gaming or doing anything sensible on it. Those CPUs should never have been released. Over-overclocked from the factory with too high, even dangerous voltages and power. Running at borderline acceptable on custom loop from the factory when running at 100%. I don't want to even mention 14900KS...WTF were they even thinking? Maybe good for LN2, cascade or chilled water loop only. In a way 285K sounds fresh, back to the old ways, can be cooled on high end air cooler even if gaming performance doesn't match, something like what 13900K/14900K should have been. But I also have a 7950X3D system already and it's beating the 285K.
Please make a video with DLVR disabled (bypass), to compare with the stock regulated power draw. In this gen they added a linear voltage regulator to the CPU lol 😅 Edit: der8auer made a video about this.
I wonder if you could just undervolt the i9, or maybe slightly downclock it too, to get the same power consumption numbers and still not perform worse than the Core Ultra.
You people need to start appreciating Intel for doing something about their new CPUs: better power efficiency, overall less power usage, easier to cool, no voltage spiking like on 13th and 14th gen, etc. Give them some time to improve their new manufacturing process and make newer CPUs faster and more efficient. You already know what kind of a disaster 13th and 14th gen were, and now that Intel brings new CPUs that aren't so power hungry anymore, you complain again. I agree that the prices suck, but that's normal for new products. Wait a while until prices come down; otherwise, get yourself an AMD CPU and rock that.
Hey! Don't be so harsh on it, guys! At least the Ultra 9 has lower power consumption, isn't that something? RIGHT?! Still ridiculously higher than the 7800X3D, though...