If you really want to piss Intel off, combine the old and new naming schemes together: i9-15285K. Much easier to pronounce, & the "285K" ending makes it seem like a low-end, wacky i3 variant!
@@benwhite6786 The most annoying part is they could have just called them Core 9 290K, Core 7 270K and Core 5 250K. But instead they decided to take 5 off for no apparent reason.
Wild guess: some Sony full-frame camera with a wireless mic inside his shirt. Reasons: those colors look really Sony-like, and audio nowadays can get crazy good with lavaliers; some really nice ones can deliver 32-bit audio even wirelessly, which could explain the good noise reduction, maybe even with a phone recording general noise to feed the noise reduction. For the picture, I don't really see any kind of strong light pointing at him; maybe a phone with the flashlight on.
I love the sudden push for efficiency. The focus has been on power for ages; we can do with one interim gen where the efficiency boost carries into future gens.
My instinct is that this CPU generation reflects the fact that gamers simply don't buy in volumes that matter in a modern PC market, where decent CPUs tend to be good enough for a minimum of 5 years. Both the volume and the margins are probably in laptops and bulk commercial desktops, where performance has long been over the required threshold but efficiency would make a significant, marketable difference.
@@LoFiAxolotl Even if efficiency is what wins this generation, AMD takes the crown. I'm not sure there is a "huge win for intel" in the foreseeable future.
That's not what's happening though. This is a stop-gap before their new ASML machines and chip fab are up and running. It should be another 12-18 months before that even spins up, though. They have a whole new architecture planned and coming, but right now they can only optimize their current designs. AMD is going to be the clear winner for at least another 1-2 years, as TSMC already has its ASML machines in its fabs, spinning out new AMD silicon.
It does feel like they saw power efficiency as one of the biggest factors for Apple Silicon's success and they're trying to bridge that gap. I'm not too down on this since focusing on power efficiency now means less heat which can lead to a higher ceiling on future chip designs.
@@HerbaMachina It depends; if you don't care about gaming and only care about productivity, then it's better to look at both Intel's and AMD's higher-end offerings.
This generation feels like it's for OEMs and not enthusiasts. I'm sure Dell, HP, etc. will love that both AMD and Intel CPUs are easy to cool while still getting reasonable performance. With that said, I don't think I'd ever choose any of the Intel 2xx series over the AMD options.
We could stop processor development for the next 5 years and it wouldn't be enough time for game developers to catch up on optimizing, learn how to make games efficiently again, and take advantage of where we're at now.
It seems like most AAA devs these days just hope that upscalers can pick up their optimization slack. I think that side of the industry is long overdue for a big shakeup.
Honestly... I'm not CPU-bound in any game or really any application I use anymore (DaVinci, Photoshop and LrC), so I don't really care about more performance at this point... but saving a ton over the year on energy costs I am interested in, so these CPUs actually do look interesting to me, even if I lose a few FPS.
If that's your priority, then AMD is the way to go, and it's going to be cheaper to upgrade in the long run thanks to the longer lifespan of its AM5 socket.
You don't save on energy though. At NA energy prices you'd need 10 years of continuous heavy usage to recoup the price difference versus 14th-gen CPUs alone (never mind motherboard cost); even at European price levels it would still take 5 years. And that's versus Intel 14th gen. Realistically you won't save anything in energy bills and will more likely lose money because it's more expensive. AMD just blows it out of the water regarding power draw and efficiency as well, so it's not even a relative win.
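For what it's worth, here's a rough back-of-the-envelope sketch of that payback math. Every constant below is an illustrative assumption, not a measured figure from the video or the comment, but the ballpark lines up with the "10 years NA / 5 years EU" claim:

```python
# Back-of-the-envelope payback estimate. Every constant here is an
# illustrative assumption, not a measured number.
price_premium_usd = 150                 # assumed CPU price premium vs. a 14th-gen part
power_saved_w = 80                      # assumed average power saved under load
hours_per_day = 4                       # assumed daily heavy-load hours
rates = {"NA": 0.15, "EU": 0.35}        # assumed electricity prices in $/kWh

kwh_per_year = power_saved_w / 1000 * hours_per_day * 365
for region, rate in rates.items():
    yearly_savings = kwh_per_year * rate
    print(f"{region}: ${yearly_savings:.0f}/yr -> payback in "
          f"{price_premium_usd / yearly_savings:.1f} years")
# NA: ~$18/yr -> ~8.6 years; EU: ~$41/yr -> ~3.7 years
```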
We spent years demanding better efficiency and making fun of these companies for making chips so inefficient and hot. They finally prioritize efficiency, and now we're making fun of them for not boosting performance...
Tbh all those upper-mid-range CPUs from the last 3 years are nearly within margin of error when gaming at 1440p and beyond. It's all about the GPU anyway. So yes, I'm happy there's now a higher focus on efficiency. These 5 or 10% perf bumps at an extra 100 watts of power usage were getting ridiculous.
@@morgan40654 You're just glossing over the fact that AMD has now set the benchmark for what a 60W chip can do whilst Intel is sucking back 120W minimum for LESS performance. That's why people are pointing it out. Apple has solved this. AMD has solved this. Both of them top the charts whilst pulling tiny power-draw numbers. Intel is embarrassing itself. Stop being a fanboy.
It's also a launch pad for future generations, with a brand-new architecture and a strong pivot away from the high-power, low-efficiency i9s and i7s of the past. This is good for the future, even if it doesn't look that great today.
@@rmiah Yeah, I just got a Ryzen 7900X, and yeah, it may get 10 fps less compared with Intel, but it also uses less electricity... I'm not changing that CPU for a while: no overheating, clocking 5.4 GHz at 1.25 Vcore and maxing out at 70 degrees Celsius. 😜
@@peterpanini96 AMD has been way ahead in the efficiency game.. Intel is finally cluing in that people don't want to pay an extra $100 a year on their power bill for a few more FPS than the competitor.
@rmiah But why choose Intel? The problem isn't what they're trying, it's that AMD is better at every step: more performance, cheaper CPUs, and the motherboards are for sure way cheaper than the new Intel boards. And the Intel brand is absolutely not what it was, not after the last incident showed terrible support at times, especially given they knew about the problem for at least a year.
@@ehrlichgesagt863 Why did people choose AMD when Ryzen 1 dropped and it was marginally worse, if not equal to, Intel's offerings (ignoring pricing)? Competition is good in this space no matter what your preconceived notions about each brand are. And as later tests in the video showed, Intel's new architecture still holds its own for productivity and AI tasks.
Nowadays, AMD is the best choice for almost anyone, because the latest AMD CPUs use the AM5 socket, which will still get several more years of support. Back in the 2010s professionals would recommend Intel because of reliability, but that is absolutely not an issue anymore. AMD has become the better of the two giants, with of course significantly lower prices.
I kind of wonder what the best way forward is for Intel. When 12th gen came out, it was a huge performance improvement but most of the discussion was around how hot they get and how the i9 was almost impossible to max out before throttling. Now they focus on improving efficiency by crazy margins, and all we want to talk about is how there was a slight performance hit.
You're forgetting about 2 generations of CPUs that were only mildly better but ran even hotter. Like, I think the way to think of this is that they're overcorrecting on an issue they created themselves.
@@bagofsunshine3916 Ryzen 9000 pretty notoriously barely improved either, though. This generation kind of just seems like neither manufacturer is going after the enthusiast segment.
I like this minimal-production, out-in-public video; it reminds me of the first video I watched of Linus over a decade ago (an overview of the MSI P55-GD65).
Yeah, because their reviews popped up before LTT's video, so they added that, but Gamers Nexus's review popped up around the same time as their upload. Not that I expected them to mention them because of their beef, but there's that.
Even at super-expensive energy prices, the price difference to even the inefficient Intel 14th-gen CPUs will take years to recoup, and that's before you take motherboard cost into account. And the AMD CPUs draw way less power and don't need new motherboards.
@@MajinOthinus I'm aware; I'm using a 5800X3D myself and bought an RTX 4070 specifically for its low power draw, undervolted. I just think it's generally good if technology uses less power.
Don't forget everyone, you need to price the chips up and not just use FPS comparison charts. The charts are nice and all, but if (for example) it shows a chip that is fourth down from the top that is half the price of the one that is third down (i.e. faster), that makes a big difference.
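To illustrate the point with a toy example (both chips and all numbers below are made up, not taken from the video's charts):

```python
# Toy value comparison: raw FPS charts hide price. All numbers are
# made up for illustration.
chips = {
    "Chip A (3rd from top, faster)": {"fps": 210, "price_usd": 600},
    "Chip B (4th from top, cheaper)": {"fps": 197, "price_usd": 300},
}
for name, c in chips.items():
    print(f"{name}: {c['fps'] / c['price_usd']:.2f} FPS per dollar")
# Chip B is ~6% slower but delivers roughly double the FPS per dollar.
```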
I’m actually glad to see a clear return to Tick-Tock type roadmaps. Not every release needs to be exciting. Refinement releases are valuable, too. This product is a big pivot for Intel; it was never going to be an absolute win. This is basically their Zen 1 moment
Hot take: I don't necessarily think it's a bad thing if we're reaching a performance plateau for CPUs in general. It might be nice to shift the goals from just raw compute performance to efficiency and optimizations. "I couldn't make it faster, so I made it cheaper" should be a valid business strategy.
13:04 Have you noticed any disparity between the software-reported CPU package power and actual power consumption measured at the hardware level? I know it's not possible to isolate just the CPU's power, but you can isolate power going into the motherboard by using hardware interposers, like GN is doing now (which includes CPU, RAM, etc.), and measure that. That should give far more trustworthy numbers than how Intel and AMD self-report power consumption at the software level. They also use different methods internally for reporting power, which can lead to apparent power-consumption differences between vendors.
Can I be an optimist and ask the following: maybe they are both (AMD and Intel) making major power-budget cuts in the current release so they can skyrocket clock speeds on the next gen? I'm probably using the wrong terminology, but I think most will get what I mean.
I think I agree. This is probably setting up Intel's 1.8nm-class (18A) next-gen processors to make a big splash, and also their GPUs that are supposed to come out soon.
I’m pretty sure both have said that somewhere. This is the foundation for the next generation and with all the headroom they can really push performance.
This is probably the direction the world doesn't want to see Intel go in, but from Intel's point of view, they're probably catering to the smaller audience that cares about efficiency in both performance and power draw. Don't get me wrong, I'm all for getting a little greener, but did we all forget how stupid it was to run the 13900K and 14900K? Now we don't have to rack our brains over the most optimal or bare-minimum cooler to slap on the newer chips. But hey, no one is forcing you to buy the new Core 200 series, and I wouldn't even hold it against ex-team-blue bros, because this is definitely one to sit out.
I would love it if you guys could do an Overclocking for Dummies series. I've tried so many times to do the research and figure out how to do it, but I get scared when I open all these forums and have no idea what they're talking about. Regardless, keep up the great work.
Honestly, I'm pretty happy with this generation. I probably won't be buying it, but it's the first big architectural change in quite a while, and that's a good thing. And it's not exactly a complete dud; seeing the 285K score very close to the 9950X in productivity is pretty good, all things considered. Gaming performance is pretty disappointing, but gains might come with future optimizations, like with Ryzen 9000. The present is sort of gloomy all around in the CPU market, but I think there's a bright future for Intel.
I wish you would add a compile time benchmark to your productivity suite. I understand Chromium or Linux takes an extremely long time to compile for each configuration, but I can compile ReactOS in about ten minutes on an Intel i5-9400 under Windows. It might be a big enough project to test without eating a ton of review time.
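Something like this would be cheap to script, too. A minimal sketch of a compile-time benchmark, assuming a hypothetical ninja-based build (swap in ReactOS's, or any project's, real clean and build commands):

```python
# Minimal compile-time benchmark sketch. The ninja invocations are
# hypothetical stand-ins; substitute the project's actual commands.
import statistics
import subprocess
import time

CLEAN_CMD = ["ninja", "-C", "build", "-t", "clean"]  # hypothetical
BUILD_CMD = ["ninja", "-C", "build"]                 # hypothetical
RUNS = 3

durations = []
for _ in range(RUNS):
    subprocess.run(CLEAN_CMD, check=True)            # start each run from a clean tree
    start = time.perf_counter()
    subprocess.run(BUILD_CMD, check=True)
    durations.append(time.perf_counter() - start)

print(f"median compile time: {statistics.median(durations):.1f}s over {RUNS} runs")
```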
@@sharzo7728 They did compare the power consumption to the 7800X3D and the 9950X, at least for productivity, and really didn't hide the fact that these chips pull less across the board. I don't know what you're looking for.
@@1Percentt1 For gaming it would be worth buying a 7800X3D, but in productivity the Ultra 9 285K beats it across the board and is neck and neck with the 9950X. The 9950X can draw up to 194-200 watts, while the 285K uses a fraction of that wattage. But I can't say anything about the new 9000X3D CPUs, which haven't released yet. Conclusion: the Intel Ultra 9 285K is a fast but lower-power CPU with AI capabilities. Hopefully Intel can use this to create a CPU that's more performant for games rather than just production.
@jeppe1774 Did you watch the full video and read my comment? My comment was talking about gaming. But even then, the Ultra 9 was pulling 220W against the 9950X's 200W. It's not using a fraction of the power; it's using more power. And the 9950X is also faster at gaming.
The naming scheme still works the same way. It just has the word "Ultra" probably to emphasize the new NPU. This is the 2nd generation of Ultra processors, hence the 2xx. The K and F suffixes still mean the same thing, the tiers are the same, and the xx part works the same way still. I find it hilarious how people see a different word and their brains turn off.
Tbh, for gaming these can handle pretty much any GPU out so far, and I don't think I want my PC to burn down for "better" performance that I don't need (for CPU-intensive tasks, I really don't know).
When my 13700K started acting up with the now well-known Intel-gate of 2024, I switched to AMD, the 7800X3D to be precise. My gaming performance significantly increased; only my synthetic benchmarks decreased. But real-world performance you could say "doubled" (I had to downclock the 13700K so much for stability that it should no longer be considered a 13700K). Another added bonus? The POWER BILL. The wattage my PC draws is significantly lower. The 13700K would happily draw 200+ W in ANY game/activity other than being a desktop YouTube sim (and even then it would easily go to 100-150 W). The 7800X3D? 78-82 W, and that's at full power, with better gaming performance across the board. Being a grown adult who has bills to pay, while energy prices are doubling yearly, yes, this is significant.
@@theroyalcam Not OP, but I can tell you that crashes for me are way more likely to be caused by my GPU driver or the game itself rather than the system having a complete meltdown due to the CPU. However, trying to be as unbiased as possible, the YMMV part comes more from the motherboard manufacturer and how mature their BIOSes are; I'm using a Gigabyte Aorus Master and I've had to roll back before due to BIOS bugs.
@@theroyalcam You probably won't experience any crashing or stuttering issues after switching because of the CPU; if they occur, they're most likely caused by the RAM, motherboard, or other components.
@theroyalcam Nope, none at all. Same GPU. Had to switch motherboards, of course, and RAM due to compatibility. Same SSDs, SATA and NVMe. No stability issues, no crashes, no blue screens. On-demand performance for so little power.
@@avenage When diagnosing the horrendous instability I checked EVERYTHING: RAM, GPU, drivers, SSDs, software. To no avail. Clean installs multiple times. The only thing that helped was downclocking the 13700K to what you could call 13500 performance levels...
Intel Ultra 9 285K or the upcoming AMD flagship CPU for MUSIC PRODUCTION (DAW)? I'm talking big projects, professional level. The DAW's support team says the program mainly needs single-thread performance, and on paper the Intel Ultra 9 285K is in 1st place right now with a score of 5132.
@@KYLE-zo4bm Per year it's hundreds of dollars for average use. A 4090 system under load with a few hours of gaming a day can easily add a couple hundred dollars a year at high electricity rates... So no, not just some pathetic $5. Over 5 years that's real four-figure money, a meaningful chunk of an older used car paid in electricity alone...
I don't mind lower power draw or 'better power efficiency' at all... Being from Europe where electricity is a significant cost I'm basically waiting for a more efficient 7800X3D/RTX4080Super equivalent from the new generations so I can finally upgrade from my 3700X/5700XT. Performance will be a huge boost anyway, I just wanna be able to afford to turn on my computer 😅
Might be a strange take but I love the direction Intel is going now. Finally a chip I want to upgrade to from the 9900K, I would get much less consumption and to be honest never had much of a CPU bottleneck anyways in the last few years.
But if you want power efficiency, why not go for AMD? You're going to have to buy a new motherboard either way, so why not make the jump to AMD and ensure the next 2 generations can be run on that same motherboard?
What about the cost of potentially replacing your CPU every 2 years? Does everyone have the memory of a goldfish??? Intel is not getting away with the shenanigans from the last 3 years. Intel really made it easy for me to stop being a fanboy with their one simple trick.
How much is that going to save you, though? If their power consumption only differs by like 30W, then for the price, 13th and 14th gen are still better value if you only use them for 3 or so years.
I am really happy they are focusing on power efficiency; it's exactly what I was waiting for. I don't care that it's not much faster than last gen. It's drawing way less power, which is very important for future innovation.
Loved the review and the breathing bubbles. I would suggest that, besides the bubble, the Intel processors should have blue bars instead of all of them being red (my eyes keep missing rows when comparing 😅).
When someone you considered a friend puts out a hit piece on you right after your sister dies, it's completely understandable to not even acknowledge their existence anymore.
I feel this way too, and I was looking for someone else in the comments who said so. I don't think I can tell the difference between 226 fps and 197 fps, for example, so if that comes with a 40% reduction in power, why on earth is that not a good thing? I imagine I am not tech-savvy enough to know about the fringe cases where that kind of thing really matters.
Filming the intro to this in an airport is wilddd. Also, the naming is ridiculous; at least the previous iX-ABCDY scheme kinda made sense if you put the effort in.
I started saying a few years ago that we've probably reached peak technology (or close to it). There are a number of reasons, but *real* innovation is coming to an end; we've been tweaking OLD technology for decades. The last really new thing was 3D printing, which I first recall reading about around 1990 or so. What new thing is coming? Again, there are many reasons for this. Be happy for any improvements, and energy-efficiency improvements really do need to be the focus.
This generation is not super interesting from Intel, but with the lower power draw, like they used to deliver every odd-numbered CPU gen, I'm super interested in how they'll push performance next generation.
I am honestly fine with the focus on using less energy this generation. While I love some brute performance, I also like my air cooling. If it's priced like how it performs, it seems like a good chip for people like me.
I think people are too focused on FPS and performance, and it's not sustainable in the long run to keep wanting more FPS rather than other things in a CPU. A more efficient CPU will also be easier to cool, I imagine, so we won't need such expensive coolers...
People need to realize that we are up against a wall with power consumption, and ARM is making big strides. If x86 doesn't get its power consumption down, we will start to head backwards.
The biggest reason I can think of to buy this gen, though, is the PCIe lanes. The platform supports up to 44 PCIe lanes, which is a lot more than you can get from prior generations or the competition. This thing actually has some expandability that you used to have to go to Threadripper or Xeon/EPYC to get.
I have an i7-6800K; going to Arrow Lake would be like going from a Vespa scooter to an Audi RS5. I do more than gaming, actually a lot less gaming and more AI-related tasks. This is also the start of a new platform with an upgrade path for years to come. Will I buy this right now? No, I don't have the money right now, but maybe next year or so.
Sigh, it's so frustrating that all the reviewers bash the efficiency. As a gamer I don't care about an FPS increase, because the reality is you aren't getting any. If you're getting a high-end CPU like the 285K, you're going to have a 7900 XTX or an RTX 4080/4090, and despite reviewers' constant focus on 1080p low performance, no one with those GPUs is going to run that; you're going to be running 1440p high/ultra minimum, if not 4K, maybe with RT on, and at that point the FPS is effectively going to be the same. So efficiency is really the only metric that matters.
This is really cool for gaming laptops though! When you buy a gaming laptop you can never push the hardware to the limit due to thermal throttling, so hopefully future-gen laptops will be efficient enough to not thermal throttle... I feel it's kinda scummy how they advertise performance on gaming laptops with a high-end GPU and CPU when their cooling is way too weak to ever hit those peaks for any length of time.
I've been an AMD fanboy for years, but IMO this latest move is really good for long-term viability. Dumping hyperthreading is going to be a performance hit but will really improve security and reduce the need for patches against Spectre/Meltdown-type vulnerabilities. The AMD processors might even wind up slower than these Arrow Lake CPUs after future patches.
FWIW, power efficiency actually does matter to me, a lot. In Northern California they've jacked our electricity prices wayyyy up: I pay $0.43/kWh during the day and $0.57/kWh from 5-8 PM. So even a 40W reduction on a machine I leave on 24/7 (I do, because I run containers and other home-lab workloads on it) would save me about $150.7/year. So if it costs me $200 to upgrade after selling what I have, I save money. This is my plan for servers in particular; it costs me several hundred dollars per year to run my current E5-2680 v4 TrueNAS server.
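That figure checks out. Here's the arithmetic using the comment's own rates; the 21-hour/3-hour day/peak split is the only assumption, based on the 5-8 PM window:

```python
# Sanity-checking the savings above with the comment's own rates; the
# 21h day / 3h peak split is an assumption based on the 5-8 PM window.
watts_saved = 40
day_rate, peak_rate = 0.43, 0.57        # $/kWh
day_hours, peak_hours = 21, 3           # hours per day billed at each rate

annual_savings = (watts_saved / 1000) * 365 * (
    day_hours * day_rate + peak_hours * peak_rate
)
print(f"~${annual_savings:.0f}/year saved")  # ~$157/yr, in line with the $150.7 quoted
```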
[More context in my other comments] I agree with your final recommendation. Still, I hoped you would discuss the marketing claims more, because they cover all the awesome technical stuff. There is a lot of new stuff here: new node, architecture, core types, EMIB, and more. From a purely technical perspective, these chips are really cool, and addressing that in depth would be good context for your results. If you've already made a video on that, my bad, please reference it. If you plan on making one, please let us know so we can keep an eye out. If you don't, I hope you'll consider it.
Other people don't care about power efficiency? That's like one of the first things I look at, because it tells me how well designed the chip is and how close to the edge it's running.
Efficiency doesn’t mean any of that. If you design something to run at 90% and it runs at 90% then it’s well designed and where it’s supposed to be operating.
@@benstanfill363 well to use the tired car analogy, would you buy the new Honda Civic that somehow gets 8 miles per gallon and produces 160 horsepower? Or would you buy that new Toyota Corolla which gets 60 miles per gallon and produces 160 horsepower?
Hate to be a contrarian (actually, I don't), but I actually spent a not insignificant amount of time on my most recent CPU trying to pick the best balance between performance and efficiency.
Not really powerful CPUs but a large reduction in power consumption? These are CPUs for laptops then! Gamers generally don't really care about power consumption for desktops.