CORRECTION: On the slide for Cyberpunk 2077 Phantom Liberty, the chart shows 278 avg fps; however, that was incorrect. It is 235 avg fps. I mistakenly entered the data from SOTTR into the wrong cell, which populated that chart incorrectly. The chart has been updated and will reflect that change in the future. Sorry about that. I am trying hard to add more data points to our tests moving forward and streamline much of the data in Excel, at which I am a novice at best, so please bear with me. I appreciate your understanding.
Maybe you should consider that AMD knew about Intel's CPU problems and also knew they could release a power-efficient CPU, because Intel isn't going anywhere, and when Intel starts moving again AMD has their 3D parts to counter any Intel CPU. I believe the "meh" reviews can't think ahead or back.
@@JDubST I don't know that much about CPUs, but I think I get what you're saying. Because the GPU can push more frames at 1080p, the CPU bottleneck shows up more clearly?
Slight suggestion for some of these charts, like at 7:00 for the handbrake time in seconds to complete, maybe add a "less is better" or "more is better" somewhere so we know if we should be looking at the high numbers or low numbers on each individual chart.
Whenever AMD is lower, then lower is better; whenever AMD is higher than the rest, then higher is better. That's how it works in YouTube comments, just go with that.
They did a lot of architectural changes that require software optimization (check interviews with Mike Clark). So, we'll see in time... They did however miss the opportunity to launch with better pricing. Same-ish perf, lower price, better efficiency, and good foundation for future software -- it would be a good generation. Now, it is just a "skip".
@@bending-unit-22 You are comparing X3D CPUs with X CPUs. Wait till they launch the X3D variants before calling the generation a "skip". Also, these Zen 5 chips are running at 65W; if you set a higher power limit, you'll see the real performance gains.
The fact that the 9600X beats the 7600X while drawing 40 watts less is something truly special. I don't care if it's a small margin; it shows that we're heading in the right direction, especially for power efficiency, and that's what really matters. As far as the CPU gaming numbers for the 9600X and 7600X go, things are a little buggy right now.
These are consumer-grade desktop chips, not laptop chips and certainly not server farm chips. I don't care about efficiency so long as the power draw stays under a level where I can easily dissipate the waste heat. 120W is fine, even 180W is fine. Releasing chips that are 65W is just taking a big L in a market where the customer cares about performance more than they care about saving $10 a year in electricity bills.
@@Taurickk We are talking about power draws in excess of 300 watts, and to make matters even worse, GPUs like the 4090 are burning up too. I have several friends that are in the hole close to $10k on several new systems because of this. I have to set up the older system until something better comes along. If Intel goes, the whole x86-64 ecosystem will go with it. All they had to do was add features and be stable, but profits come first even if everything is a figment of imagination.
Yeah, dream away, my dude. Maybe 9000 will drop in price because it's too expensive. 7000 will probably stay up there, maaaybe a 5-10% drop. 5000 is still popular, and that might see a price cut because they need to sell them out.
Would be good to have either the unit measured or an indication "lower/higher is better". For example, "Blender CPU Test" does not tell me what is measured or what the graph is showing: is it render time or is it a score?
All graphs where lower is better (time to complete) are labeled as such. Everything is relative to everything else, which is really all that matters: 30% faster is 30% faster, whether it's a 30% higher score or 30% less time to complete.
In most cases, they're measured in points... It's a score. And there's no assuming. Higher is always better unless stated otherwise like with rendering times. They do mention that in the graphs.
The thing that makes me glad is that AMD isn't using a giant TDP to reach decent numbers. I'm glad that someone in the market is focusing on decreasing power consumption while getting the maximum performance, rather than raising power consumption to get higher numbers that most people might not even notice unless they're doing an intensive CPU task.
Why does that matter though? I'd rather give my CPU as much power as it wants to get as much performance as possible. You don't buy a super fast car and complain that it doesn't get good gas mileage...
@@RatedX_Gaming And yet, power consumption is often used as a reason why idiots blow their money on nVidia cards. Not everyone thinks like you. If power consumption means nothing to you, get Intel. Who cares if they self-destruct? Super fast cars aren't known for their reliability, eh?
I would agree with you if we were talking about laptops, but we're talking about desktops. And imagine people buying pre-builts... will a reduced price be reflected in less performance?
Performance improvements, substantial ones in some cases, for almost half the power draw seem pretty good to me. And they're finally dominating single-core performance as well. I honestly don't get the negativity. Its predecessor was the 7700X (not an X3D chip), and that thing launched at $400. For non-tinkerers, and especially SIs, this seems like an optimal arrangement. And for enthusiasts it's a fun chip, as it has lots of headroom. More reviews should try it with PBO and a slight undervolt.
You don't get the negativity when AMD straight up lied in their benchmarks? Blue screens and crashes? I totally get that Intel gets hammered, and rightly so, but AMD straight up lying in benchmarks getting a pass... very odd.
@@vMaxHeadroom Intel gets hammered not for their defective CPUs, but for how they are handling the situation. I would also argue that buying a defective product is worse than seeing optimistic benchmarks before a product launch. I don't see how that would be "odd" or even comparable, for that matter. AMD didn't lie in the benchmarks, they just selected results that made the product look better on paper. Pretty standard for everyone, and that's why we have independent reviewers. Still better than what Intel did with benchmarks a couple of years ago.
That's what I was thinking, and I think that's the market these chips are really aimed at. I have an older case that's for a media transfer machine that works great for a 95W CPU or less. So this may be the upgrade path for it once they hammer out the BIOS issues and prices drop below release price. The lower TDP means I can convert it from liquid cooled to air cooled.
The funny thing is, running at the same speed while saving 30% power can sometimes be just as beneficial as running 15% faster with the same power consumption.
While your point stands, the charts AMD published are kinda misleading. Sometimes all you want is performance, not savings. But then again, it's not like AMD is alone in doing it 😆
I'll take 100 FPS using 65W over 115 FPS using 105W because I'm not likely to notice the difference in actual gameplay but I will notice the lower electricity bill
@@longjohn526 you mean you notice if your power bill is $200 compared to $203? Don't get me wrong, it adds up over time, but you're literally talking pennies. I'll take better performance everyday over efficiency.
der8auer used PBO to boost the wattage and got dramatically better benchmarks. It was like a different chip; it's just held back on power. Feels like it would be good for ITX builds where you have to use a tiny air cooler.
PBO also heavily depends on the silicon lottery, like every other kind of overclocking. We know that CPU manufacturers run CPUs closer to their limits in general nowadays, so PBO often doesn't have that much headroom, and it takes a really good piece of silicon to squeeze more out of it. der8auer didn't use a large sample of CPUs to confirm his results in a more general sense.
Well, electricity is really expensive in Europe and especially in Germany. So if the Intel CPU uses almost twice the power, a more expensive power-efficient AMD CPU would still make more sense. Especially, if you are using the PC every day for several hours.
AMD CPUs consume more power at idle than their Intel counterparts. Only the instabilities of Intel 13th and 14th gen CPUs seal the deal for AMD now.
After watching der8auer's video, I've come to the conclusion that to get a true picture of the generational uplift you really need to run the 7000 series in Eco mode, or use on the 9000 series the same aggressive settings AMD was using as defaults for the 7000 series. Both AMD and Intel were pushing stock CPU settings higher than they normally would just to get those few extra percent of processing power, and AMD (correctly) decided to go back to the old way of more conservative settings, but with headroom to overclock if you want. Just enabling PBO on the 9700X gives a significant gain in performance.
Jay, you should do handbrake with x265 and AV1. x264 is becoming legacy pretty fast these days and it's likely they didn't even try that in benchmarking for the launch. Just a thought.
@@DanielPetti You beat me to it ... I came here to suggest that it was added full (not double pumped AVX 256) AVX512 support in Zen 5 that makes the difference ... if you're testing something that can use it.
The cost of the cooling solution should also be part of the price comparison. Going with a $20 cooler with an AMD CPU while needing a $100-$200 cooler for the Intel CPUs to avoid high temps and throttling, should be part of the pricing.
not just that, electricity costs, too. call me old, but efficiency matters more to me than raw performance or "value propositions". it's why i've been choosing AMD over Intel since Ryzen+ and Nvidia over Radeon.
This is why I'm a huge fan of using proper SYSTEMS to test a component. If you need to spend more on the motherboard, cooler, and PSU all because the power draw is crazy, that adds up fast. Also, all that extra power draw is exactly why Intel is having so many chips die on them. I hope we see the end of 300W CPUs very soon.
Thank you for helping me get into computers, this page is single handedly one of the most helpful I have come across 🤘🏼 wishing you guys all the success
Jay, are you testing Intel Extreme vs Amd PBO unlimited? Because that is what you should test. Otherwise test the most restrictive profiles for both AMD and Intel.
Jay, you need to list the power used by the CPUs when running some of these tests. I don't care if Intel beats AMD when it requires 2x the power to do so.
But he does tell you? Both AMD processors ran at 65w while the 14600 and 14700 ran at 180w and 230w respectively. Going off memory, but those values should be pretty close
When you look at the wattage being used, it's impressive that the Ryzens are able to do rather well, but they are not generational-leap processors. Fingers crossed that next time it'll be something amazing to talk about.
From what I've seen I'd suggest these are actually a pretty big generational leap when you take the power limiter away, which is why they can match the competition while consuming a measly 65 watts
@@foldionepapyrus3441 It's interesting that they called these X processors when really they are base models you can OC like all their other base chips.
I got a 12900KS (basically a 13700K) for £200 ($250) about 2 years ago. I know, bargain! No issues with it, XMP works fine, and it would obviously still keep up here!
There are some other things to consider as well, upgradeability for example. If you choose a 13600K you may be able to upgrade to the direct follow-up generation, but after that you will most likely need another motherboard as well, while AM5 will probably stick around as long as AM4 did.
As consumers we seem to forget that we are not these companies' largest or most profitable market. These low-end/mid-range CPUs are often used for servers (take a look at Hetzner's offerings, for example) where power efficiency is a HUGE deal. As a consumer and enthusiast, what I'm curious about is: if it can put up these numbers at 65W, what is the OC headroom and how much power can it handle? Everything has been binned and pushed to the brink out of the box for years. Also, if your competitor's only recourse has been to crank the power to keep up, why not rub it in by dropping yours?
PS: I haven't watched the whole video, I stopped after Jay described his benchmark specs. Maybe he explains it later, but the settings make his results not comparable and not worth my view. On to der8auer
You mean like home servers? Data centers mostly use the epyc and xeon line-ups. Occasionally a top tier desktop chip is used if it needs that single core perf.
@@manon-gfx game servers and smaller servers use a lot of consumer chips. as i said you can check hetzner for an example, large scale data centers of course use epyc, etc.
@manon-gfx I'm more talking about website hosting for smaller businesses, etc. You don't need that kind of compute, cost, or power consumption. The other part of it is that they sell massive amounts of these types of chips to OEMs, who like the lower wattage because they can skimp on cooling and power delivery. The return to 65W likely means they are cooking up new GE parts (35W), which will be used for new generations of thin clients. Lenovo is the largest PC seller in the world and ships literal boatloads of them to businesses around the world. Most of the outdated business towers are replaced with thin clients now.
@@TheMaztercom The real boost will presumably come with the Ryzen 7 9800X3D; since it draws less power, it will allow higher frequencies and performance, surpassing the Ryzen 7 7800X3D, which already gets the most fps.
@@TheMaztercom Yeah, bro... not what I said. What I'm saying is that it's important context. If the 9700X were pulling the same amount of power, that would be pathetic and not clarifying what's going on gives people the impression that's what's happening, when that isn't what's happening at all.
It would be awesome if these benchmark videos included Dawbench as well. It's hard to get a sense of how new processors match up when doing audio editing!
Thank you for actually comparing these to Intel and actually saying "It's your decision on whether or not to go Intel". Someone like me would rather have the ease of use with Intel and RAM actually working and running at its XMP speeds. I've got an i7-4700K and I've had zero blue screens or instability; not saying it won't happen. But it's almost guaranteed that AMD will give you headaches when setting up (as Jay said, they dealt with it all). Not shocked that AMD's charts are so off; they've been caught lying in this release already. Don't trust anything they say or do. They've had to cut too many corners to compete with Intel and it's clearly not working; these results are shocking, I didn't think they would be THIS bad. Sure, they're efficient... but I'd much rather have raw power over efficiency, especially with a new launch. Drop power-hungry, powerhouse chips to get enthusiasts interested, then drop the efficient chips later for those who care. This is more of a refresh...
@@David-xl8zf My 12700K reaches 158 watts max with 12 cores and 20 threads. The 7700X never had good power efficiency in the first place (in fact, it's just bad; a 7800X3D gets similar performance at half the wattage). The 9700X is a massive improvement.
@@David-xl8zf A 9700X at 150 watts is still worse than a Ryzen 9 7900X. The Ryzen 9 chips and the X3D chips are the ones with good performance per watt; vanilla Ryzen 5s and Ryzen 7s are overshadowed by Core i7s and i5s.
@@saricubra2867 You can't compare R9s and R7s on a meaningful nominal basis relying on their power consumption. An R9 will always be better at 150W, even single-core. "Overshadowed" is relative. As I said, AMD CPUs have the handbrake on in terms of power consumption, more so than Intel's. The "default" settings are a joke in my book, which is why the new generation ends up with "meh" reviews at best. At least they aren't eating themselves, yet. Although I have a strong suspicion regarding the recall before they were released.
@@dragoonxgamer idk bro, I'm using my 7600 non-X at 5.2GHz all-core with 1.1V core and 1V SoC (it can do 5.5 at 1.2V core, but I like to stay under 65C at full stress). Imagine the undervolt, OC, and temps on those 65W TDP chips. Maybe it draws more power with PBO, idk, I don't use that. He said the chips stayed at 65W while testing, so they didn't put any OC or PBO profile on those tests; they could be even better.
Just built my new system yesterday, an SFF build with the 7800X3D paired with a 4080. Coming from a 3900X + 2080 Ti, the performance increase is outstanding; games I could barely run I'm now running as if it were nothing. Can't see myself upgrading to anything again for another 4-5 years as I did with my last build, and I can't see many games being much more demanding than they currently are in the next couple of years anyway.
Brace yourself, he's about to cuss you out because he doesn't understand how expensive electricity is in some places. Places with 50 cents/kWh will cost $58.44/year to run an extra 40 watts for 8 hours/day. Much of the UK and Germany is between 50 and 60 cents / kWh once you convert from Pounds and Euros.
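Sanity-checking that figure: the $58.44/year works out exactly if you assume 365.25 days per year. A minimal sketch of the math, using the commenter's own numbers (40 W extra, 8 h/day, $0.50/kWh), not measured data:

```python
# Yearly cost of an extra block of power draw at a given electricity rate.
# Figures are the commenter's: 40 W extra, 8 hours/day, $0.50/kWh.

def extra_yearly_cost(extra_watts, hours_per_day, price_per_kwh, days=365.25):
    """Added electricity cost per year, in the currency of price_per_kwh."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * days
    return kwh_per_year * price_per_kwh

print(f"${extra_yearly_cost(40, 8, 0.50):.2f}/year")  # → $58.44/year
```

At typical US rates (around $0.18/kWh) the same gap would be closer to $21/year, which is why opinions on efficiency differ so much by region.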
@@pwn3426 Having two PCs is a waste of space and a pain for switching between tasks. Imagine flexing at someone who spent 4.5k bucks on his rig. lol. Intel rulez, kiddo, accept it.
It's the BIOS, there is something up with the BIOS. You experience these issues on the 7000 series as well if you update the BIOS. I have been digging in on this and troubleshooting it, but something isn't right: it's causing my 7900X to not hit clocks that I hit before the update (5.7GHz single core / 5.4GHz multi core); after the update it's 5.5GHz single at best, and 5.2GHz multi. I have also noticed the thermal throttle limit is now 80C, even if I set the limit to 95C. Maybe it's something to do with the new curve setting, idk; I don't have a 9000 series chip to try mucking about with it, but I just have a feeling it may be related. Also, I am using an Asus ROG Strix X670E-E Gaming WiFi mobo.
I am not sure this is a fair comparison. You tested the Intel CPUs with extreme power delivery settings, yet PBO is disabled. 7:28 Turn it on and you see what happens. (Spoiler: they rip.)
What makes these CPUs frustrating is that, if we aren't happy with how they perform, going out and buying Intel instead is one of the most questionable decisions you could make when building a PC right now, IMO. So now we're stuck with mediocre AMD CPUs and Intel time bombs. It actually really sucks.
Yeah... Was thinking of upgrading my 5800X into the 9700X if the rumored performance gains were true. Sadly they're not. Will have to see, Der8auer tried manual overclocking and there actually seems to be fairly good headroom for that. Still, needing manual tuning to reach what was advertised isn't exactly a good look for AMD.
@@oxaile4021 OCing AMD is very simple. And most importantly, you can set the temperature target to, for example, 90 degrees, so you won't fry your CPU like with Intel.
You get a pretty big performance boost with PBO, gains like 15% ahead of the 7700X. Just really weird to bog the CPU down so much with the default power limit though.
Kudos for running tests at JEDEC, the documented, warranty-covered RAM speed. But Intel and the AMD 9000 series officially support 2x2R DDR5-5600. I get the issues with running outside of documented warranty settings, but this seems to gimp fully supported settings.
Would love to see how they stack vs my 5700x3d. Playing 1440 with my 6700x, my guess is none of these will improve my rig by enough to make it worthwhile.
@@MisterKrakens with the same power draw, yes, the 7700X needed 50W more to match it, but then the 7700(non-X) is 2% under it, with the same power draw as the new part. nothing makes sense
Those are absolutely awesome CPUs for pretty much anything a consumer would want to do. I can see AMD having problems down the line where their 5000/7000 CPUs are "good enough" for normies for the next decade or so...
I hear ya. With my main two systems (desktop and laptop) I'm still on 12th gen Intel, with a Core i9 12900H in my desktop (yes, an AliExpress special Erying mobo with a laptop chip) and an Intel Core i5 1235U in my laptop, and they crush everything I throw at them on Manjaro Linux. I personally don't feel the need to upgrade them for at least a few years.
I'm still rocking a 5950X so I'm not tripping right now. Not to mention they just released a few new AM4 CPUs; it doesn't look like AM4 is going anywhere anytime soon.
I have a feeling this will be one of those cases where AMD doesn’t perform well at launch but ages really well once they get their drivers figured out. LTT mentioned in their video that when they enabled PBO performance went way up along with power usage. Which might be fine for most desktop users.
I'm on my PC almost 15 hours a day for work and gaming. I'm pretty satisfied with my Ryzen 7 7800X3D with a Sapphire Nitro+ Radeon RX 7900 XTX. My electricity bill dropped almost $200 a month with improved performance. Pretty amazing! My previous build had an Intel i7 3770K with an EVGA 1070 FTW.
I guess i'd still go with AMD because i don't have a nuclear power plant in my backyard and since i'm from germany i gotta save what power consumption i can. Power is fucking expensive here because of our... politicians that are really trying their hardest... To fuck everything up...
So glad to be a 13600K owner. Zen 5 is really disappointing... The Ryzen 9600X should already have 8C/16T. It's been 7-8 years since Ryzen came on the market, and at least a little difference should show.
You are comparing a 14-core (6P/8E) CPU to an 8-core CPU, which is not a fair comparison; otherwise you are comparing a Ferrari to a Fiat Uno. The Ryzen you should have used (also released in Q4 2022) is the AMD Ryzen 9 7900X, which has 12 cores. If you do that, the 13600K doesn't look that good: the 13600K's 14 cores/20 threads vs the 7900X's 12 cores/24 threads; the 13600K's base clock of 3.5GHz P-core/2.6GHz E-core vs the 7900X's 4.7GHz; the 13600K's max turbo of 5.1GHz vs the 7900X's 5.6GHz.
@@ctk4949 It was a question people could do simple math to solve on their own. Performance/watt is kind of a copout when no one asked for laptop CPUs to put in their mid-tower computers at roughly the same price. AMD should have released CPUs that performed better with higher TDP instead of worrying about people's power bills (which still isn't even a thing, since most people don't run CPUs at full load 24/7).
@@ctk4949 Same. I hadn't come here for a while, but the way this guy answers anything is rude. He won't care because of his numbers, of course, but I've unsubscribed; last time I follow a launch here.
The moment you explained the type of crashes the AMD CPUs caused, I instantly figured that it had to be a RAM issue. So you should do some retesting when you get a proper updated BIOS for the AMD systems. I had similar issues with my system until I updated the BIOS, and after that I had a rock-steady system that was only bothered by weird Windows issues. But then I also made sure the RAM I used was on the QVL that covered my CPU, and on AMD systems that seems to be sort of important.
JayzTwoCents' way of interacting with his comment section is sure peculiar. He only answers comments that disagree a bit or more with him, and he's super rude. 😅
The 7600X released in 2022 for $299, which is $334 in today's money. And the 7700X was $399 on release, which is $445 in today's money. It's $359. That's a win for consumers. Stop whining about current gen sale prices being lower, that's literally how this always works. Be happy that we're getting lower prices.
But what’s being compared is the current price you can buy them all for. Launch price being lower when adjusted for inflation is great, but the 9700x is currently a good bit more expensive than the 7700x (which it performs nearly the same as but is more efficient) and the 7800x3d which stomps on it in gaming performance which is what most people likely care about. Prices from 2 years ago will not affect current buyer sentiment.
Twisted logic. You don't charge a premium for it being "new", you charge it for being better. When the "old" alternative is much cheaper and still kicking, good luck with those prices. Imagine pricing these nothingburgers even higher. lol.
@@flinx Oh yea. Made a trip to micro center got the bundle deal. Switched the memory and the motherboard out CPU was still only $224. Cheapest I could find new anywhere online was $384. Yupper I will take that deal.
This is great for me as a music producer who enjoys gaming. Single core speed is the most important factor with audio production. And saving power is important to me. That makes the 9700x more compelling than the competition.
Didn't think of that, pretty impressive: in just 4 years they managed the same performance with half the cores at almost a third of the power. Why would you replace a workstation CPU with a mid-tier one though?
Someone check me on this, but I think the power consumption isn’t as expensive as I thought: 13600K vs 9600X with 125W and 65W tdp (not perfect measurement, I know), used at those exact levels for 10 hours/week for a full year yields $5.62 higher costs for running the Intel (if you pay 18¢/kWh)
But Jay tested the Intel chips with the extreme power profile. The 13600K would consume more than 180W with those settings, while the 9700X would consume 88W. In that case, Intel would cost about $8.60 more in power per year. But that's ignoring the more expensive components needed for Intel CPUs, i.e. cooler, power supply, and maybe motherboard depending on other features. Not sure why Jay used the extreme power profile when he knows how much we trust Intel's power management right now.
@@valdrich472 yeah, exactly! I also saw that on his benchmark settings slide (at 7:29) about the Intel extreme power profile. I wish he would have discussed the other components like you mentioned too. It seems odd for him to overlook some of these fairly basic considerations.
The math is basically impossible to do, but you also need to take into account the extra power your AC unit would need to keep your room cool. That is almost always the most significant factor when it comes to your power bill. That being said, it's probably no more than 10-20x your value. Pretty crazy how cheap power is.
@@michael.petraeus Agreed! I just did the basic exercise after reading a few other comments about upfront CPU costs vs future electrical bills (very much only one angle of many). Jay not addressing some of this made me feel it was not as objective as it could have been.
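For anyone who wants to redo the running-cost math in this thread with their own rates: a minimal sketch, using the comment's TDP figures (125 W vs 65 W, 10 h/week, $0.18/kWh) as a stand-in for measured draw:

```python
# Yearly electricity-cost difference between two CPUs.
# Inputs are the thread's figures: 13600K at a 125 W TDP vs 9600X at 65 W,
# used 10 hours/week at $0.18/kWh. TDP is only a rough proxy for real draw.

def yearly_cost_delta(watts_a, watts_b, hours_per_week, price_per_kwh, weeks=52):
    """Extra cost per year of running the higher-draw chip."""
    kwh = abs(watts_a - watts_b) / 1000 * hours_per_week * weeks
    return kwh * price_per_kwh

print(f"${yearly_cost_delta(125, 65, 10, 0.18):.2f}")  # → $5.62
```

Plugging in the extreme-profile numbers mentioned above (180 W vs 88 W) gives roughly the $8.60/year figure from the reply.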
@@destiny_02 True, but with the 7700 being so close to the 7700X, and the 9700X basically being the same as the 7700X in terms of performance, just get the 7700. You're saving yourself money, you don't have a huge performance loss, and you're getting the same efficiency.
@@AssassinIsAfk Nope, NOT the "same shit". Do you pay the power bill in your household? Probably not, since you can't see how that changes things. At 50c per kWh, I will spend an extra $1.50 per week to run the 7700X, for the time I run my PC (80+ hours per week). That's 44 weeks to the "break even" point when the extra power consumption cancels out the purchase price difference, after which the 7700X is costing me MORE than if I bought the 9700X in the first place. Since I intend using this CPU for more than 1 year, financially my best choice is the 9700X...
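The break-even math above can be sketched generically. Note the assumptions: ~37.5 W of extra draw reproduces the commenter's $1.50/week at 80 h/week and $0.50/kWh, and the ~$66 price gap is inferred from the 44-week claim, not a quoted price:

```python
import math

def break_even_weeks(price_gap, extra_watts, hours_per_week, price_per_kwh):
    """Weeks until the cheaper-but-hungrier chip has eaten its price advantage."""
    weekly_cost = extra_watts / 1000 * hours_per_week * price_per_kwh  # $/week
    return math.ceil(price_gap / weekly_cost)

# ~37.5 W extra matches the commenter's $1.50/week figure;
# the $66 gap is an assumed value implied by their 44-week break-even.
print(break_even_weeks(66, 37.5, 80, 0.50))  # → 44
```

At lighter usage or cheaper power the break-even point stretches out fast, which is why the two commenters reach opposite conclusions from the same formula.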
One big key you didn't talk about in the average gaming fps charts is power per frame. Basically the Intel chips look like they are dominating but they're running more than double the wattage in some cases. Put in a normalized average fps chart at like 65 watts. Yes you talked about it at the end a little but most people probably didn't make it that far.
65 watts and it's competing with the big boys. People always think of the upfront cost but forget electricity, which, depending on where you live, can be fucking expensive.
If you knew how to actually calculate power usage you would see that you would never make back your money when comparing actual real work usage. Obviously no one uses full load constantly
There's also the cooling and PSU to take into account. Both CPUs seem very easy to cool and will allow a cheaper PSU alongside the GPU. You won't need to waste money on an AIO to get a CPU that stays chill.
Sure, little bro, every generation should be the same just with less power? Is your brain small? That sht should be within the same generation: new CPUs from the same generation but with less energy usage, not a new generation. Fanboys are fanboys.
Just felt like I had to share this; I have a 12900k that I crippled in multicore performance for the sake of gaming performance. I'm currently running 8p/4e with hyper threading off @5.2/3.9GHz 1.36v. Temps don't climb above 83c during stress testing and my CBr23 scores at these settings is between 19-19.5k multicore/2-2.1k single score. On par or better gaming performance than the 9700x as an entire bundle that's only $40-$50 more than the 9700x itself? Us late 12900k owners hit the jackpot. $400 for a 12900k/Asus Prime z790 and 32GB of G Skill DDR5 6000 was absolutely insane. Not sure how they do it but THANK YOU MICROCENTER!!!
Given the numbers being posted on 9000 so far, I'm going to at least wait and see how Arrow Lake looks. I was ready to jump on a Ryzen 9900X, but this calls for patience. I do like the reduced power draw, no doubt. It's nicer on the power bill, and easier to cool. Still.... will be patient for my new build, and see if Intel can get its act together with Arrow Lake at least.
@@Definedd I'm not looking for bleeding edge gaming performance rather I need it for productivity which I feel this is gonna be miles ahead of the r5 3600 that I own.
Two things: try faster memory, and try unlocking the power limit (it is kind of overclocking, but it is clear AMD hard-restricted their CPUs to be able to say they are more efficient).
For me personally, power draw is typically the top aspect I look at. Watching them get the same performance for 60% less power is a massive win in my book. Plus, access to the new generation stuff (when it becomes cheaper) is a nice bonus.
I'm not satisfied with this review. You can clearly see that the 65W TDP hinders multithread performance. Yes, this choice was made by AMD and it's not your fault, but you could have at least checked whether the multithread performance scales with a little more power. You even mentioned how much power the Intel CPUs were using.
So your complaint is that you were shown an out-of-box experience instead of an overclock attempt? How is it their (Jay's) responsibility to hide a shortcoming?
Brother, I'm on a 5600X, and it looks like I'm staying put till the 9800X3D, if it's going to have a lot of uplift compared to the 7800X3D, or until Intel releases something better...
With the tests, could you take the score and divide it by the CPU power draw? Performance per watt would be nice to see; that's where I think these new chips will win.
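That per-watt division is easy to do yourself from any review's charts. A tiny sketch with made-up score and wattage numbers (purely illustrative, not data from the video):

```python
# Performance-per-watt from benchmark scores and measured power draw.
# The scores/wattages below are hypothetical placeholders; substitute
# real averages from a review's charts.
results = {
    "9700X (65 W)":   {"score": 19000, "watts": 65},
    "13600K (180 W)": {"score": 20000, "watts": 180},
}

for name, r in results.items():
    ppw = r["score"] / r["watts"]
    print(f"{name}: {ppw:.1f} points per watt")
```

Even when the higher-draw chip wins the raw score, the points-per-watt column can flip the ranking, which is the comparison the comment is asking for.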
I'm still running a Ryzen 5 5600 (non-X), and with a 4070 Super I have yet to run into any gaming issues. I only play at 4k60fps but this combination is solid. I guess I'll wait another generation or two?
@@surpriseblueviana3803 You need to "get back to earth" and learn what undervolting is 🤦🏻♂️. And as tested, these are OUT OF BOX results as of today; the microcode fixes are not available yet. But even then, they won't make that big of a difference other than heat.
Hello Jay, please include in your graphs at minimum the 1% and 0.1% lows. If we are talking about ~150 fps, that 0.1% low is already annoying: you can feel it, but you don't know where it comes from... A very nice and simple review. The blue screens are indeed a hidden bonus for the new-gen CPUs, and unfortunately for Intel it's even the case for their older generations. I will stick to my 7800X3D a bit longer...
The main upgrade between the 7000 and 9000 series is power efficiency. While the video acknowledged this, and one thermals graph was shown, I feel the processors' strong suit should have been showcased in a little more detail. Also, I'm definitely not even considering an upgrade until the X3Ds come out.
That's the problem with fanboys: they don't see the positives of the other side, only their flashy Intels sitting there dying from voltage issues and needing a whole nuclear plant to run at 2000°F; I think you get me. He's pissed that Intel screwed themselves because they think they're still number one, lol.
"Faster" 😂 mate , its one of the lowest generational improvements... I got an amd cpu myself but amd fumbled hard to take advantage while intel is imploding
Except they're only marginally faster, and 7000-series chips have been available for less than their MSRP for well over a year now. So they are actually a lot more expensive. The tray version of the 7700, which is probably the most accurate 7000-series comparison (no included cooler, same TDP), goes for around $220 vs. the $380 of the 9700X. The 7600 can be had for around $160 vs. the $280 of the 9600X. That kind of price difference for a single-digit performance improvement is crazy.
@@Dragoneye2427 It's simply not. At stock settings, AMD chips perform at exactly the stated TDP in an all-core workload under the highest boost they can achieve. Intel measures TDP with an all-core load at the *base* clock, and its chips therefore draw more power than the TDP would suggest, while AMD's draw exactly what it suggests. If you then go in and enable other settings, then yes, they will draw more power... obviously. But to state that they draw "80w base" or "140w at stock" is simply nonsense.
I mean, Jay, the 16-core variant is going to suit you perfectly; please go for it when it's available. This thing is better than a 14900K whenever you can get by with fewer simultaneous threads than the 14900 provides. The efficiency alone is worth the upgrade.
These Zen 5 cores are gonna do great in handhelds like the GPD WIN4 and the like. A 30 W chip won't be as 'cut down' as it used to be. For the desktop, however, I'm not too impressed, and my 5600X is happy to stay where it is.
If AMD really made their CPUs more power efficient, that's all they had to do, in my book. Most people do NOT play games, and they do NOT encode videos either. We don't give a flying duck about your bloody "CPU speeds". Give us more efficient CPUs every day! So that we don't have to buy CPU coolers the size of a football and 10 other coolers for the PC case; it's gotten absolutely ridiculous now! GIVE US MORE EFFICIENT CPUs!!!
I disagree. Building a new architecture for a whole new generation of CPUs after the highly successful 7000 series and impressive 8000-series APUs, only for the new chips to simply be more power efficient, would be a huge let-down for such a renowned CPU manufacturer as AMD. It's basically rebranding the 7000-series chips with a lower TDP, which would be disappointing at best. Despite the success of the 7000 series, Intel's high-end 13th- and 14th-gen chips (setting aside the obvious issues) are still faster, at least for productivity. AMD needed to step up their game if they wanted to take over as the CPU king, and efficiency alone wouldn't cut it. Contrary to your statement, most people buy Ryzen BECAUSE it's so much better than Intel for gaming, especially the X3D models. As I previously mentioned, building a new architecture from the ground up only for it to be a copy of the previous-gen chips with better thermals would be a waste of time, money, and resources. It would make AMD a laughing stock.
And if you don't care about the damn CPU speeds, then by your logic you might as well get an Athlon. Faster clocks = faster overall performance in your PC, be it gaming, productivity, or normal daily tasks like browsing.
I'll be honest, I went from the 5600X to the 9600X. I've noticed a 20% increase in frames at max settings in-game, and my PC seems more responsive as well. I haven't overclocked and don't intend to. I think it was worth it overall when moving from AM4 to AM5. I experienced some of the crashing until I removed all the older drivers and the previous Windows 10 install.
Regarding the Handbrake performance, I think you might want to check whether AVX-512 is enabled in Handbrake. Pretty sure that's where all the improvement comes from.
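As a hedged sketch (my own helper, not anything from Handbrake itself): on Linux you can at least confirm the CPU advertises AVX-512 by scanning the "flags" line of /proc/cpuinfo for avx512* feature bits.

```python
# Check for AVX-512 feature flags in a Linux /proc/cpuinfo dump.
# This only shows what the CPU supports; whether Handbrake's encoder
# build actually uses AVX-512 is a separate question.

def has_avx512(cpuinfo_text: str) -> bool:
    """True if any avx512* feature flag appears in the cpuinfo text."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            return any(flag.startswith("avx512") for flag in line.split())
    return False

# Real use on Linux: has_avx512(open("/proc/cpuinfo").read())
sample = "flags\t: fpu sse avx2 avx512f avx512bw avx512vl"
print(has_avx512(sample))  # True for this sample flags line
```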
@@SSYoung125 Intel state it affects all 13/14 gen parts with a TDP of 65 watts or more. (That includes the 13600K.) [I'm not saying that I agree with "What", just that you're wrong.]
I am not surprised at all since there is no increase in L2 and L3 cache. I am not sure what AMD was thinking when making Zen 5, but I can say, they were not thinking much.
@@Basil_Ieaf PBO bumps it to 105 W and the performance gain is no more than 1%, so there goes that argument. And btw, this is a desktop CPU; I'd honestly take even a 150 W AMD CPU with much better performance.
@Basil_left You can power-limit the 7000-series CPUs (or buy the non-X SKUs) and get the same margins, though. These really don't move the needle on the value-for-money aspect...
The power efficiency of these new processors cannot be overstated. At a time of rising energy costs and pressure on many governments to reduce energy usage, I can see efficiency soon becoming the driving factor for consumers over performance. In the EU, electricity per kWh costs roughly twice what it does in the US; over the lifetime of a CPU, that's a lot of money that can be saved.
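The back-of-envelope math here is simple: kWh saved = watts saved / 1000 × hours of use, times the price per kWh. A minimal sketch, where the 60 W delta, 4 h/day of load, 5-year lifespan, and €0.30/kWh price are all illustrative assumptions, not figures from the review:

```python
# Lifetime electricity savings from a lower-power CPU.
# All inputs below are assumed example values, not measured data.

def lifetime_savings(watts_saved: float, hours_per_day: float,
                     years: float, price_per_kwh: float) -> float:
    """Money saved over the CPU's lifetime from drawing fewer watts under load."""
    kwh_saved = watts_saved / 1000 * hours_per_day * 365 * years
    return kwh_saved * price_per_kwh

# e.g. 60 W less under load, 4 h/day, 5 years, €0.30/kWh
print(f"~€{lifetime_savings(60, 4, 5, 0.30):.0f} saved")  # ~€131 saved
```

At double the electricity price, the same usage pattern doubles the savings, which is the EU-vs-US point the comment is making.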
The power efficiency seems very exciting for the SFF PC community, 65W TDP is quite a magical number for low-profile air coolers. Personally I'll wait for the new X3D chips, though.