We're currently investigating a possible discrepancy within a subset of this video's gaming benchmarks. We are confident our findings will not change this video's conclusion that AMD's X3D chips are the better choice for gamers. That said, this video will still be updated if any numbers are found to be outside of an acceptable margin of error.
We've come full circle. Looking back at AMD in 2014, pre-Ryzen, versus now in 2024 is crazy. Intel used to have an unchallenged crown. AMD has engineered themselves out of an impossible hole.
@@teknoid5878 currently, Intel is at 130k employees, AMD at 40k, but still, that's roughly a 3x difference. How does Intel have that many employees and still manage to screw up? Edit: Fixed my miscalculation, I wasn't thinking clearly because I'd just had a workout.
8:02 If lower is better, then the lowest temp should be at the top of the list. Humans innately view up as better, so putting the better result at the top makes the chart intuitive.
Their graphs are just atrocious. Like how they seem to randomly base positioning on either 1% lows, averages, or in some cases no apparent reason at all (such as whatever the hell is going on at 3:50).
@@JJFX- I think you're being a little dramatic. For the graph at 3:50, the sorting is a little strange; I wonder if they're sorting on the average of the 1% lows and the average values.
@@theslenderfox I'm being a bit hyperbolic because graphs aren't complicated and they're one of the largest tech channels on the platform. I stopped trying to figure out if there's logic to it but that doesn't seem to be it either. You'll find strange placements on a few of them.
As a European, where electricity is a bit more expensive, I like the efficiency. Plus they're cooler, so less heat in my room, especially now in the summer when it sucks even more.
The summer aspect could apply to Americans too: pumping less heat into the room means the AC doesn't have to blast as hard, and AC is expensive to run for hours even in areas with low electricity costs.
People used to not care about power efficiency when CPUs drew 20 or 50 watts, but now, a PC can be a veritable space heater and if it's always on in the summer, it matters.
How many years of slightly lower energy consumption would it take to pay off the increased price of these CPUs, though? That's assuming you don't enable PBO to reach the advertised speeds. I had the same issue with my 3600X and my 5800X3D. I did 'fix' it with undervolting, but that shouldn't be required to reach the advertised speeds.
Linus, a slight suggestion: can you guys improve the color coding so it's easier to read at a glance? Like using red bars when HIGHER is better and blue bars when LOWER is better (or highlighting the lower/higher values).
Babyface Linus: Base Form Bearded Linus: Second Form Trimmed Linus: Third Form Slim Shady Linus: Ultimate Form Edit: HOLY CRAP THE LIKES ON THIS!!! Thanks, everyone!
@@bukujutsu8382 Yes, and also future series, whatever they are named. Just pick the correct B650E board, as there are sometimes differences in PCIe lanes, features, etc.
Absolutely incredible that the 5800X3D is punching this hard compared to architectures that are two gens newer, and still winning in a few scenarios. Long live AM4.
As someone with an AM4 socket in a 5-year-old PC, long live AMD. It's CRAZY that I still have purchasing options for this socket 5 years later. This is the sort of consumer-focused approach I need from every single brand out there.
As a teenager.. never cared about power consumption… OC everything! i9-x all the way! As an adult.. Holy crap UNDERVOLT EVERYTHING (I live in California.. we pay up to 60c/kwh 😭😭😭😭)
@@mirek190 yeah.. I have my friggen AC set to 28C from 4-9pm which is when it's 60c/kwh it uh.. gets hot in my gaming room.. I still have that i9-9900X 🥵 9900X3D might be the next upgrade 😂
Yikes. That would make your electricity arguably more expensive than ours in Germany now, where last year's insanity has thankfully calmed down a fair bit. I guess solar power setups are selling like hotcakes? You could go broke just from running the A/C like that. (Side note: I highly recommend the recent Technology Connections video on awnings. _It's just physics, it's not rocket sc... errr..._ anyway, you get my drift.) Undervolting goes way back for me to the mid-2000s, when I was running dual passive-cooled PIII-667EB @ 500 MHz @ 1.5 V in modified Slot 1 adapters. Nowadays ThrottleStop is my best friend (Rocket Lake isn't the ideal basis and I envy my parents' i3-12100, but yeah...).
9:42 To be fair, on mobile a dedicated accelerator makes much more sense than on desktop, where the power-efficiency, performance, and thermal constraints are so different.
AND in 24/7 datacenters and servers, as well as workstations, where time is money and using more electricity than needed is money needlessly going out the door.
@@kiloneie Yes, but it needs to be covered by warranty, or at least there should be a power limiter setting, either in the BIOS or in Ryzen Master, that AMD is willing to cover with warranty.
I legit just came back to this video after watching Jay and Gamers Nexus. idk why Linus's results were so good compared to Intel. Edit: so yeah, after looking into it, their results are the same. It's just that they talk about their results very differently, especially when comparing to Intel in productivity and at current market pricing.
As evidenced by their recent issues, I've come to understand that LTT is not the channel you watch for accurate reviews of computer related parts. Even if they have the lab now, I'm not sure why this is still the thing with their reviews.
LTT did the same thing with the Snapdragon review. LTT cherry-picks results to put the product in the best possible light so the sponsor stays happy. These are the same scumbags who tested a water block incorrectly, refused to spend $500 to do proper testing, slandered the company's reputation in the process with bad results from an incorrect test, and sold off the property when the owners requested it back.
@@michaelkeggereis3232 Nvidia cant hear him over the sound of leather jackets flapping in the wind (created by the various data centers buying their products)
@mitsuhh I am referring to overall wattage. You're going to need a power supply just for the GPU. Rumor has it the next-gen chips will up the wattage on every SKU.
I am very confused how this is the only tech channel that shows the 9700x beating x3d chips when all other channels have the 9700x lagging behind significantly
and the cycle continues: 2004 - Intel flopped with the Pentium 4 (Prescott) while AMD was winning hard with the Athlon 64 and their first dual cores. 2014 - AMD flopped and almost went bankrupt with their Bulldozer arch while Intel was winning hard with their unbeatably efficient Haswell architecture. 2024 - Intel flopped with their hot and unstable Raptor Lake CPUs while AMD is winning hard with Zen 4 and Zen 5. Though AMD in 2024 isn't just competing against Intel anymore; they're currently the only company able to fend off the ARM invasion, and they should continue to innovate or else Apple and Qualcomm will take their place. So no, AMD won't give us 5% performance uplifts and infinite refreshes like Intel did back in 2014.
RMA rates even of the 13900K/14900K don't really show any difference to Zen 4 - yes, there is a problem, but after over 2 years still almost nobody is affected, and the 99.99% who are not affected even get 5 years of warranty... According to the RMA rates, it's a similar effect to the dying Ryzen 7000 CPUs a few weeks ago that had to be fixed with an AGESA update... Then we have a 1% difference in applications and a 4-5% difference in games, and not even a cent of difference in energy cost - both Ryzen 7000 and Intel 13th/14th gen get very close to 100 degrees, so no real difference there. Do you even remember how bad Bulldozer was compared to Intel at that time? Clearly not - in Crysis: Warhead you could be happy with half (!) the FPS, not today's non-noticeable 1-5%... Intel still has more than double the turnover of AMD, is investing billions in fabs, has billions in cash, plus way over 70% market share... AMD almost had to close the company and had to get money from the state to be saved - a completely different situation... Back to now - Ryzen 9000 - Hardware Unboxed: "Disaster", "Extremely bad value". AMD says Ryzen is making a lot (!) less turnover, and that was before the underwhelming 9000 series was released... Epycs are great and market share shows the difference - desktop not so much...
@@SURO90 Myself and several people I know have had an i5-13600K for the last year to 1.5 years with no issues thus far. I think it's definitely a small number of people affected, but they publish it like it's 25% of users or something.
That's a double flop for Intel, first they canceled Meteor Lake desktop because it was slower than Raptor Lake, THEN Raptor Lake turned out to be a broken mess.
In my opinion AMD is "un-pulling" a lever for future use. Early Ryzen had TDPs this low, but as Intel got desperate, pulling every lever it could like frequency and power, AMD decided to also crank the power lever (not to the same degree, of course) to stay clearly ahead of Intel. That's why we had a considerable increase in power draw/TDPs with Ryzen 7000. I think they saw what to expect from Intel in the near future and decided to invest almost all of their architectural performance improvements into power efficiency, bringing TDPs back to where they were prior to Ryzen 7000 while being, on average, a tiny bit better than Ryzen 7000: lower default power draw and higher efficiency while maintaining good performance. And Ryzen 9000X3D will deal with Ryzen 7000X3D. So I expect a minimal uplift in performance this gen, and in the next ones the same TDPs as this one, so all performance improvements will be noticeable. The only exception to this correction of bumped-up TDPs compared to Ryzen 3000 seems to be the Ryzen 9s, like the 9950X with a 170W TDP like the 7950X versus the 105W of the 3950X. I assume they preferred to leave those untouched either because they thought the strategy wasn't that useful at that tier, or because they planned to use the Ryzen 9s as spearheads to show off all the performance gains of this gen, so people don't look too much at the restraint of the other parts, and also to eliminate the possibility of a power-cranked Intel Arrow Lake winning.
Conspiracy theory: The delays were caused by AMD pushing a BIOS update that changed the 9700X from a 105W CPU to a 65W CPU after they saw the Intel disaster. That would explain the regressions, as they wouldn't have had time to make sure all tests outperformed the 7700X. It's unlikely to be real, but if anyone wants to compare BIOS versions from May that supported the then-future 9000 series vs. those released after July 24th to check and be sure...
Arrow Lake is going to be a joke. Intel pulled a Microsoft, like choosing HD DVD over Blu-ray, but unlike Microsoft they refused to backpedal and admit they were wrong, and decided to double down instead. It is quite literally killing their business. IMO I would love to see them pull their heads out of their asses and stop CPU manufacturing altogether; it's a joke at this point and only drags their name further through the mud. Switch to only focusing on their GPUs, which are actually GOOD, and be happy with that before it's too late.
@@mokahless It could sound feasible, but I've been following Ryzen 9000 leaks for quite a long time, and it leaked quite a while ago, pre-Intel disaster, that the 9700X and 9600X would have 65W TDPs. In fact, not long ago it was rumored (and could be fake) that they were considering a last-minute change to raise the TDP, I don't remember if to 120W or 165W (I think the latter), to not look that far behind the 7000X3D, but they ended up deciding to stay with the original plan and not push that much power. I would say those rumors were from just before the whole Intel debacle; since then, all leaks have centered on what happened with Intel 13th & 14th gen.
@@mokahless The best information we have is that there was a validation error with some early production runs and they have to rerun the validation. This is causing the staggered launch instead of everything at once like they were going to do, and it will take longer for the top of the stack because they may have to down-bin those chips.
6:43 Only 88W? Can't wait for the X3D variant of these! Living in a tropical country, those new CPU/GPU space heaters are just.... Now, if they made a GPU like this, that would be something else!
Look up how much power a Ryzen 7700 consumes compared to a 7700X. You guys failed to realize that these CPUs are non-x CPUs in disguise. Those efficiency numbers wouldn't be that noteworthy if LTT had added a Ryzen 7700 into the mix.
@@WrexBF No, these CPUs are just like any AMD CPU, except as if they were undervolted for a 10% drop in performance, which usually nets 40% less power consumption (it's not linear). Combined with a decent die shrink and branch-predictor improvements, they get a small performance uplift while drawing significantly less power. WHICH is a response to Intel's problems, to give people no reason to doubt AMD's stability, AND an answer to Qualcomm's invasion and their BS power-efficiency claims, along with all that never-ending ARM hype that needs to die already (same BS as graphene).
Probably not, at least as long as Dr. Lisa Su is around; we would probably still see them improve even more. But AMD is a public company, so never forget what they did from the Ryzen 3000/4000 to the 5000 series, when they jacked up the prices. So expect it in the future if they're truly on top. Intel was no different from AMD during AMD's Bulldozer days: they'll increase the price as much as they can.
A company is a company; without competition AMD will do the same. Lisa Su doesn't underestimate Intel: they are finishing their factories, and they already lost to the Athlon 64 with the Pentium 4 and then came out of nowhere with the Core 2 Duo. Nobody expected that behemoth in those days.
@@epickh64 Except AMD took the lead and hasn't given it up since Zen 3. They're already the leaders, and they HAVEN'T gone full Intel, so there goes that constantly regurgitated theory. 🤷🏿♂️
@@XDSingularity If all companies ran the same, they wouldn't need to worry about who holds leadership positions. Hell, they wouldn't need CEOs. They could follow The One True Business Strategy that people who don't understand economics swear exists.
The efficiency is the big deal here....I live in California where power is stupid expensive....Assuming a steady 144 vs 88 watt draw 24 hours a day (for simplicity), that's a 56 watt difference, or 1.344 kWh per day and 40.3 kWh per 30-day bill cycle.....My utility's standard residential rates are 38-60 cents per kWh depending on time of day (we have mandatory Time of Use rates).....that's roughly a $15-$24 a month difference......around $185-$290 a year.....JUST FOR THE CPU. This is just an example and I don't know how realistic the scenario is.....but it would be the worst case with a 24/7 draw like that. And yes, I know how to read my rate tariff sheet to do this math.....I work for my utility company.
@@ABaumstumpf 0.056 kW * 24 h * 365 d = 490.56 kWh * $0.60 = $294.336 / yr Looks right to me. Please explain where you see an error. My only question is: What is he doing that his CPU is pegged 24x7x365?
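The math in this thread is easy to sanity-check with a few lines of Python. The 56 W delta and the $0.38-0.60/kWh time-of-use rates are the commenters' own figures, so this is just their back-of-the-envelope calculation made explicit, not measured data:

```python
# Yearly cost of a constant 56 W draw difference (144 W vs 88 W),
# using the commenter's claimed California time-of-use rates.
delta_kw = 0.144 - 0.088             # 56 W expressed in kW
kwh_per_year = delta_kw * 24 * 365   # ~490.56 kWh/yr at a constant 24/7 load

low_rate, high_rate = 0.38, 0.60     # $/kWh, claimed rate range
cost_low = kwh_per_year * low_rate   # ~$186.41/yr
cost_high = kwh_per_year * high_rate # ~$294.34/yr

print(f"{kwh_per_year:.2f} kWh/yr -> ${cost_low:.2f} to ${cost_high:.2f} per year")
```

The $294 top-end figure matches the reply above; the real-world number would be lower since no gaming CPU is actually pegged 24/7.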
When choosing a CPU based on power consumption, it's important to factor in what you will use it for. If you're taxing the CPU the majority of the time, AMD is the clear winner. If you are going to be idling or doing light work, Intel will come out ahead with its lower idle power draw.
@stargazer7644 He was just using 24 hours a day, every day, for simplicity. Obviously it's not going to be running full tilt all day every day, so the money saved is obviously somewhat less, but it's still money saved regardless.
61% of the power for slightly better performance than the previous gen is absolutely efficient. Just imagine if they used 100% of the previous gen's power on the current gen; it would absolutely obliterate the previous gen. They are avoiding the Intel way of just using all the power the chip can gulp, and look what happened with 13th and 14th gen.
Yes except all core loads isn't the whole story. Real world power usage in gaming loads that they didn't show is only slightly less than the 7700X and higher than the 7800X3D on average. It achieves similar performance in heavier loads like this despite the lower clocks because of the larger cores combined with improvements to branch prediction, SMT, etc. In lighter loads that care more about clock speeds it's barely improving over last gen unless you allow it to pull more power. While this is a good indication of what's to come from AMD, they have to get the clocks up for me to really get excited. I don't think they were able to get as much out of TSMC '4nm' as they were hoping but it'll be really interesting to see how the 9950X does.
I suspect they're trading performance for efficiency on purpose. Tactically beating Intel by a little bit, and getting ahead just a little bit further, so they have room to push performance at the cost of power if they have to compete again.
That's because, for the first time ever, AMD had a reason to stop chasing those last few % gains that cost double-digit efficiency losses, which people could regain by undervolting etc. For a 10% loss in performance you can usually drop power consumption by around 40%, which is exactly what AMD did here, most probably as a response to Qualcomm's invasion, on top of the fact that they were already listening to consumers about power efficiency last year due to the (most probably artificial) energy crisis.
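The rough "10% performance for ~40% power" rule comes from dynamic CMOS power scaling roughly as P ∝ C·V²·f: lower clocks allow lower voltage, and the voltage term is squared. A tiny sketch with illustrative numbers (the 15% voltage drop for a 10% clock drop is an assumed round figure, not AMD's actual V/f curve):

```python
# Dynamic CMOS power scales roughly as P ~ C * V^2 * f.
# Assumption for illustration: a ~10% clock reduction permits
# a ~15% voltage reduction on a typical V/f curve.
def relative_power(v_scale: float, f_scale: float) -> float:
    """Power relative to stock for given voltage and frequency scaling."""
    return (v_scale ** 2) * f_scale

p = relative_power(v_scale=0.85, f_scale=0.90)
print(f"~{(1 - p) * 100:.0f}% less power for ~10% lower clocks")  # ~35% less
```

The squared voltage term is why small clock sacrifices near the top of the V/f curve buy such outsized efficiency gains, on Zen 5 and everywhere else.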
Small note about the graphs. Some of the "lower is better" graphs were a little confusing as you had to look towards the bottom of the graph for the best results. Maybe it's a good design ideology to keep better results at the top, so it's easier to tell which is which?
Who cares if it dies? Someone else will buy their licenses and fabs. I mean, they'll have earned it. If they manage to pull themselves back up then cool.
You're right, but Intel lied to us. They added fake cores and super-overclocked 11th gen and sold it for 3 generations. Instead of recalling the bad CPUs, they tried to hide the problem and buy time.
Watching Hardware Unboxed's review, it seems that AMD got extremely lucky with LTT's game selection. I think it would be nice if MarkBench could be used to expand the game selection, because 4 games doesn't average out the noise well.
Finally! A comment that brings up my exact same concern. I watched hardware unboxed right before this one and I’m really disappointed in LTT’s review. I feel like this video is just a paid AMD advertisement
@@Z-Crebs LTT focused on the incredible efficiency gain because, outside of the NA bubble, it is an actual big deal where electricity is expensive. They also explicitly state in the video that the last 3 games are CPU BOUND, which explains why the uplift was greater, while their GPU-bound F1 results are within margin of error of Gamers Nexus's results.
AMD is showing the industry that, even now that generational performance uplift is basically negligible, they are still innovating with every series and not stagnating like Intel did from 4th gen until Ryzen appeared on the market. Also, in the bigger picture, it's good for the environment if we can accomplish the same task at the same speed while decreasing consumption.
I think it’s quite impressive that AMD has actually made their CPUs use LESS POWER than Intel’s, considering their switch to chiplets and the power requirements to maintain the chiplet interconnect. Kinda makes me hope AMD releases some Strix SOCs for AM5, but then getting rid of the iGPU so that we have full access to all the PCIe lanes. Those would be WAY less of a power hog.
I would agree, but if I get the same performance for less money, I'll go for the cheaper one, environment doesn't matter to me, only price and performance.
No. Every SINGLE time you see a chip consuming less power than the previous generation, it means performance has been taken from you. It happened innumerable times with Nvidia; every time Nvidia got applauded for "efficiency", the reality was that they just cut the die size and increased margins. Today it has become ridiculous. And remember that we have a new process node, so that's more efficiency by default. Even worse here, AMD invested a 28% larger transistor budget for basically no performance gain. We just have to hope this was done to take down old Zen walls.
I would like to see an update to this video. You only tested 4 games (which is way too few for a major consumer CPU like this, imho), and Hardware Unboxed has replicated those tests, coming to very different results on 2 of them.
I loved the animation at 5:02; it made it really clear which bars you were talking about. But I think slight color changes between AMD and Intel, and between current and previous generations, would make it a lot easier to see all the necessary information in the quick glances we get in the video.
I personally like that Power usage was a focus. So no I wouldn’t really prefer more power for barely a 1% increase. This is called good design decision.
This new generation of AMD CPUs, which draw so much less power than Intel's, will be a big freaking deal in countries like India. That fact alone is enough for thousands of people to upgrade, because electricity is not cheap here.
The most interesting thing about this launch is comparing it with the launch of the previous generation. Back then, there was a lot of criticism that AMD had increased the TDP for Ryzen 7000, and thus the power consumption of all variants was higher when you compared Ryzen 7000 with Ryzen 5000. Now AMD has listened to the criticism and brought the TDP and power consumption of Ryzen 9000 back to Ryzen 5000 levels, and it's wrong again? The tech press is quite strange; this is meant for all tech journalists in general. You can easily increase the TDP on all AMD CPUs if you are okay with more power consumption, or you can use PBO, for example; every user can decide for themselves. But back when Ryzen 7000 launched, the criticism was that AMD had increased the TDP and that the power consumption was too high compared to the performance gained over keeping the old TDP. The criticism at the time was also that this meant every buyer had increased power consumption, not just the extreme tech nerds who overclock their CPUs or who need the higher performance.
The reason why is that you have to OC to reach that performance number that the CPU easily seems capable of, at a still reasonable power consumption. That means a customer's warranty could be impacted to get the most out of the CPU when that wasn't the case for the last generation. I never complained about 7000 series power consumption in the first place though, since it was always reasonable compared to the abomination that has been Intel's power draw.
It's still pretty up in the air what the actual power draw of the 9000 series CPUs is compared to the 7000 series; they lowered the TDP but also reduced the base clock a lot.
I think the issue isn't people using PBO, but whether or not using PBO and Overclocking the CPU will be covered under warranty, because in most cases manufacturers will refuse any RMA even if the device is under warranty. If they determine that the CPU was overclocked, then that automatically voids the warranty.
6:15 I'd argue that higher performance on their most expensive products is better than lower power consumption. These CPUs are used in professional environments where an employee's hourly rate costs more than a couple hundred watt-hours of electricity.
Went from an 8700K to a 5800X, and just got my 7800X3D. Super happy with AMD efficiency and temps. Never going back to Intel. Their instability these days is due in part to how damn POWER HUNGRY their chips are. No thanks. I will continue to reward the company who's actually innovating and getting more efficient.
Intel doesn't have a power instability issue (though that doesn't help). It's voltage. Intel is pushing these chips WAY too hard out of the box to hit those sky-high clocks, and most of them can't take it, silicon lottery and all. I've personally seen voltages of 1.6v on a factory-default BIOS. Yikes.

With modern chips (think 2004 and newer), anything above 1.35v is too much and will see degradation in a few hours to a few years depending on silicon quality. 1.35v will very likely see degradation but should last the effective lifetime of the chip. Depending on loadline settings, 1.35v might be too much and will require dropping to 1.3v or lower. 1.25v is generally safe, and 1.1v is guaranteed to be "completely safe" on anything but outright defective chips.

Granted, you may have run sky-high voltages from day one and still be running with no issues 10 years later. The reason is that your chip had headroom (probably a lot) to spare. If you compare peak achievable clocks from day 1 to 10 years later, you'll see a difference (but not a huge one; if you're running 10 years with no issues, you got a good chip).

These are all at-load voltages and must be measured using external tools, as software tools won't catch the momentary spikes and dips (dips are what cause crashes). When at idle, higher voltages are generally fine to run, as there isn't much current running through the transistors (current is the primary thing that causes degradation), but some sensitive parts of the silicon (the ring bus on Alder Lake/Raptor Lake, which is believed to be the primary cause of the instability) will degrade even at idle. Lots of theories as to why, but unless you have a SEM or are Intel, you won't know for sure.

Intel is shipping these things at 1.5-1.6v. No duh these chips are having issues. Hopefully the patch Intel is releasing this month addresses that.
I'm probably going to take the lazy route and wait for 5700x3d to hit $100 on aliexpress. A platform change would yield larger performance gains, but the idea of taking sff machines completely apart makes me want to put off taking them apart for longer.
9:45 It's said that 9000 is more expensive than 7000, but the article shown says the opposite? Is this comparing 9000's launch MSRP to 7000's current sale prices?
I believe this is an error from LTT; it is clearly cheaper in all SKUs compared to 7000 series...unless I'm completely misinterpreting the Forbes table they screenshot?
Yes, launch price to launch price, 9000 is cheaper, but it is of course more expensive than the current 7000 pricing. With little to no performance uplift, you should consider 7000 over 9000, at least until prices drop.
@@nikolakostic5667 At this point I've watched the gaming tests from Level1Techs and Gamers Nexus, and they're all very different from HUB's. I think HUB did something wrong while setting things up (he said something about that and memory issues).
@@nikolakostic5667 I don't think different methodologies could make those differences. Someone, either Linus or Aussie Steve, must have a defective unit.
@@nikolakostic5667 HUB's numbers seem well aligned with other sources (GN, TechPowerUp). It's not really surprising that LTT's gaming numbers deviate when they tested a grand total of 4 games...
LTT benchmarks are bad. They were on a livestream once and didn't even realize one PC was at a different resolution for the WHOLE stream, and then they used those results in the next video.
So are we going to see a peek back at this? No one else is reporting these results; they all show performance parity between 7000 and 9000. No response on this is frustrating given your history of bad data.
I'm just loving how AMD is pushing such performance with low power consumption and temps. In Australia, where the weather is hot and electricity is expensive, it's always welcome.
@@Limitbreakur I agree. Granted I live in an apartment but compared to other places my electrical bill isn't high despite a price hike for my electrical company.
@@noobbotgaming2173 In Quebec, with my dual-rate plan, I pay 4.5 cents per kWh. The catch is that I pay 5 times more when it's under -12°C, but that's rare, and I use gas for heating when it's that cold. It's so cheap that my electricity use is not even a consideration in the summer. Still, I underclock my GPU in the summer just to reduce the amount of heat in the room.
As much as there's a lack of Ryzen 3 options, I feel like you could probably get a 5700X3D for the same price as one if they released it. I feel like they just treat last gen's CPUs as the Ryzen 3 tier of the new gen without rebranding them.
I haven't watched LTT for a while, and I couldn't help thinking there was something different about Linus. I couldn't tell exactly what it was, but halfway through the video I finally noticed: it was his hair color.
Seems AMD is taking the Apple approach. I think it’s a good strategy, since continuing to sell older products means that you don’t need to do any R&D to make a low-cost product and instead just mark down the old products to slot in to a new budget sector.
@@fujinshu It doesn't just seem like it, it's true! It's just another way of talking about Apple. And AMD has TSMC to manufacture their chips, so their task is simple: design them...
@@fujinshu I don't think it's really an Apple thing... this approach has been around for a good while now. Back in 1995, if you had the money you'd buy a Pentium system, while on the budget end 486 machines (generally DX/4-100 or AMD 5x86-133) still sold merrily even if the tide was turning once the Pentium 75 came out. Similar patterns go back into the 1980s... I think at some point in 1990 you would have been able to buy anything from a lowly XT-class (8088) machine to a 486.
@@fujinshu It's because TSMC does manufacturing for AMD. For AM5, TSMC needs to use more recent machines and they have a limited number of them, AM4 units can be produced concurrently with older equipment. So the choice is to manufacture old tech or nothing.
01:00 AMD's Ryzen 9000 series shows significant improvements in performance and efficiency.
03:16 New Ryzen chips outperform previous generations, especially in gaming scenarios.
05:54 Productivity performance shows mixed results, with some gains but not across the board.
08:20 AMD's efficiency improvements stem from new architecture and lower power consumption.
10:46 Ryzen 9000 chips offer solid performance, but budget options still hold value.
11:00 The B840 basically makes no sense to buy and is just an upselling scam. People can live without the fancy stuff like OC and USB 4, but PCIe 3.0 is way too old and a big step back. Even the cheapest A620 boards come with PCIe 4.0.
I really don't see the problem with AMD ignoring the Ryzen 3 tier. If you don't have the budget for a new 9000 CPU, buy an older (even used) chip, because the price-to-performance will be so much better. As Linus himself said, budget CPUs/GPUs are almost e-waste outside of very specific use cases.
yeah I don't get why people keep complaining about the latest gen not having budget options on day1. I mean new stuff is never good for budget anyways - esp. with AMD, you don't buy on day1 if you want good value. AMD prices tend to come down significantly after a while. And since you're looking for budget not top performance, just take a look at all those cheap budget options from last gen or even earlier. AM4 is still kicking so well for that very reason, it has incredible budget options.
On one hand, yes. On the other, it would be nice if they had a 9600 non-X at launch, priced lower. It's important to remember that AMD's architecture is very efficient, so you won't have many "cheaper" bins. But they will likely release a 9600 non-X regardless. They could have taken a slight hit and released it immediately, which would have given them better PR.
The motherboard and RAM prices just don't make sense for a low-end chip anymore, although I'm sure AMD could still do something interesting for the low end regardless.
@@ToasterTR I partially agree. It probably wouldn't be that hard to make a cheap mobo that only accepts 65W TDP chips, with PCIe Gen 4 (Gen 5 is still costly AF for now). On the other hand, it wouldn't be "cheap enough" to warrant having "Athlon" CPUs on it. Yes, you could put a 9600X or 9700X on it, but then it isn't budget anymore. Plus you couldn't overclock them, because if the board is designed for 65W TDP, the power delivery supports maybe up to 100W to the CPU (slightly more than the 88W the 9700X uses) as a safety factor.
@@RealEclipsed The "budget CPUs are mostly money pits" point was mostly about Pentium/Celeron-tier chips, not i3/R3-tier chips. For productivity, yeah, a previous-gen i5/i7 or R5/R7 is similar in pricing and performance. For gaming? A current-gen chip with the highest possible IPC is better. We've already seen the R5 7600 compete with the 5800X3D in some (but not all) games and handily beat the 5700X. In all likelihood a Ryzen 3 7300X or 9300X would beat a 7600X or 7700X in gaming - we saw the 3300X beating the 2700X in all but the most thread-heavy games.
I’m quite happy about the super low power consumption. It is a big deal in my location. It’s interesting that when the two manufacturers are not evenly matched, the leading one steadily decreases power consumption while the other one increases it. It happened with AMD and FX in the past; now it has happened with Intel.
LTT did the same thing with the Snapdragon review. LTT cherry-picks results to put the product in the best possible light so the sponsor stays happy. These are the same scumbags who tested a water block incorrectly and refused to spend $500 to do proper testing, slandering the company's reputation in the process with bad results from an incorrect test, and then sold off the property when the owners requested it back.
Thank God for AMD - I was able to get a CPU/mobo/RAM combo with a 14700 ($400). They're upgrading to AMD; I am happy with what I have (coming from a 5600X).
@@light3267 yeah I was thinking the same thing, right from the beginning where Linus is radiating positivity, while Hardware Unboxed determined the 9700X to be a flop
LTT used different games, so that might make all the difference, and they also focused on power consumption, while Hardware Unboxed looked only at gaming performance, where the gains did not show up in their games. I don't have a problem with either approach, to be honest. It would be better if we could somehow merge the reviews together, but this is why it's important to check multiple reviews.
The F1 numbers are the same as in other reviews. I don't remember seeing anyone else test Rocket League or Returnal, so idk about those games, but it is only 4 games and they do feel a bit cherry-picked (though F1 and Rocket League are always tested by LTT, so maybe it's just a coincidence)
@ThourCS2 My guess is lack of time, with the launch being pushed back. Reviewers had to move quicker to run tests and produce the video. It might be neat to see reviewers team up and split the workload to test different games and CPUs for better analysis.
In their video where they went through the process of finding several identical CPUs to use for GPU benchmarking, they showed that Counter-Strike has some really funky numbers going on, and using it as a benchmark is a complete waste, probably at least partially because of how hilariously easy it is to run
@@cembaturkemikkiran4109 It is funny to me how much heat my computer dissipates when the air temperature is already at 30 degrees Celsius without it. It's noticeable, but manageable.
Saying the i3 14100F doesn't have any competitors isn't quite correct I would think. From a few reviews I glanced over the good ol' Ryzen 5600X still beats it in Gaming and is available for the same price, with an abundance of cheap motherboards available as well. Of course it's an older chip, but who cares?
I am still running a 5600X paired with RTX3070 and they can handle all the games I want to play at 1440p 60 FPS. Waiting for the X3D variant of these chips and the RTX50XX series for an upgrade.
@@linko994 The 14100F or 13400F can beat the 5600 with DDR5, so it's mainly about the full combo you can build with. Besides, the 7500F is still the best price/performance for a build.
@@michalsnaiberg2734 Running a 5600X with an RTX 3060 and yeah, running most of my games at 1440p 60fps, somewhere between High and Ultra (game depending). I know I’m gonna be upgrading in a few months so this is helpful.
8:10 - I thought that AMD changed the way temperature was measured on Zen 5? I don’t think it’s directly comparable in this way and it may be misleading to present this.
Wish the 5600X was included in these benchmarks; realistically, people with this chip will be the ones looking to upgrade more than people on newer hardware
Are you telling me that this processor has 600 KB of cache in a world where nobody would ever need more than 640 KB of memory? Crazy! The future is here!
The 5600X is usually faster, but then you could also go a bit higher on the Intel side with the i5 12400F which beats the 5600X at the same power draw.
Really, only four games tested for a new CPU launch, and the results are significantly better than in any other review? It seems LTT has learned absolutely nothing about accurate benchmarking... Zen 5 completely sucks for gaming and there are virtually no gains at all compared to Zen 4. The 7800X3D remains the lone gaming champ...
Why not compare it against the 65W part it is replacing? Then the power efficiency is not that compelling any more. Plus the price has gone up, and they no longer include a cooler. The 9000 performance is underwhelming to say the least.
@@TheKirik71 That would require AMD improving their own GPUs to the point of being able to compete on either price or performance. And if that happens, I'd call it a win regardless of what happens to Nvidia afterwards.
Actually everything is going down... look at the world lol. Of course this two-gen failure has some effect, but for comparison: AMD stock was worth $207 in March 2024 and is now at $128, so it dropped about 40% in just 5 months
Quick mention that this was supposed to be on 3nm, but got ported to 4nm early on because of MI accelerator and 3nm laptop chip allocations. This is why it has issues, doesn't clock like it should, and didn't get the gains it was supposed to. Client always gets screwed
Bigger model numbers that don't match most prior CPU launch increases, with a disproportionately higher price tag. AMD is already turning on the "screw consumers" signal. It didn't even take a decade for the switch.
Because you don't even need one if you have a dedicated GPU. I'm tired of all of this NPU bullsh*t that these companies are pushing. NPU was designed for mobile usage since it's energy efficient. You don't need one for a desktop since most people probably already have a GPU.
@n_core An NPU works differently from a GPU or CPU, as it attempts to emulate the functioning of brain neurons. Currently, nothing in game development utilizes NPU features and hardware. However, an NPU could be used for AI NPC skills, allowing NPCs to learn in real time against real players and scale game difficulty accordingly. In my opinion, NPUs mark the beginning of advancements in the artificial intelligence of NPCs in video games. If a GPU doesn't have neural network hardware, I don't see how your GPU could run it
@@vyor8837 AMD has actually had to release an official statement addressing the failure of Zen 5 to meet claims about performance gains. Just Google 'Ryzen 9000 Series Community Update: Gaming Performance'
It's not a jump; these benchmarks are fake. The 7800X3D is better in gaming than both of these chips and will actually be a bit faster than the 9950X - AMD has said this in a meeting. Also, this is the only channel to have the 9700X or the 9600X beating the 7800X3D
It would be awesome to start including DAWbench in the benchmarks so music producers and audio engineers can better understand how new CPUs compare. A lot of audio production relies heavily on single-core processing, so my guess is the new AMD chips would fare really well
I find it refreshing that AMD is still able to have some gains while massively gaining on efficiencies, while the likes of Intel and NVidia only achieve increases in performance with increases in wattage. Like 250W for a CPU and 450W for a GPU is uncalled for and wasted electricity.
Anyone else notice the fuzz on his right shoulder throughout the video? I couldn't unsee it as I watched, until it vanished around 8 minutes in. AND IT CAME BACK AT 8m34s