I imagine that the mini PC has some sort of magical curse that makes it slightly suboptimal for everything, from 8-bit Snake to fully raytraced 8K gaming
It's probably Windows. I have a laptop with an Atom CPU and it's horrible to use with Win 10. I swapped it to Linux and, running just a window manager instead of a full desktop, it's a much better experience.
I'm running a telescope rig with it. It's 12 V, super low power, and I can RDP in from the comfort of my warm desk. Sadly I did need to upgrade to an NVMe drive in order to grab raw video from my planetary cameras.
Just wanted to say that I have become increasingly appreciative of your journalistic selection and demonstration of the "budget" range of technology as I have gotten older, especially now that I have to pay for things like rent. While so much of tech journalism seems to be focused on the next, latest and greatest, it's important to highlight the mid-range that covers what most people will ever need or use. Your demonstrations, like the video editing on the $100 PC, are great at proving why these lower-end systems matter.
As a quick example, I accidentally broke my ~$1000 phone and picked up the Samsung A03 from Best Buy for $60, and I swear to god I am never buying a more expensive phone again. While the camera may be shit, there isn't any app it can't run smoothly, besides maybe super graphics-heavy games, which I don't see as necessary for my phone to play anyway.
@@1637ty Yeah, the Samsung A-series phones are bloody great. Whenever I suggest someone get one of them instead of a high-end phone they laugh at me. Got my A50 on eBay for $100 AUD and an A42 on Facebook Marketplace for $150 AUD after I broke the screen on my A50.
@@1637ty If it wasn't for the ease of use of iOS, I would also go for a relatively cheap Android. Because you're right: aside from the poor cameras that are generally typical of cheaper Androids, the phones themselves are usually fine, and relatively fast, or at least as fast as I'd need. Most people who aren't stuck to their phones all day do not need a $1000 phone, at any point.
@@1637ty The only reason I now have a ~€200 phone instead of an ~€80 phone is Android Auto; it was just unusable on the budget phone. And I agree the cameras on budget phones are not great, but if I cared I'd buy an actual camera.
@@chasethefeel Imo nobody should be a fanboy; they should be a realistic fan. You might like AMD or Intel and there's nothing wrong with that, just don't bash the other brand when they're doing well.
CPUs have improved so much in such a small amount of relative time. From the R7 2700X, to the R9 5950X, to the i7 12700K, to the R7 7800X3D, there was always a noticeable improvement in game performance. And going from the 2700X to the 5950X was night and day in multithreading-heavy tasks. In all that time, though, I only ever upgraded the graphics card once: from a GTX 1080 Ti 11 GB to an RTX 4080 Super 16 GB just last month.
Remember that era of i5-2000 to i5-7000 series when the processors didn't really change at all? AMD threw a hell of a wrench into the system, from affordable 8 cores with the Ryzen 1700 to consumer-grade 16 cores with the Ryzen 3950X. I can't wait for the 8000 or 9000 series of CPUs, I'm hoping for a performance uplift good enough to justify moving to DDR5.
@@2kliksphilip I agree, there should always be some sort of fight between tech companies to make better and better chips and technological advancements, because if there isn't one, then the rate of advance is going to slow down
@@2kliksphilip There was some innovation, just basically none of it went to the mainstream/performance desktop market; it was all about power efficiency for mobile and core count for server. For example, Xeons went from 8 cores per socket to 22 cores in 4 years (2012-2016), at basically the same TDP rating. Intel just decided to do nothing in the mainstream market because AMD wasn't a threat.
@@lookitsrain9552 And because of that, mid-to-late 2010s Intel HEDT stuff is still totally serviceable, just at a higher TDP and with fewer features than modern stuff. With the way things are going it may not stay that way for long, but it's still cool. That's where all their innovation went until Ryzen came around, especially Zen 2.
I'm really a fan of these CPU and GPU videos Philip makes, because he does them from a normal consumer's perspective, instead of just wanking on the latest and greatest, most expensive things like many dedicated tech channels. Actually, one of these videos was the reason I switched to AMD in 2018 and I'm really happy with that choice still.
What you were experiencing on the N100 was the fact that CPU utilization can go above 100%; tools like HWiNFO show it correctly, while Task Manager does not.
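(For context on what these tools are measuring: classic utilization is time-based — busy time divided by elapsed time — which by construction tops out at 100%, while frequency-aware counters can read above 100% when the CPU boosts past its base clock. Below is a rough sketch, assuming Linux and its /proc/stat counters, of how the plain time-based figure is typically derived: sample the aggregate counters twice and take busy time as a fraction of total time. The 1-second interval is just illustrative.)

```c
/* Minimal sketch of time-based CPU utilization on Linux: sample the
 * aggregate "cpu" line in /proc/stat twice and compute the fraction of
 * jiffies that were not idle. Note the result can never exceed 100%. */
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

static void read_stat(unsigned long long *busy, unsigned long long *total) {
    unsigned long long user, nice, sys, idle, iowait, irq, softirq, steal;
    FILE *f = fopen("/proc/stat", "r");
    if (!f) { perror("fopen /proc/stat"); exit(1); }
    fscanf(f, "cpu %llu %llu %llu %llu %llu %llu %llu %llu",
           &user, &nice, &sys, &idle, &iowait, &irq, &softirq, &steal);
    fclose(f);
    *busy  = user + nice + sys + irq + softirq + steal;
    *total = *busy + idle + iowait;
}

int main(void) {
    unsigned long long b0, t0, b1, t1;
    read_stat(&b0, &t0);
    sleep(1);                                   /* sampling interval */
    read_stat(&b1, &t1);
    printf("CPU utilization: %.1f%%\n",
           100.0 * (double)(b1 - b0) / (double)(t1 - t0));
    return 0;
}
```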
I had a relatively new laptop that was around ~$300; its boost clock was 2 GHz and it had an HDD. I used to play CS:GO on it and would get like 25 fps on Dust 2, so I would play the training map and try to beat the high score, since I got 40 to 56 fps in the training ground. @@2kliksphilip
What's the point of using percentages above 100%? I guess it better illustrates the "boosting" behavior, which can only be sustained for a limited time?
I had a $300 laptop with an Intel Celeron 2-core processor; its boost clock was 2 GHz and I got around 15 to 25 fps in CS:GO at 640x480 resolution. @@2kliksphilip
@@tomallo99 It can happen for a few different reasons. One is that monitoring software reports over 100% usage because multiple cores (whether physical cores or threads) are being counted together; another is that a single-threaded application is fighting for CPU time; another is that a nominally single-threaded application is actually utilizing multiple threads, increasing throughput.
Short answer: CPU utilization is incapable of going over 100%. Operating systems and HWiNFO are only measuring the time CPU threads spend dedicated to tasks (plus drivers and the operating system itself). The additional performance counters just let you see how well your program utilizes cache and how often the CPU is able to correctly guess the result of branching execution paths.

Longer answer: Yes, a CPU has more than one ALU. CPUs execute multiple instructions in one clock cycle, and the code you run might not keep all of them busy. So yes, it is possible for certain tasks to make the CPU crunch harder. Your code can also make the CPU hit branch misses less often (certain types of branching have lower penalties, and you can hint to the compiler which code paths are much more likely to be taken).

Even longer answer: The N100 can only use single-channel DDR4-3200 or DDR5-4800 MT/s memory kits, though it does at least have a video decoding block. Demanding tasks such as video editing let you divide the data being crunched into smaller blocks, so that the data fits into the CPU's L1/L2/L3 (e.g. color correction can be done in blocks of 64x64, which fits into a 16 KB chunk of L1 cache). Multiple effects can then be applied at that 64x64 px scale, reducing the number of transfers from memory to cache while processing the data (yes, even the N100 is going to be bottlenecked by slow memory).
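(To make the tiling idea from that "even longer answer" concrete, here's a rough C sketch: the frame is walked in 64x64 tiles — 64×64×4 bytes of RGBA is 16 KB, small enough to stay resident in L1 — and every effect runs over a tile before moving on, so the second pass reads from cache instead of RAM. The effect functions are hypothetical placeholders.)

```c
/* Sketch of cache-blocked (tiled) image processing: run all effects on
 * one 64x64 tile while it is hot in L1, rather than streaming the whole
 * frame from memory once per effect. */
#include <stddef.h>
#include <stdint.h>
#include <stdlib.h>

#define TILE 64

static inline uint32_t effect_a(uint32_t p) { return p ^ 0x00FFFFFFu; } /* e.g. invert RGB   */
static inline uint32_t effect_b(uint32_t p) { return p | 0xFF000000u; } /* e.g. force opaque */

void process_tiled(uint32_t *img, size_t width, size_t height) {
    for (size_t ty = 0; ty < height; ty += TILE)
        for (size_t tx = 0; tx < width; tx += TILE) {
            size_t ymax = ty + TILE < height ? ty + TILE : height;
            size_t xmax = tx + TILE < width  ? tx + TILE : width;
            /* First pass pulls the tile into cache... */
            for (size_t y = ty; y < ymax; y++)
                for (size_t x = tx; x < xmax; x++)
                    img[y * width + x] = effect_a(img[y * width + x]);
            /* ...second pass reuses it while it's still hot. */
            for (size_t y = ty; y < ymax; y++)
                for (size_t x = tx; x < xmax; x++)
                    img[y * width + x] = effect_b(img[y * width + x]);
        }
}

int main(void) {
    size_t w = 1920, h = 1080;
    uint32_t *img = calloc(w * h, sizeof *img);
    if (!img) return 1;
    process_tiled(img, w, h);
    free(img);
    return 0;
}
```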
Something you didn't mention that is becoming increasingly important in MT workloads is figuring out where an application stops really scaling with more cores, and which cores you should use at that count. The main thing I do on my personal machines is software development, and with the languages I use now I only ever really use 14c/28t at once. At that thread count my two options are really the 14700K or the 7950X, and while the AMD might have lower performance when all cores are saturated, I'm not going to be saturating them. In a 14-core MT test those full Zen 4 cores are going to outpace the E cores by a lot, making AMD the better option. For a lot of professionals, the advice of "get whatever has the better MT score" is quickly becoming hard to agree with; software usually doesn't scale linearly, and I really wish tech media would mention it more.
Just as I was about to watch this video I remembered your Klik Empire video, and I just wanted to say that I really enjoy your CPU videos and have for as long as I've been watching them (including the bygone-era advice, because they're fun). I hope you enjoyed making this video, because I would love for you to continue making more like it. You're one of the big creators I watch where I'm actually excited to klik and see when you have a new video. Have a good day chap. -your fan
I used to be an NVIDIA and Intel "fanboy" for many years. Last year, though, I got myself a new gaming PC with the Ryzen 7 5800X3D and the 7900 XT, and I absolutely love it.
This really is like Intel's Bulldozer, but as if Bulldozer were actually competitive. What you got from Bulldozer back then was:
- technically more cores (treating threads as cores, IIRC)
- higher clock speeds
- higher power draw and more heat
Now with Intel you get:
- technically more cores (through low-power efficiency cores)
- higher clock speeds
- higher power draw and more heat
Seriously, the parallels are uncanny when you think about it.
Thunderbolt 5 support in CPUs is coming too, which makes external GPUs very viable now for many tasks that are decently optimized. Most tasks avoid GPU-to-CPU back-and-forth communication anyway because of the latency; most tasks are a throughput problem, and Thunderbolt 5 has reached very viable throughput imo. It very much shrinks the gap between laptops and desktops even more.
3:20 Ehh, I still prefer AMD by a long shot. My opinion hasn't changed over the last few years. In my country, a 7800X3D costs less than a 14700K and performs better in games while using less power and running cooler. Here, at least, you'd be insane to go with Intel.
@@2kliksphilip I didn't. In fact, I had an Intel CPU when the Ryzen 1000 series came out. You didn't have to take what I said so literally; I just meant it as a general, casual statement. But if you must know: since the 2000 series. I've been building PCs for more than 15 years and have switched between Intel and AMD over the years. I'm just saying I still prefer AMD nowadays when it comes to the business side of things, performance, and pricing in my country. You don't have to be so defensive about it: I did not prefer Ryzen 1000 over Intel's 7000 series.
I love it when someone just shares their passion for computer tech. I started my PC with a 6700K from Intel, which was great at the time, then held onto it for probably four years before upgrading to the 5900X from AMD; a MASSIVE leap in performance for rendering, at a relatively fair price. Now the market is so much more interesting for finding hardware that suits your needs, ranging from brute-force power to efficiency and punch.
My pfSense router is hosted on an SFF PC I bought from a local government for $70. Still doing its thing five years later. It's amazing the value you can get on the used market.
Thank you for mentioning the 3100! I got one for $80 back in mid-2021, paired with a 1650, and let me tell you, that cheap CPU was more capable than I gave it credit for. I upgraded to a 5600 I got for $150 and the difference was noticeable immediately, yet deep down I still marveled at how incredible it was to get such a competent CPU for less than $100 in that market, and it ran so cool with the stock Wraith cooler, never failing to reach max boost clocks. I sold that PC with the 3100+1650 to a friend at a steep discount; she works and games on it, it meets all her needs, and it will fulfill its duties for years to come without a hitch.
Honestly this video really makes me hope that the next generation of Intel Arc ends up being their Ryzen moment, because the current GPU market absolutely sucks (as you mentioned), and with the recent AI boom, Intel has a lot of motivation to get their GPU division up to snuff.
No matter how powerful CPUs get, my software bosses will keep saying "don't spend any time on performance, hardware is just getting better anyway", and software will keep getting slower and worse.
As someone who's not that tech-savvy and is thinking of building a new PC, this video has helped me catch up on a lot of things, especially CPUs. It was a godsend for me. Thanks!
Don't know why, but I always enjoy hearing you just chatter about hardware. Even if it isn't the most information rich, sometimes I do learn something new and see a different perspective
I love hearing people ramble about technology, my favourite genre out there. Also, could you kindly tell me the name of the background music you used? That kind of music goes perfectly with tech discussion vids like this; reminds me of the 2010s, good times.
I tend to make it a point to watch all new vids uploaded across your channels. I am halfway through this, trying so hard to gaslight myself into thinking I care enough to finish it. I can do this.
I bought a 5800X3D a few months ago. Despite the 7000 series / 13th and 14th gen being out, I still went Zen 3 since I still had my old B550 motherboard, and it's been great! I don't feel like I'm missing out on much with the faster, newer gens, but after building my friend a 13900K system I can tell just how beastly the new Intel CPUs are! (My last Intel product was the i7 7700K.) This market is perfect timing as well, since I'm looking to upgrade my old gaming server, and the Ryzen APUs and even the i3s of this gen are pretty capable in terms of gaming performance... Now if only NVIDIA and AMD would come down from their high horse on pricing in the GPU market... Great video as always Philip!
I upgraded from a Haswell i7 to the Alder Lake 12700K. Extremely overkill CPU, totally worth it, fast shader compilation and THE Intel CPU for streaming.
I still love my 5800X3D. That extra cache really shows when you try it in games: it has a huge effect on 1% lows, which has eliminated microstutters entirely. When Intel starts making affordable high-performance chips with a similar amount of cache, I'll consider them an option.
Ever since I built my new PC I stopped looking at PC hardware for a while, but every once in a while I love to watch your videos talking about this stuff.
2:04 That's because the tasks you expect to be "easier" are actually harder on the CPU than you thought, and since the CPU never wastes time sitting below 100% utilization in highly demanding tasks, it ends up above your expectations. Easy.
I have to say, Philip, no matter what the video topic is across your multiple channels, I always find your videos very calming, and they often help me sleep. I thank you.
I've been using the 12100F for about a year now in conjunction with a 6700 XT and can wholeheartedly recommend both. I haven't had any trouble at all running games, unless it's late-game HOI4 at max speed, but even that is perfectly playable.
Have to disagree with Intel being the high-multicore-performance option now. In my experience, many high-core-count compute tasks can't properly take advantage of both P and E cores, which just leaves you with a 14900K acting as an 8-core with some additional cores you can maybe use for background tasks (CFD, FEA and photogrammetry are the big ones where I experienced this). If you use all the available cores, the faster threads just have to wait on the slower ones, and your overall performance drops significantly versus just using the P cores. That's probably also why Intel's server and workstation offerings don't seem to use E cores.
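(One common workaround for the mismatch described above is pinning the worker pool to the P cores only. Here's a rough Linux sketch; the logical-CPU numbering is an assumption — on a typical 8P+16E part the hyperthreaded P cores are logical CPUs 0-15, but check lscpu on your own machine.)

```c
/* Sketch: pin every worker thread to the P cores so no thread lands on
 * a slower E core and stalls the rest of the pool. Affinity is set
 * right after creation for brevity; an attr-based setup would avoid the
 * brief window where a thread may run elsewhere. */
#define _GNU_SOURCE
#include <pthread.h>
#include <sched.h>
#include <stdio.h>

#define N_WORKERS 16
#define N_PCORE_CPUS 16   /* assumed: logical CPUs 0-15 are the P cores */

static void *worker(void *arg) {
    /* ... compute kernel goes here ... */
    printf("worker %ld running on CPU %d\n", (long)arg, sched_getcpu());
    return NULL;
}

int main(void) {
    pthread_t threads[N_WORKERS];
    cpu_set_t pcores;
    CPU_ZERO(&pcores);
    for (int cpu = 0; cpu < N_PCORE_CPUS; cpu++)
        CPU_SET(cpu, &pcores);

    for (long i = 0; i < N_WORKERS; i++) {
        pthread_create(&threads[i], NULL, worker, (void *)i);
        pthread_setaffinity_np(threads[i], sizeof(pcores), &pcores);
    }
    for (int i = 0; i < N_WORKERS; i++)
        pthread_join(threads[i], NULL);
    return 0;
}
```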
Did I really watch a 15 minute video about CPUs despite having better things to do and not being that interested in building a PC? Yes. Yes I did. Why? Why do you ask? Well. Well simply, because I can. Because I can. Do I have better things to do? Yes. Am I really bored right now? Yes. Am I interested in getting a shiny brand spanking new CPU? Well, no. But then it raises the question. Why? Not the same why as before, but a new why. Why did I watch it? Because I can. Yes, because I can, but I could've done something else. Was I interested in CPUs before? No, not really, but I am NOW.
Pleasantly surprised to see you don't have any affiliate links in your description at all. Not that it matters, but most people making these sorts of videos seem to make them as an excuse to pad their descriptions with a bunch of links.
Built a dirt-cheap Ryzen 5700G-based Linux rig last year, mostly from used components, and couldn't be happier with it as my main system. It effortlessly lets me play Alien Isolation with just the integrated graphics and the bog-standard amdgpu driver that comes with openSUSE Leap 15.5. It compiles my custom kernels in well under 3 minutes and gets barely over 60°C while doing it (using nothing but the standard cooler that came with the 5700G).
Cool video! It's really nice watching content that's positive, rather than the constant stream of negativity that goes around. Not to say that if you were negative I'd be horrified and unsub forever feeling betrayed; that's just my mood right now.
The CPU market may be interesting now but it's potentially about to go red hot when they start shoving in NPUs. ...or not. There's no telling until we actually know if they're good or not.
I'd usually go for the cheapest 6-core, but since there's now a not-insubstantial number of games that love insane CPUs, I spend almost as much on the CPU as the GPU and don't really have much of a choice about it.
Ryzen 5 5600 here. I paired it with an MSI B450M PRO-VDH MAX, some random used 16 GB 3200 RAM, and a used GTX 1660 Super. The CPU really is a blast; so glad I went with AMD here. Runs CS2 at medium settings and 1080p at about 180-240 FPS. For less than €500 you can get a still very competitive gaming PC with this amazing CPU platform.
I will be excited about GPUs again when we can get 4090 performance for ~$800. Until then, I think APUs are pretty exciting right now. I'm looking forward to the next generation of handhelds.
I dunno man, the release of OG Ryzen was the true golden age of processors. It forced Intel to stop being as dogshit as they had been. Maybe even the release of dual-core processors, true dual core; that also was an incredible time to buy!
I hadn't been following CPU development for about half a decade until I recently decided to buy a new laptop. After looking into the state of things I was shocked; I couldn't believe how powerful iGPUs had become. It was the first time I decided to pay extra for a high-end CPU (I usually cheap out), because I just couldn't believe how powerful they are. I got a Framework with a 780M and it's ridiculous. It's about as powerful as the dedicated GPU I had in my desktop 5 years ago.
Bought my 5600X for $100 brand new, which is crazy because I bought my first CPU (a 3200G) for around the same price and got more than double the performance.
The really messed up thing about budget GPUs like the 3050 is that in practice they have about the same performance as a 1660 Super or Ti outside of DLSS, and only NOW are they matching those cards in price. That's kinda nuts. Three gens and multiple price cuts just to match what is basically a neutered 2060.
The one thing that has gone crazy recently is the price of motherboards, with features limited to super expensive boards. You used to get onboard power/reset buttons and debug code readouts; now you need to spend a fortune to get basic troubleshooting equipment.
Epic ending 😁 I ended up using an R5 4600G for a buddy of mine. He's doing mostly office stuff, and the APU is overkill on both the iGPU side and the 6-core side. Funnily enough, I would have gone for the R3 4300G, but the price difference to the 4600G was just 14 USD. Either the 4600G was heavily discounted or the 4300G was badly priced. Regardless, I felt no remorse spending the extra to get the 4600G, even though the 4-core part would have been absolutely fine.
This is going to make me sound truly mental, but I prefer AMD exclusively because I don't like Intel's split of performance and efficiency cores. It's not really overengineered, because you can't argue with the performance, but it feels too complicated for me to want one myself. Instead I'd rather screw myself over with the 7900X currently in my system, because at least I know what's up with that one lol
"They are borrowing from one another, but in different ways", I mean, patents are a thing you know... But I agree, CPUs are hella more exciting right now.
This video is exactly what I didn't need right now; I don't want to spend the money so that I'm no longer CPU-bottlenecked in every game. It would be so worth it though, thanks Philip.
4:31 Looking at avg fps is very misleading, since the bigger problem with "budget" CPUs like the Ryzen 5 in newer games is the 1% and 0.1% lows! At least in my experience!
Can confirm, about the Ryzen 5 5600: I bought a 5600X (basically the same thing) on release and it can do everything. It's crazy. The 5600G at that price is beyond superb for extreme-budget gaming or for home PCs; I'm thinking of using it for my parents' build. And the fact that the 5800X3D is on AM4 just makes the 5600 even better.
I bought an Intel i3 10100 when it was relatively new for $100 flat, and it's served me well since. I can play all the games my ancient GTX 1050 Ti can handle, I can record and edit videos, etc. Rendering the video takes a long time, sure, but the actual editing process in Premiere goes fine, so I don't care. I could have upgraded to the i5 10400F for just ten dollars more at the time, but that CPU has a way, way slower clock and no integrated GPU in exchange for two more cores, and so far history has proven me right in that choice lol. I haven't needed the integrated GPU yet, but I feel confident I will some day soon, as my 1050 Ti will probably finally die before I can afford to replace it.
Threadripper remains an unbeaten value-for-money proposition though, especially with the release of the latest gen. Intel's Xeon W is less powerful and more expensive; that might change with future architectures, but AMD isn't idling either. I agree, it's truly exciting to see REAL competition in the market. Now, AMD, please do something about Nvidia...
iGPUs are set to become incredible in the next few years. For example, the Ryzen 7 8700G with Radeon 780M graphics can beat out many, if not most, brand-new sub-$200 GPUs. Yes, it's a $300 CPU, but if your budget is tight you can think of it as a $150 CPU and a $150 GPU in one package. It also saves you from buying a 5-to-7-year-old GPU that could lose support any day now.
They don't quite beat modern sub-$200 GPUs; it's more 10- and 16-series GPUs at that price. I agree that for sub-$400 or so systems that need all-new parts, or where low power consumption or small size matters, these APUs are very impressive. I just wish they'd put the good iGPUs on the 6-core CPU too. Also, $150 GPUs can be under 5 years old, such as the RX 5700 XT, RX 6600, RTX 2060, RTX 3060, and so on.
@@jamesbuckwas6575 I guess GPU prices still vary a lot from market to market. Over here you'd consider yourself lucky for finding a bottom of the barrel 3050 for under 250€.
@@masterkamen371 You're right, I forgot about markets that have different pricing compared to the US, my mistake. For sure in those markets, APUs likely are cheaper than discrete GPUs from a couple of generations ago. In those cases an APU also makes a lot of sense to purchase, and to upgrade with a dGPU later!
@@jamesbuckwas6575 Yeah, prices are so ridiculous here that I'm going to other countries just to purchase PC components. Some GPUs are literally 2x cheaper that way.
I understand Philip never mentioning ARM processors due to the lack of availability for desktops/laptops and poor game support, but that's what I'm most excited for. The next CPU revolution is the architecture change to ARM; its lower power consumption has the potential to make computing more accessible (more performance for battery-powered devices).
Zen 4c cores have less cache because you simply don't need that much cache at lower clock speeds. In tasks other than gaming, a Zen 4c core doesn't see any significant performance boost from 32 MB of L3. AMD's small cores are just energy-efficient, space-optimized standard cores: a Zen 4/5 core rearranged to take up less space, with the features that are useless at lower clock speeds cut out.
Comparing the CPU and GPU industries is fascinating; they seem to be direct opposites right now.

On the CPU side it feels like your hardware gets outdated instantly. Someone who built a brand new PC with a 5800X3D at launch could have waited about a year and gotten a significantly faster 7800X3D instead for not much more money. No matter what CPU you buy, it feels like not long afterward you could have gotten something better for the same price, so you always feel buyer's remorse. I felt this when I bought a 5600, only for the 5600X3D to come out about a month later (I do live near a Micro Center).

The GPU market is the direct opposite. Each generation barely improves things, but that gives it a much more evergreen feel. Someone who bought a 2070 Super almost 5 years ago still has a more than capable GPU that keeps up with modern options, has modern features like DLSS, and can do frame generation with FSR 3. Honestly, it's why I hope the GPU market stays where it is. Sure, it's expensive right now, but I'd rather spend $500 on a GPU that will last multiple generations than $300 on one that will be outdated in one. People often get caught up in the right now: GPUs are so much more expensive than they used to be, but they forget that a $300 GPU from 10 years ago lasted you maybe a few years, and then you'd spend that $300 again. Now you can spend $500 and hold onto that card for who knows how long. The GTX 1080 (non-Ti) is still going strong, even being able to run Alan Wake 2 with an upcoming patch shown off by Digital Foundry; that's an 8-year-old GPU that predates Ryzen CPUs entirely. Even most of Pascal is doing well: the 1070 can just barely run Alan Wake 2 with that same patch Digital Foundry tested. That's a $400 GPU on track to last 10 years.

Edit: for context on how long Pascal is lasting, if it reaches that 10-year mark, that would be the time frame between the N64 and the PS3. Imagine building a PC in the N64 era and having it last into the PS3/360 era.
Also, a big plus for the 5xxx series is that you can use some older motherboards, such as the MSI B350 that I used for my 2600, just by updating the BIOS.
The GPU market in the last few years has indeed been so bad that I and many people I know are buying new PCs in two parts: 1) a whole new PC with mid- to high-end parts; 2) the lowest available GPU, which costs the same as or MORE than the rest of the PC. Thankfully the integrated graphics in CPUs nowadays can handle the majority of older (i.e. good) games, so this split is easy to live with without being "excommunicated" 😅
I've been researching for a possible PC build for around 500 USD and honestly I was kinda shocked at how cheap (relatively speaking) a 5600 is, considering it comes with a good-enough cooler and AM4 motherboards are aplenty. Though well-priced used GPUs have been a bit of a tough find in my country, I'm pretty excited nonetheless.
The _big_ problem with Intel's E cores is their instruction set. E cores have never supported AVX512 instructions (and possibly more that I'm forgetting right now), which can be quite detrimental in multi-core workloads that require a lot of number crunching. With Alder Lake, the P cores _did_ support it, and having two types of cores with different instruction sets on the same CPU is also really annoying to develop for. Intel completely removed AVX512 from 13th gen and is now introducing AVX10 as a replacement that's easier to implement in their E cores, making everyone jump through hoops updating code to take advantage of it again. Meanwhile, Zen 4C cores support AVX512 just fine.
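(For a concrete picture of the hoops mentioned above, here's a small C sketch of the usual runtime-dispatch pattern: the AVX-512 path is only taken if the CPU actually reports the feature, with a scalar fallback otherwise. It assumes GCC/Clang's __builtin_cpu_supports and the per-function target attribute; the summation kernel itself is just an illustrative example, not anyone's real code.)

```c
/* Sketch of runtime dispatch between an AVX-512 kernel and a scalar
 * fallback. The target attribute lets the AVX-512 function compile even
 * when the rest of the binary targets a baseline ISA. */
#include <immintrin.h>
#include <stdio.h>

__attribute__((target("avx512f")))
static float sum_avx512(const float *a, int n) {
    __m512 acc = _mm512_setzero_ps();
    int i = 0;
    for (; i + 16 <= n; i += 16)               /* 16 floats per iteration */
        acc = _mm512_add_ps(acc, _mm512_loadu_ps(a + i));
    float s = _mm512_reduce_add_ps(acc);
    for (; i < n; i++) s += a[i];              /* scalar tail */
    return s;
}

static float sum_scalar(const float *a, int n) {
    float s = 0.0f;
    for (int i = 0; i < n; i++) s += a[i];
    return s;
}

float sum(const float *a, int n) {
    /* Only take the AVX-512 path if this CPU (and core) supports it. */
    if (__builtin_cpu_supports("avx512f"))
        return sum_avx512(a, n);
    return sum_scalar(a, n);
}

int main(void) {
    float v[100];
    for (int i = 0; i < 100; i++) v[i] = 1.0f;
    printf("sum = %.1f (avx512f %savailable)\n", sum(v, 100),
           __builtin_cpu_supports("avx512f") ? "" : "not ");
    return 0;
}
```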
The problem people have doesn't seem to be with the E cores themselves; it's the lack of an option to avoid them on the high end or to replace them with 2 more P cores lol. I bought a last-gen i5 for my high-performance OpenWrt build to avoid them, since they're a pain when dealing with low-latency applications and virtualization.
The low-end market is completely ignored by gamers, so there are a lot of great deals to be had. My last purchase was a used Dell thin client with very low power consumption that handles my programming projects and my PlayStation 1 emulator in a small form factor. I would've paid twice as much for a Raspberry Pi that isn't even half as powerful.