The main reason the NUC is better: it doesn't have a soldered SSD that's forced to degrade artificially over time🤣🤣🤣 You can replace the SSD, RAM and GPU whenever you want in an Intel NUC!! Until Apple stops artificially degrading their products, the PC is superior!!
Seems to be not even close... The Dragon Canyon NUC 12 DOES seem like much better bang for the buck, as the reviewer said. And it's future-proof. Need a faster CPU? Pop in a new compute element with a 13th gen Core CPU. Need a faster GPU? Pop in an RTX 4000 series GPU when they come on the market soon (or use an external GPU over TB4). Need more RAM or storage? Just add them. I see no reason why a creator would choose the Mac.
"I see no reason why a creator would choose the Mac." You need to be one to see/understand why. There are many reasons, including not having to juggle PC hardware, waste time and money on upgrades, fix Windows issues, etc. Most creators just want to buy a computer, plug it in and work. And that's how Apple works. And the hardware is very powerful and fast, despite the useless benchmarks done with "the NUC must be the winner" in the mind of the Tech Notice owner. If you want a fair comparison, look elsewhere.
I would get the base Mac Studio with the M1 Max at around $2,200 and I would also buy the NUC 12 at $3,100, so I can have the best of both worlds. I would never buy the Ultra because I think there are design flaws in the Ultra chip: it never gets enough power sent to it to max out. I don't think the M1 Max has that flaw. The M2 Apple chip may correct the design flaw in the Ultra.
I just bought the Mac Studio Max, ver 2. After 30 years of PCs I finally gave in. My reasons: well, I already have an iPad, iPhone, Apple Watch... so, ecosystem. I no longer use the PC for gaming, and my Threadripper with its 3080 Ti burns electricity like a beast; my first electric bill after swapping was £75 lower in a month! Electricity is not cheap any more, especially in the UK.

My primary use is video, music, etc. I could probably have managed with a Mac Mini, but I like a 'snappy' performer so went for something with decent performance. Re: upgrades to the base model, I went for the 1TB internal drive and doubled the memory to 64GB. The drive for its much faster speed, and it was a reasonable price. The memory was expensive, but I run three monitors and often have a lot of apps open.

The obvious disadvantage is not being able to swap parts, but leaving aside the processor/memory, storage is either NVMe on USB-C sticks (almost as fast as internal storage) or an Asustor NAS used for storage and as a Plex server. So the parts-swapping issue really comes down to replacing failures; to cover this I just bought AppleCare+, which for the Max was circa £60 a year, renewing every year.

All the monitors came straight from the PC and all seem OK (no resolution issues); they are all 4K. I ditched my huge LED fest of a Razer mechanical keyboard and mouse and went for the Logitech MX Mechanical for Mac and the MX Anywhere 3 for Mac (I'm left-handed). So far I am loving the new toy, and I have a much cooler (the PC heated the top floor of the house when encoding video), quieter and more spacious workspace, and as mentioned, much lower leccy bills.
This is why I lament the death of the Hackintosh: I just can't find it in me to accept Apple's direction and outrageous business model, no matter how impressive that silicon is. But man... I just like macOS so much more than Windows, not to mention all the apps I can't port over... Talk about a shit sandwich, hold the bread.
Comparing an expensive, closed-Apple-ecosystem-locked, throwaway silver box with only a 1-year warranty that cannot be upgraded or repaired to a NUC where you can do pretty much whatever you want and run any OS you want, including chucking in a GPU with its own 4-year warranty. You have to be nuts to buy Apple.
Note that Apple uses flash memory with no onboard controller, meaning no internal upgrade path. My opinion is that it's a waste of space as a work tool; it is basically an iPad in a desktop kind of form factor.
So the best thing in a Windows PC is the GPU, and when the GPU is superior that's great; but when it comes to CPU performance, general speed, noise, heat and power consumption, where Apple is far superior, it's not a big deal? 🤔🤔🤔
I think that is the whole deal with Apple Silicon: the difference is on the CPU side, where Apple's 5nm ARM-based (RISC) silicon outperforms the much bulkier Intel CISC architecture. On the GPU side, though, the differences are not that big, or none at all, since this is a completely different philosophy. There may be an advantage with regard to the single, shared RAM, but that advantage is only there while there is enough RAM for everything. We bought an iMac M1 with 16GB, and just an hour after first boot it had already used more than 8GB, without having loaded anything other than a few web pages and tweaked settings. This here is a great test that shows the "everything on a single chip" approach is not as dramatic a gain as I believed it would be. But the low power consumption is cool (no pun intended: I think I have heard that the Ultra can actually get quite hot, or at least it needs much heavier cooling).
It is easy: no M1 Mac can run the industry-standard CAD software that big, biiiig companies use... The M1 is not pro at all until they solve that. Why only movie/photo editing software? So only YouTube workloads are pro? :P
@@johnforde7735 How is that a problem for the CAD software companies? Every engineer/CAD user uses the hardware that runs the software. Everything in the world is designed in CAD; that's a big piece of the pie Apple is missing out on by not approaching those companies.
@@Pillokun Yes, lazy CAD and architectural software companies seem content with only supporting Windows machines. I guess they have old codebases and aren't interested in making cross-platform software. It's up to them. Apple have nothing to do with this. Do you think that Microsoft approach CAD companies? They don't.
@@johnforde7735 Microsoft doesn't need to approach them because it's an x86-compatible platform and has been adopted since ancient times :P And yes, companies usually don't update their stuff until they know everything works without any major kinks. Imagine working on a project and, because of a bug or compatibility issue, the project is lost/scrapped and tens of thousands of jobs might be in danger. And to be honest, those software suites are buggy enough as they are now :P It is in Apple's best financial interest to get that working on M1 Macs. It's a massive business where everything needs to work as flawlessly as possible, but I guess Apple doesn't have that much faith in their own hardware/software. Let's be honest here, movie/photo editing is not on the same level of seriousness as engineering.
@@Pillokun Yes, I know why Microsoft doesn't approach them. "Imagine working on a project and because of a bug or compatibility issue the project is lost/scrapped and tens of thousands of jobs might be in danger." That's just not reality. Look at Adobe, AutoCAD (to an extent), Microsoft, DaVinci, etc. They all make software that works on all platforms. It's not actually difficult, but it takes a certain amount of effort and will to acquire a cross-platform capability. "Let's be honest here, movie/photo editing is not on the same level of seriousness as engineering." Not sure what point you are trying to make here. Engineering software and video editing software are both productivity tools. Both have to be reliable, or the people using them won't be productive. It's not in the same category as software that flies planes; flying planes is serious. Also, it's not any more difficult to make video editing software cross-platform than engineering software. It really makes no difference. And if you need the software to be more reliable, you adopt the best CI/CD and QA practices, the same as when you don't have cross-platform software.
Comparing a $5,000 machine to a $3,000 machine is not really fair or logical. You could put together a $5,000 PC with a 12900KS, a 3090 Ti, and Gen 4 SSDs running at 7,000MB/s, and call that a fair comparison, a comparison I'd like to see. But this video is like saying your Tesla is faster than a Toyota; well yeah, but it cost two-thirds more. You gotta do a dollars-to-donuts review, dude.
I actually agree with you, except I would say this comparison is like the Tesla Cybertruck vs a Toyota truck. The Toyota costs half the price of the Tesla but can do basically everything the Tesla truck can do performance-wise, if bells and whistles aren't important to you; that's what this video is saying. If he did a dollars-to-donuts review, the desktop Macs would lose every single time. It wouldn't make an interesting or surprising video. Mobile, on the other hand, is another story.
Who on Earth is going to start learning video editing on a $5,000 Mac Studio? Anyone considering the Mac is already going to have a preferred video NLE. That cop-out for not benchmarking in Resolve is weak sauce.
Why does no one compare audio production? It's always video, video, video... There are millions of home audio geeks out here. The Extreme appears to run at 2.5GHz?
Next up: Mac Studio vs i9-13900K Raptor Canyon 🤩 But when do we see that video? 🤟😅 And will the 13900K thermal throttle? Thanks for all these great videos.
It really depends on the use case. If you are a video editor using ProRes codecs, any M1 or M2 solution will beat just about any PC out there, especially in the same price class. I use both, and I love both for their own advantages, but I'm a computer/software engineer. I really appreciate the built-in Unix-like subsystem (BSD core) of the Mac with a really robust UI on top. For any software development that doesn't require virtualization or Docker, I highly recommend Macs, especially for anything mobile. Macs make better laptops. I used to really favor Macs since I once could run all major OSes (Linux, Mac and Windows) on the same system; this is no longer the case (technically you can, but Windows on Arm is a cruel joke). Now my laptop is a Mac, and I use a desktop PC (custom build) for everything else.
That's the review I've been waiting for, because I want to buy a new mini workstation. Where would you put the normal Mac Studio, performance-wise, compared to the NUC?
Well, technically the NUC is a "mini PC", not a "workstation". A true workstation would be a dual Xeon or a Threadripper Pro, which would easily double those performance numbers in every single aspect and can be upgraded to any storage size you want, multiple GPUs, etc. However, a true workstation would be more expensive than the Mac Studio, though not by much.
@@LuciusGraciusMaximus Well, it depends on what you plan to do. If the power draw is half, but it takes five times longer to complete the same render (as happens with GPU-intensive applications), then the Mac Studio needs much more energy than the PC. But of course it depends on what you do with it. For simple tasks it is more efficient.
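The point above comes down to energy being power multiplied by time: a lower wattage doesn't guarantee less energy used if the job takes longer. A minimal sketch of that arithmetic (the wattages and render times here are made-up illustration numbers, not measured figures for either machine):

```python
# Energy (kWh) = power (kW) x time (hours).
# All numbers below are hypothetical, for illustration only.

def render_energy_kwh(power_watts: float, hours: float) -> float:
    """Total energy a render consumes, in kilowatt-hours."""
    return (power_watts / 1000.0) * hours

# PC: draws twice the power but finishes the render in 1 hour.
pc_kwh = render_energy_kwh(power_watts=400, hours=1.0)

# Mac: half the power draw, but (per the comment's scenario)
# the GPU-heavy render takes five times as long.
mac_kwh = render_energy_kwh(power_watts=200, hours=5.0)

print(f"PC:  {pc_kwh:.2f} kWh")   # PC:  0.40 kWh
print(f"Mac: {mac_kwh:.2f} kWh")  # Mac: 1.00 kWh
```

Under these assumed numbers, the "more efficient" machine uses 2.5x the energy for the same finished render, which is the commenter's point.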
I am actually astonished that the not-everything-on-a-chip Intel system can be so competitive in speed. I know the power comes at a much higher wattage cost, but this thing is serviceable and to quite some degree upgradable, and I always believed that the replaceable/discrete RAM and drives would hurt the picture much more. I was considering the Mac for development and simulation work and just did not want to go for a huge tower again. But this NUC Extreme is smiling at me. Almost the same performance at a much lower price and... much more repairable.
Did you get a NUC? I'm a Mac baby and am starting to rethink my life choices since their silicon launch. Surely the best of both worlds is to Hackintosh the NUC for everything except AutoCAD and Counter-Strike? :)
@@poonholder5643 I finally bought one. I just waited for prices to come down a bit; for months you could not even get one. Extremely satisfied. I will never regret having chosen this over a Mac Studio. For now I run it with 64GB RAM, two 1TB SSDs, and an MSI NVIDIA 3060 card. Super speedy: 3D models flip around as if it's nothing, all apps load extremely fast, switching between them is swoosh. All of that for less than the Mac Studio's starting price. And I can connect 5 or more monitors (not that I need to). The RAM is maxed out, but I can add one more SSD, and I can swap the SSDs and the graphics card for even more capable versions. And while I own a few Apple devices, I do love Windows for productivity and coding. I even like Windows 11.

The only crappy experience was the initial installation. Windows 11 did not know this system and did not have working drivers for the network adapters. Neither LAN nor WiFi would work, but Windows installers expect you to join a network at one point. So if you do this, make sure to prefetch the drivers from Intel, place them on the installation USB stick, and hopefully be able to force Windows to look for them at the right moment. Alternatively, you need to put the installer in "jump over network" mode and install them manually later. If you are not familiar with this, best get help; otherwise it's super frustrating. How Microsoft still hasn't found an elegant way around this is beyond me. Intel is one of their main partners, and there should be a basic network mode just to get past that point, however modern and unexpected the hardware.

But... once this is overcome, the NUC is an extremely cool and potent device. Oh yes: after shutting down, the headphone output gives a nasty hum with my amplifier. I will need to figure out if I can avoid that.
That's when you sell your old Mac for a decent price (because Macs keep their value) and buy a new one. And let's face it, you'll buy a new NUC at that point as well because you can't upgrade the CPU in your old one. The problem with your NUC, is that you won't get much money for your old one.
@@johnforde7735 Actually, that's not a reason to sell and buy a new one. I should be able to easily add a new drive (Apple has space for two) and/or upgrade the existing one for more storage. My PC has plenty of space to put in new drives, and I can upgrade each of them as time requires. I don't need a single chain of external storage. And the only component that would need replacing in the next 3 years (way longer than the time required to upgrade/get more storage) is the chip, since Meteor Lake will be replaceable in a NUC, whereas the SoC in a Mac is not.
@@FEGTTTSDH And that use of external USB SSDs brings back the dongle chain and can cause bottlenecks. As for Windows updates, I have no issues using Windows 11. I also get consistent updates from Apple (I use both an iPhone and an iPad on the go).
I don't think this is true. The difference in power is maybe two light bulbs' worth when running at full power. You would have to have very high power costs to spend an extra €500 in a year.
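A quick sanity check on that claim, with assumed round numbers (the 200W gap, 8 hours/day of full load, and €0.30/kWh price are illustrative guesses, not measurements): even at heavy daily use and high European prices, a couple of hundred watts of extra draw lands well below €500/year.

```python
# Rough annual cost of a power-draw difference between two machines.
# All inputs are assumed, illustrative values.

def annual_cost_eur(extra_watts: float, hours_per_day: float,
                    price_per_kwh: float, days: int = 365) -> float:
    """Cost of the *extra* energy one machine uses over another, per year."""
    extra_kwh = (extra_watts / 1000.0) * hours_per_day * days
    return extra_kwh * price_per_kwh

# ~200 W difference ("two light bulbs" of the old incandescent kind),
# 8 hours a day at full load, at €0.30/kWh:
print(round(annual_cost_eur(200, 8, 0.30), 2))  # 175.2
```

You would need roughly triple the usage hours or the kWh price before the gap approached €500/year, which supports the commenter's skepticism.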
With the price per kWh in Europe right now, the Mac Studio's performance per watt is so superior it's not even funny... and if you take into consideration the malware OS called Windows, the choice is clear...
Don't question the legitimacy of the sanctions which are really hurting Putiputes and his (strongest of all currencies this year) Rubble! Whilst us wokies scurry around paying .30+/kWh and debate whether a woman has a clitoris or the price gap between two compared electronics is acceptable.
It's really very simple. If you like MacOS, you're going to buy a Mac. If you like Windows, you're going to buy a PC. Doesn't matter who is faster, prettier, quieter, more efficient. It ALL boils down to your OS preference.
Depends what you do. I was a Windows user, but when I tried an M1 Mini, the snappiness and the dead-silent noise level (I am a hardcore video and music person) were a massive upgrade... I miss games, but with GPU prices at sky level six months ago, it didn't make sense to upgrade my previous i7-8700K.
@@gardriel8208 I can agree with your point, but Apple is too expensive to "try". I do have an iPad as well and I enjoy it, but as far as computers go? Never.
Always great content. Sir, I got a Speedometer 2.0 score of 288 on my 12600K using Chrome, and 278 in Edge. My big PC is so much better. You have suggested the 12600K to a lot of people. I can't believe I own such a powerful PC at such a low price. I am very, very happy. Thanks a lot, sir. What will happen when Nvidia launches the RTX 4000 series with far better optimisation and new encoders and decoders, and AMD's new CPUs and Intel 13th gen arrive too?
@@thestroller7142 The video should be about the NUC 12 then, but I suspect it would not get as many views. The Mac Studio is a legitimate product that has people very interested; no amount of self-denial can change that. The PC is NOT "better", it's now a matter of preference, which is a huge score for Apple.
Although the Mac GPU is slower, it is worth noting that the unified memory provides a significant advantage for rendering heavier scenes. Having just purchased an RTX 3080 with 12GB, I find it pretty easy to fill that VRAM just doing high-quality product shots that employ a lot of large textures, or CAD geo with high poly counts, displacement, etc., especially as the same card is being used to drive the monitor and other system functions.
A little thing here. Power consumption in desktops does make a difference in certain situations. Power equals heat. I am in Las Vegas. It can get over 29°C ambient temperature in my apartment; this is not an exaggeration. I can use A/C to lower the temp to the low 20s, but my electric bill quadruples. The body acclimates to the environment. The first year in the desert, I never left my apartment. Being from England, you would probably be the same. This last year I had a business where I worked outside all day. Between fans and my body being fairly acclimated to the heat, I can keep my apartment in the high 20s and be comfortable. In this environment an i5 will work, an i7 is fairly iffy, an i9 probably not. I am considering a Studio because of the low heat output. The price really makes me think, though. The Ultra does look better than the NUC, but does it look $2,000 better?
At the moment you'll find the 48-core M1 Ultra performs similarly to the 64-core, but as the software gets optimised (and it's rolling out quite well) the 64-core will show its value.
One point nobody mentions is resale value in, let's say, 3 years' time. A PC resale could fetch you 30%-ish if you're lucky enough to sell it; a Mac, on the other hand, can fetch up to 70% and is usually extremely desirable on the second-hand market. You lose more on a PC in the long term... just my own experience. And I have to admit, having used both, the PC OS is much more archaic looking and feeling. That's my biased opinion☺💜
Hey just curious. Do you have an update? One year later has BMD resolved the noise reduction issues on Apple Silicon? Is it an optimization problem or just a raw GPU power issue?
@@paisenpaisen The point is that the whole AM4 socket, along with its memory, will be replaced in a few months. PCIe 5 is also around the corner and DDR5 prices are already falling. With AM4 you are locking yourself in with no upgrade path.
@@theTechNotice By the same token, are people really going to upgrade the CPU after just one generation? The real magic is after two or three (or four!). But I guess if you make a lot of money with the thing, you would. Intel motherboards don't usually last more than two generations.
Wow! As soon as you hear that more, older, slower and fewer modern, fast ports = better, you know there's bias.

Blender 3.3 Alpha shows how suboptimal the Blender 3.1 benchmark is, and the M1 Ultra at full tilt should be pulling 180-200 watts, not 73 watts. Why the non-native Photoshop and not the native Lightroom PugetBench? Too good? The point about chasing the hardware to support the software is right, except when the other platform provides a different library; then you owe it to yourself to look again. Now try the SSDs once encrypted. At the moment (thanks to suboptimal software) a 48-core GPU Ultra will be more comparable with the NUC at a much closer price point.

Comparisons are tough: most cross-platform synthetic benchmarks are still measuring discrete component performance (CPU OR GPU), whereas M1s are unified (CPU AND GPU AND media codecs AND ANE AND AMX), so the benchmarks will always show a partial truth. I'd like to see unified benchmarks (like the Affinity Photo combined score), which you can then limit to a specific component if you must. That tells a very different but more accurate story.
… and this is the first generation of Apple Silicon! And it's an integrated GPU. Technically, Apple Silicon could use a dedicated GPU in future generations. X1?
Apple cut ties with Nvidia, so I doubt they'll support Nvidia GPUs and drivers. I also doubt they will support AMD's next-gen GPUs; they will use their own in-house GPUs for profit rather than giving it to AMD. As far as performance is concerned, I doubt Apple will be able to keep up while the GPU is integrated with the CPU. The 3090 is faster, and the 4090 and RX 7900 XT are going to be even more so.
@@whitehavencpu6813 This first generation Apple Silicon don't have dedicated GPU just to prove a point. I am pretty sure the version that goes with the Mac Pro will be able to support external GPU. Remember, this is gen 1. First iteration! M1 and M1 Ultra are basically the same in single core!!
@@nyambe The question is which GPUs will they support? Nvidia is done with Apple, so no hopes of plopping an RTX 4090 on the next Mac Pro. What about AMD? AMD divided their lineup into CDNA and RDNA for productivity and gaming respectively, with each line having their own separate drivers - I don't see Apple moving in to support RDNA as Apple seriously isn't a gaming focused brand, as for CDNA it is pointless even if they do support it, because it cannot game. The only thing left is for them to make their own line of dGPUs. Don't expect them to be anywhere close to the performance of AMD and Nvidia's next gen cards, at least where gaming and rendering is concerned. Most of the advantage that the M1 ultra's iGPU got, came from it being closer to the processor and its gigantic unified memory. Moving to a dGPU setup could negate these advantages completely.
@@whitehavencpu6813 When a new technology disrupts the market, it's called a paradigm shift. Examples: Tesla vs gas cars, the iPhone vs BlackBerry, or mirrorless cameras vs DSLRs. When this happens you can't really predict future performance based on past technology. A car is NOT a faster horse. Apple Silicon is a paradigm shift; very soon we will see things that traditional GPUs won't be able to do because of thermal limitations.