They'll satisfy every Mac user, LOL; that was the point of every MacBook for years. The difference is that this year's Macs are even better at being Macs than all of the previous Macs, and that is not a bad thing.
@@Jo21 Yes, but these simply are not. There is a bunch of people with priorities that this device is tailored for, and they are going to be incredibly happy with it. The same way, there are people who are going to love the Asus G15 and people who will tell you that for their priorities it's complete trash.
2:54 That's a bold choice to use a '90s brick phone as a pointing device instead of literally _anything_ else. What a delightfully weird repurposing of e-waste; I fully support it.
Isn't it a fair comparison? I've seen a lot of guys "reviewing" or showcasing their M1 Max Macs as equal to or better than a desktop! A desktop which they don't even have, or a five-year-old one with middling specs! 🤣 I've also gotten comments from people who bought the propaganda and tell me it has RTX 3080 (not mobile) performance! 🤣
It's a no-go if you're serious about music production. Rosetta is literal trash for running VSTs, and more than 70% of third-party plugins are just not compatible with the M1; they glitch out and have performance issues. The audio interface sometimes doesn't show up, and if you have to change your choice of VSTs based on the machine, it's not a good machine to begin with.
@@clickbaitpro You're dead wrong about pretty much all of that. There are very, very few plug-ins that cannot run in Rosetta, and the roughly 10% hit is far less than the CPU advantage the M1 provides. NI are lagging, but NI have traditionally lagged; they're pretty much the worst at adapting to anything new. That said, Kontakt runs in Rosetta, etc.
I dunno if or how you're coaching Anthony to host these things; whatever you're doing, keep doing it. He just keeps getting better and better with every video, and he was good enough by a decent margin to begin with!
This "the answer may surprise you. It sure surprised me" thing is starting to be a distinctive mark of Anthony's videos and I like it. Love the energy!
I thought he was going to say Apple was true to their word and that their marketing accurately reflected their products; that would have been shocking.
If you want to actually see the promised performance gains: Use it for software development. Build times went from 4 minutes (2019, 16" MBP, max spec) to
I used to respect the guy, but I'm not sure what to think about him or LTT at this point. If they don't address this, I'm unsubscribing.
- ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-g1EyoTu5AX4.html M1 MAX - 32 CORE BENCHMARKS v RTX 3080 & 3090 Will Blow Your Mind!
- ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-OMgCsvcMIaQ.html M1 Max 32 Core GPU v 165W RTX 3080 Laptop
- ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-JM27aT9qhZc.html 16" MacBook Pro vs RTX 3080 Razer Blade - SORRY Nvidia..
- ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-YX9ttJ0coe4.html 16" M1 Max MacBook Pro vs. My $6000 PC
The list keeps going. These results have been out for a while too, so LTT really has no excuse. They didn't use optimized software, they didn't compare laptop performance while on battery, and they didn't use readily available GPU benchmarking software that's already been shown to place the M1 Max around the 3080 level. They need to explain.
These recent reviews with Anthony hosting are so damn high quality that I can't wait for some of the LTT Lab content to drop in the next year. It's gonna be absolutely sick.
I clicked on the video expecting very good comparisons in different scenarios and such. All I got was the Dolphin emulator and some random bench data. How disappointing.
I was wondering why the RTX had an arrowhead on its bar graph while the others were normal rectangles. Then I realized the RTX was so much higher than the others that it was being truncated just so you could even compare the other bars 😂
Yes, because it makes good sense to compare one of the most expensive desktop configs you can buy to these LAPTOPS. There are not enough eyeballs available in the world to roll for this asinine comparison…
I wouldn't doubt yourself so quickly. Max Tech did a response video to this revealing a surprising and very concerning set of anomalies in the data presented in this video, suggesting either serious issues with testing methodology or massive pro-Intel bias. An update is urgently needed from the Linus team to respond to those observations and recover lost credibility.
I'm a little puzzled about Apple's claims. On the CPU side, it makes sense: the M1 is using a much more efficient architecture than the x86 that AMD and Intel are using. But that's not how GPUs work, right?
I'm pretty sure we can't tell that much yet, since most apps and engines are not even optimised for the ARM architecture, much less this SoC… Until now it was mostly Rosetta 2 doing the work, so the results don't really mean much against other PCs.
It's also using a more efficient architecture for the GPU. Normally you'll have the CPU feeding data to the GPU, and the GPU storing it in its own memory. This is why high-end GPUs have high memory bandwidth; it's a limiting factor. These new chips don't need to do this: the memory is unified, and the CPU and GPU can share it directly. This obviously requires massive changes in the application. Right now what we can see is that the M1 Macs have very limited graphics performance because Rosetta can't use this trick; it emulates the previous architecture by copying data from CPU memory to GPU memory (which in this case are the same). This essentially halves the throughput, and that's why performance is so poor.
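For anyone curious what that unified-memory trick looks like in practice, here's a minimal Metal sketch (my own illustration, not anything from the video): on Apple silicon, a shared buffer is a single allocation that both the CPU and GPU touch directly, with no upload step.

```swift
import Metal

// Minimal sketch: on Apple silicon, a .storageModeShared MTLBuffer is one
// allocation that the CPU and GPU both access directly, so there is no
// CPU-to-VRAM copy like on a discrete GPU.
guard let device = MTLCreateSystemDefaultDevice() else { fatalError("no Metal GPU") }

let count = 1024
let buffer = device.makeBuffer(length: count * MemoryLayout<Float>.stride,
                               options: .storageModeShared)!

// The CPU writes straight into the memory a compute pass would later read.
let ptr = buffer.contents().bindMemory(to: Float.self, capacity: count)
for i in 0..<count { ptr[i] = Float(i) }
// ...from here you'd encode a compute pass that reads `buffer` as-is;
// a discrete GPU would instead need an explicit blit into its own VRAM.
```

Rosetta-translated apps can't be restructured around this, which is exactly the copy overhead described above.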
It's kind of a stepped-up version of Smart Access Memory. The M1 series is the first modern implementation of a consumer unified direct-access memory architecture. Just like how the most effective/efficient mining GPUs are really the best memory bus implementations, these unified CPU/GPU/DRAM chips are going to start eating the modular systems' lunch as long as they can get a large enough memory pool.
@@ezicarus8216 Shhhh, they might realize that system-reserved memory for the iGPU is already a thing too... x86 consoles with shared CPU/GPU memory go as far back as the original Xbox.
GPUs have way more than raw performance. For example, Nvidia's OptiX is way smarter, and therefore better for rendering, than CUDA on the same card. Think of games: RT cores are incomparably better than raw performance for ray tracing, even though they consume less die space and power.
@@cristhiantv they’ll probably say some delusional things like ”pro users always have their laptop plugged in anyway so power consumption isn’t an issue”.
@@evolicious You don't know his life, do you? Also, power consumption is important if you're rendering videos on the go… But you're probably gonna reply with something telling us how stupid we are, judging by your earlier comments… so don't bother answering. Have a good day.
I love LTT, but didn't you admit to forgetting that WoW runs natively in your first MacBook Pro vid? I suppose you forgot it again? Or are you just selecting results? Also, why step up to the M16 for this one, comparing it against a 14-inch laptop, when you didn't for the 16-inch version? And finally, how about rerunning tests that are native across all platforms, and then running the laptops through those tests unplugged to see how real-world use would look?
So if he runs WoW and it does well, does that mean the laptop is now good for gaming? Dude, put your bias aside; this test is to show this laptop isn't good for gaming, even if it does well in one game. Gamers aren't only playing one or two games. If I'm buying a laptop, it should run all apps. Is macOS a phone OS or a computer OS?
@@truthseeker6804 No, it's not a gaming laptop. But saying it can't play any games well is not true. Some people who use laptops for things other than gaming still like to play one or two games. Additionally, there are other reviews out there by PC guys with fairer comparisons. If you're going to compare something, compare like tests. Asking me to high jump against an Olympic high jumper is just silly, and so is running benchmarks that aren't natively supported across all platforms when there is a myriad of benchmarks that are.
@@baldwanderer8188 Yes, some people play one or two games on PC, and I'm an example, but definitely not everyone plays the same one or two games. If he only did the WoW test and the MacBook did well, many people would conclude it's great for gaming and would expect it to do similarly in their games, which it wouldn't. So a test should show as many weaknesses of a device as possible. If you want a fanboy channel that only shows pros, there's the Max Tech channel for that. Asking you to high jump against an Olympic jumper is a good test, because if I were in the market for a jumper I'd want to see where I get the best bang for the buck. Don't forget these MacBooks are priced the same as, or even more than, a lot of these Windows laptops that can do more.
I'm really tired of mobile parts being given the same name (e.g. 3080) as their exponentially more powerful discrete counterparts. They're fundamentally different parts, I feel.
I mean, they’re up to twice as powerful on desktop, but that’s plenty to mislead consumers. AMD and Apple aren’t doing that, though. Just Nvidia. I take issue with your use of the word “discrete” here - the 3080 laptop GPU is still discrete graphics because it’s not on-die with the CPU. Still, I take your point, and I second it.
@@djsnowpdx That's a fair distinction. Is there a category to describe desktop + workstation + server GPUs? The only thing I can think of is 'PCIe GPUs', vs mobile GPUs and iGPUs. There's also the distinction of the specially made rackmount-only versions, like the A100, which, although they use PCIe, are not PCIe-socketable, which further muddies things.
Great perspective, appreciate the continued, in-depth coverage on these. I also appreciate what feels like an objective, enthusiastic investigation of the tech, neither a takedown nor blind exaltation, thank you so much for your work!
Would be interesting if they used teraFLOPS as a unit of measurement to estimate GPU performance. :) Now, it's not the best unit to use, but a FLOPS figure can show 32-bit-precision calculations per second.
Not only is it not the best, teraflops is quite possibly the worst measurement to use, since performance per flop can differ so much between generations and architectures. The only thing it's good for is marketing number shows (also relative estimated performance within one GPU family of the same generation, but that's beside the point).
@@ZerograviTea. Wow. I didn't know it was the worst. So, what is the best unit of measurement for GPU performance? GPU bandwidth (GB/s), throughput, or something totally different?
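For what it's worth, the theoretical FP32 number both camps quote is just ALU count × 2 ops per FMA × clock. A quick sketch using approximate public specs (the clocks especially vary by model, so treat these as illustrative only, not measured figures):

```swift
// Theoretical FP32 throughput in TFLOPS = ALUs × 2 (one fused multiply-add
// counts as two operations) × clock in GHz / 1000.
func tflops(alus: Double, ghz: Double) -> Double { alus * 2 * ghz / 1000 }

print(tflops(alus: 4096, ghz: 1.296)) // M1 Max 32-core GPU ≈ 10.6 TFLOPS
print(tflops(alus: 6144, ghz: 1.545)) // RTX 3080 Laptop GPU ≈ 19 TFLOPS
```

Which is exactly why it's a marketing number: nothing in that formula says how often those ALUs actually get fed with work.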
I'm a video editor; I have used Mac and PC for a long time. I recently built a nice PC, and I game too much on it, lol, so now I'm thinking of getting the M1 Max for portability. Glad to hear it's a beast at what I need it for. This is definitely not for everyone.
It definitely is a beast, especially with software that has native Apple silicon support. If you game, unfortunately there isn't any game that natively supports it yet; if there were, you'd get close to a 3080's performance at far greater efficiency. The biggest advantage of these chips is the performance you get on the go versus any other laptop. The MacBooks just smoke them there, and if you travel a lot, getting a MacBook over the others is going to be a no-brainer. Just remember that you'd have to sacrifice playing some AAA titles, though if Apple themselves released some AAA games for the Mac, I'm sure more game devs would see the potential in the Mac and port titles to it. That possibility definitely exists, but it's going to be a gamble.
Interestingly, Max Tech did a response video to this revealing a surprising and very concerning set of anomalies in the data presented, suggesting either serious issues with testing methodology or massive pro-Intel bias. Either way, an update is urgently needed from the Linus team to respond to those observations and recover lost credibility.
@@skadi7654 No, they misrepresented data for whatever reason. Others have demonstrated the reality; although IMO LTT were raising an important and valid concern about these laptops, they did it in a very sketchy and either underhanded or unprofessional way. See the Max Tech response for more details.
@@Prithvidiamond Every laptop, you say? You do know there are laptops with a desktop CPU and desktop GPU? I mean, they are absolutely huge and barely transportable, but they are still laptops, and they will be 2 to 3 times more powerful than M1 Macs for the same price. It's not a fair comparison, but you might want to lower your expectations about Apple's claim.
@@Natsukashii1111 A laptop and a portable computer aren't the same thing. A MacBook is a laptop. The Clevos you are talking about are "portable" computers with which you can do everything, as long as you have a desk and a power socket. Without those two, it's a big-ass brick good for nothing.
It's no wonder all of the reviews were so glowing when these laptops came out: almost all of them focus exclusively on video editing and the Adobe suite. "Benchmarking" oftentimes is just video render times, and that's frustrating because, as you can clearly see, it doesn't paint a complete picture. The Zephyrus is what, at least $1k less? And it performs largely the same, at the cost of battery life? I guess efficiency is a good thing, but these laptops are only really good for very specific purposes, and I question whether they entirely deserved the ubiquitous glowing reviews when they dropped.
If you also consider programming, the M1 Pro and Max outshine the competition there too. Android and Java projects build significantly faster than on even top-end machines running Linux. Python and TensorFlow workloads are also faster, although there, for some reason, the M1 Pro trains and builds ML models faster than the M1 Max. So in the departments of media creation and programming, these laptops are truly top of the class.
Apple's gig has never been good value. I would actually consider buying it for the hardware if not for the OS lock-in. $1k for weight/battery life/build quality? Sure, why not.
@@Lodinn This is why, despite its many downsides, I still kind of like the 16-inch MacBook from 2019 with the updated keyboard and i7. Boot Camp gives it longevity, and since it runs x86, it runs all modern-day apps. Obviously the efficiency isn't nearly there, but all the other MacBook perks are, which makes it a rather nice machine. Outclassed for sure, by orders of magnitude, by these last few years of laptops, but hey, until Razer or Microsoft can get their build quality as good as Apple's, it's an attractive option.
@@aritradey8334 That's fair! I haven't seen too many benchmarks in the programming world, I guess, which I feel is telling when it comes to the reviewer landscape. With that being said, I remember some of the Hardware Unboxed review, and now this one, and they are such a stark contrast to the uniform praise these received upon launch. Great machines for sure, especially for those whose work falls in the areas where they excel. I guess I'm just rather exhausted by all of the review outlets only reviewing things for videography, simply because that's what they do. Their reviews shouldn't be framed as a general "review" but more as a "videographer review," so that those who don't watch/read 18 reviews (like a lot of us here who do this for fun) don't get the wrong idea.
I did wonder about that; it reminded me of how Volkswagen optimized their software for specific test cases. I briefly considered an M1 for a Linux laptop but quickly reconsidered, if for nothing else than the keyboard, and went for a ThinkPad P series. I don't think these Macs are good general-purpose computers. They are fine for the same tasks a Chromebook is also good for, or for the special video editing stuff. Seems quite niche; lucky for them they can sell it with marketing.
They pretty much had to for this M1 chip anyway. You can't really run widely compatible APIs if you're going to do specialised hardware and also claim it slays a top-of-the-line dGPU while using less than half the power. They just don't tell you that the software to actually get the claimed performance isn't widely available (yet).
@@MLWJ1993 Just wait until the community implements OpenGL on top of Metal, similar to MoltenVK. It's not really "specialized hardware"; it's just a graphics API, and that's how every GPU works. That's why OpenGL support is still ubiquitous on non-Apple GPUs, even though they're architecturally much more geared towards DX12 and Vulkan, which are very similar to Metal (in fact, Metal itself is barely anything more than a deliberately incompatible clone of Vulkan, because Apple is still Apple). The M1 CPU may be awesome at clearing up decades-long inefficiencies of the x86 architecture, but the GPU world has long since progressed beyond that; Apple has no such advantage there. The only reason they are even remotely competitive in a performance-per-watt benchmark is TSMC's 5nm node, to which they currently have exclusive access, but from an architectural standpoint they have a lot of catching up to do with both AMD and Nvidia.
@@DeeSnow97 The M1 just sucks for "community anything," though, since Apple doesn't really do much of anything to let "the community" pick up their slack. Most of the time they specifically go down the path where "the community" can do absolutely nothing, like basic servicing of a device...
Great review, but I'm curious about the differences between the Pro and Max in development benchmarks, e.g. code compilation. This is generally a very large use case for these MacBooks.
Depends on what you're compiling; if your stuff can compile on Mac and doesn't use GPU acceleration, the difference is minimal/non-existent. The efficiency cores on Intel next year will be very interesting, and AMD finally moving to 5nm (though that is supposedly end of year) will be very interesting for the performance jump, including the new cache stacking. It's great getting past the stagnation. I'm probably upgrading at the end of next year and will move from a laptop (i7-9750H, 3 years old now) to a PC since I moved continents, and things like Rider and VS Code having remote support means I can just have the home PC host the stuff (which I do often enough on my NUC when I need to run something overnight).
Check Alexander Ziskind's YouTube channel for many, many development benchmarks run on the M1/Pro/Max machines; most videos are very short and to the point. In general, CPU-bound work sees very little difference between the Pro and Max chips; you end up seeing more differences caused by the number of cores available on the different versions than by the kind of chip. In some cases, especially single-threaded ones like some JavaScript tests, an MBP16 running a maxed-out i9 might beat the numbers, but if the workflow is multithreaded, the M1 chips do it better. Unless your workflow really needs more than 32GB of RAM, a 10-core M1 Pro is probably the "sweet spot" for development at the moment.
My friend is a senior engineer for Apple and he does both iOS and MacOS compiling. He got a Pro for himself and they gave him a Pro for work too because the Max isn't necessary for their developers for the most part. Only certain developers would get allocated a Max but he hasn't heard of any devs getting them.
Things I still want to see covered:
1) How much can the USB-C take? Eight hubs fully loaded with all the native monitors going, with X extra monitors using DisplayLink, while running a USB-connected NAS and a 10Gb Ethernet dongle.
2) eGPU support? If not, what happens if you try it? What if you try to force the Nvidia or AMD drivers through Rosetta?
3) Wipe one of the systems and use it as a daily driver for a week, but this time refusing to install Rosetta. How do the performance numbers change without the emulator running or even listening?
I would love to see deep learning benchmarks one day as well... As a DL practitioner, I'm looking forward to the comparison for both CPU and GPU workloads.
@Christos Kokaliaris You can get these notebooks with 500-nit, 4K 120Hz displays if you are willing to spend the cash. Personally, I use external monitors.
@@MrGeometres If you run stuff in the cloud, nothing beats a $900 MacBook Air. You get a wonderful display, great touchpad, and nice keyboard. At some point you have to run stuff in the cloud if you are doing serious business; it does not make sense to put thousands of dollars into workstations that sit idle most of the time and don't scale at all.
I like your tests and I am not an Apple fanboy, but your results here are very different from most of the other tech YouTube channels that have tested these MacBooks.
Which other tech channels' results differ from this? Post a real tech channel, not a fanboy channel. Before you post, make sure that channel does a variety of reviews, not only praise of Apple products.
@@truthseeker6804 All that I've seen, actually. Here are some:
- ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-g1EyoTu5AX4.html M1 MAX - 32 CORE BENCHMARKS v RTX 3080 & 3090 Will Blow Your Mind!
- ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-OMgCsvcMIaQ.html M1 Max 32 Core GPU v 165W RTX 3080 Laptop
- ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-JM27aT9qhZc.html 16" MacBook Pro vs RTX 3080 Razer Blade - SORRY Nvidia..
- ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-YX9ttJ0coe4.html 16" M1 Max MacBook Pro vs. My $6000 PC
The list keeps going. These results have been out for a while too, so LTT really has no excuse. They didn't use optimized software, they didn't compare laptop performance while on battery, and they didn't use readily available GPU benchmarking software that's already been shown to place the M1 Max around the 3080 level.
@@andremessado7659 So I watched the first video, and the M1 Max actually lost to the laptop and desktop in the export-times chart, but it did well in timeline playback; that's literally the same as this video's DaVinci Resolve section at 5:28. In the second video, the gaming laptop totally destroyed the M1 Max on wall power, not on battery. I skipped the third video from the biased Max Tech Apple-fanboy channel. Regarding the fourth video, the M1 Max lost in all the charts except the 6K BRAW export, which is interesting because the first link you posted showed a faster-than-M1-Max export speed on the GPU. So in summary, from the first, second, and fourth videos: the M1 Max does best at video playback on an editing timeline, but loses to the 3080 or 3090 in video exporting, stabilization, rendering, benchmarks, and everything else.
Why didn't you look at any games that are Metal-optimized? PC-optimized games aren't a fair comparison when looking at the GPU. Apple said on day one that GPU tests in Geekbench were underestimating the performance, and power analysis of the GPU backs up these claims. What about software programming? What about the 3D work in AE, or other programs? I have also seen video editing comparisons that don't use the video engine indicating that the GPU beats laptop 3080 PCs, and also some where the 3080 wins. Certainly, if you need to play AAA games, the PC will win today. Will the game manufacturers rewrite for Apple? Well, I wouldn't bet on it.
Because only one M1-native AAA game exists (WoW, which is ancient), and the rest of the Metal-using games are not ported to ARM, incl. Tomb Raider.
@@tompp2100 Yes, but WoW Shadowlands performs very close to a 12900K and 3080 on the desktop. So it isn't at all clear that the GPU is really equivalent to a 3050 Ti. I am not a gamer, but I did development for a number of years, and I know that taking different approaches to the same problem can often yield performance gains of 2 to 3 times. So when he makes comparisons that are handicapped in several ways, you shouldn't expect optimal results or even realistic, valid comparisons. I also agree that if AAA games are important, the Mac isn't the way to go. But if a game is properly optimized for the Mac, excellent performance is possible.
@@drsmith1988 Not my experience. WoW has a quality setting between 1 and 10, and on 5, on a 1440p screen, I get around 100 fps (M1 Max). Going to native res, it drops closer to 60. The 3080-rivalling performance of 180 fps is only achievable on lower settings, but then the Max runs out of battery in 90 minutes or so; the 32-core GPU drains the battery fast. You get better visuals on Windows, especially on desktop. WoW is the only game I know that is fully native. There are a few others that have Metal support, mostly ports done by Feral, but none of them are M1-native. Been a game developer for 20+ years :)
@@tompp2100 Those numbers are way different from what Matthew Moniz recently posted: 180 vs. 200 fps for the 14" M1 Max vs. a desktop Alder Lake Intel with an RTX 3080. The other games he shows are in the same ballpark as the ones in this video.
@@drsmith1988 that just shows how meaningless a benchmark is without controlling the parameters. ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-KLh3V9NS-Ag.html
I just auditioned for an animation job and was put on a last-gen Intel iMac. I fired up Blender, put a normal map on one surface in the scene, and the GPU almost caught fire; the whole macOS GUI dropped to 0.5 fps. I'm not sh1tting you!!!
The lack of Vulkan, CUDA, or OpenCL support on Macs is absolutely killing multi-platform compatibility even for professional workloads, and games have taken a giant leap backwards.
That is Apple's doing; they just remove and destroy industry standards like OpenCL and OpenGL (and CUDA; they never supported the most powerful GPUs, which are Nvidia's). On Linux and Windows, when you get a new standard, they let you keep using the old one; it doesn't just get removed, which would destroy a lot of software. You can still run 32-bit apps on Windows and Linux very well, and that is how it should be done. Apple is just typically arrogant and does not care about its users. That's why they have never had more than 10% market share globally, not once in the 44 years the company has existed.
@@nigratruo x86 is stagnant and needs a complete reboot... but no one had the guts for it... Apple did, and they now have quite powerful machines that use little power... Perfect? Not yet... but way better for what they are meant for, and on top of that they can game decently... again, not perfectly... yet. But the extra power of the M1 chips, especially the Pro and the Max? Well, it could (should) be interesting for game devs to tap into.
I'm interested in whether the laptops were connected to power. I'm also interested in what the battery percentages would be at the end of the test with all laptops disconnected from power, and how hard the fans blew.
I think it's pretty clear that Macs run much better on battery power than most PCs. At least until the latest Intel and AMD chips are properly put to the test.
@@angeloangibeau5814 I disagree heavily. The point of laptops is portability, but that doesn't mean I will use them unplugged. Battery life is good, but not as important as Apple makes it out to be; it's not as critical as on phones. When I am using my laptop for more than an hour, it's usually on a desk, and almost all the places I visit with a desk have an outlet.
@@davide4725 “Thanks TSMC” You sound like the kind of guy who loves bringing up John Lennon’s wife beating tendencies every time someone mentions they like the Beatles lmao
I am also loving the progress in the ARM space. What really excites me isn't the CPU or GPU in these; it's the optimizations they made to make ARM this competitive. They're getting ASIC-like performance for a lot of low-level stuff.
@@davide4725 I find it funny how you called the other guy "kid" while you have absolutely no knowledge of how R&D, design, audit, documentation, subcontracting, and manufacturing work in the tech industry. "Thank TSMC," lol. Kid, please.
Blender 3.1's Metal support is very nice. I still don't think it beats some of the higher-end RTX cards, but it performs very well, even in the alpha stage.
Anthony, your screen-presence has improved so much from your debut. You’ve clearly gotten much more comfortable in front of the camera, and you provide a wonderfully logical insight (pun intended) into the things you present. I know you’re going by a script, but surely you contribute, and you make it yours.
I have the M1 Max. It's the first Apple computer I have owned, and I am nothing but impressed... Sure, I could find something I don't like about it, but I could show you a far worse list of complaints about my last laptops. How efficient it is does have a lot of value. My last laptop was $2,000 when I purchased it from Lenovo, and when I needed a GoPro for a project and realized its memory was full, offloading the footage killed my laptop battery before it could finish. Even Chrome would noticeably kill battery life. Having a laptop that is useless without being plugged in sucks.
One thing not mentioned in the benchmarks: how do all the laptops (MacBooks and Zephyrus) perform on battery only? Yes, battery life is great, but how is the horsepower of the CPU/GPU affected when running apps on battery? I think some surprises might arise.
Ignore the Cinebench & Blender scores for the M1s. The Embree renderer is Intel-controlled code, and its GitHub page clearly states in huge bold text that it's optimized for x86 SIMD instruction sets, which M1s don't have. This is also a 'real world' workload that almost nobody will use in the real world, as most will use GPU rendering. And why wasn't the block size increased to accommodate the extra accessible RAM in Redshift? In fact, all this review seems to do is cherry-pick a load of sub-optimal (non-Metal, non-ARM) software, which only demonstrates reviewer bias. The ability to paint a deceptive picture with only 'the facts' is usually the territory of lawyers.
These seem like strange results compared to so many other reviews. Why no Windows laptops in the battery life test? Everything run through Rosetta, etc... I love you guys; try to get back to your roots of fair comparison. Until then, take good care.
I'm a software dev who also edits in Resolve and does some Blender in my spare time, and I went for a 10/16-core M1 Pro with 32GB RAM. I don't regret it: I don't intend to game on it, Blender Metal support is coming, and oh boy, that Xcode build time is just fabulous! It's not just the extra CPU; the faster SSD and memory bandwidth make a huge difference and easily cut my build times in half. Picking an M1 Max would just have wasted battery life for me, as the test shown in the video is the best case; the drop in daily workloads is more like 30%.
@@StefanUrkel A huge one! I used both; on the M1 with 16GB, performance was very decent, but swap usage was way too high and memory pressure was often in the yellow area. 32GB is way, way better!
What about software compilation, data analysis, and heavy crunching like that? Can you 🙏 test compiling Linux or a similar workload for the 16" review? Pretty please 🥺 It's a lot more relevant for someone like me.
I agree with what you say: the M1 Max is literally only for professional video editors, which is a super-ultra-niche market; for everyone else, it's not worth it.
I think it'd be more accurate to say media professionals and developers in general. It's absolutely fantastic for professional audio production and software development. Silent the vast majority of the time, and it can easily handle on-location and remote tasks with its awesome battery life, at full power whether plugged in or not. The high-impedance-capable headphone jack and best-sound-in-a-laptop-ever don't hurt either. I think it's important to compare Apples to Apples here (pun intended): they're not designed for gamers, they are designed for professionals. As an equal Windows and macOS user, my experience with these has been top-notch. For pros, Apple has hit a home run here, IMHO. Also, I think the performance-per-watt should not be ignored, and I don't believe it was mentioned: add that factor to the benchmarks and you'd see some very different results. Energy costs money and affects the environment. And a hot, noisy laptop isn't particularly enjoyable to use day in and day out.
Super niche. Because let's face it, the M1 Air can do 4K editing. How many editors need to edit 12 simultaneous 4K streams? Most YouTube viewers don't even watch in 4K yet, rofl. I really wish it performed better at 3D design.
@@wykananda For audio professionals, though, most of them were fine with an older-generation MacBook in a high-memory configuration. Also, for people who aren't video editing/audio professionals, macOS is really, really difficult to use, even more so with ARM. Basic stuff like a volume mixer and any sign of useful window management are absent out of the box. What is the point of spending such a premium to get a subpar experience if you're not a video editing/audio professional?
@@pupperemeritus9189 Hi pupper. I'm not sure I understand your comments. Sadly, the previous MacBook laptop generations were limited to 16GB of RAM, so high-memory configs were simply not possible. Moving to the ARM architecture did not change the underlying operating system, macOS; it simply made the laptop hardware run faster, smoother, quieter, and for much longer on a single battery charge.

As for the difficult-to-use / sound control / window management points: the latest Windows and macOS are both more than reasonably user-friendly and well-equipped in all these areas; both OSs have been around for many years and generations now, and it shows. As a multi-OS power user, I could nit-pick plenty at both OSs here and there, for sure. However, in my experience, with the countless newbies that I've trained and continue to help, macOS gets the nod for getting productive and comfortable more quickly, with less frustration and confusion and fewer problems over the long haul. Let's face it, both operating systems are DEEP. They're both very capable and stable at this stage, but either will take time and effort to learn to get the most out of it.

Curiously, my current "go-to" Windows-based laptop is a 2015 MacBook Pro running Boot Camp; ironically, it's easily the best Windows laptop I've ever owned: cool, quiet, fast, stable, good battery life, well-built, expandable, and, of course, it runs macOS like a champ too. I'll likely get another 3-4 good years out of it before I hand it down the line. IMO, the 2015 MBP was the best overall professional laptop ever made for Windows, macOS, or Linux, until now. While I can run the ARM version of Windows on the latest MBP via Parallels and so on, I'll have a new laptop king if/when Microsoft gets Windows fully up to ARM speed and these new killer Macs can boot into it natively.
@@bear2507 The illusion is that Apple claimed the performance is about the same as an RTX 3080, when the M1 barely beats the RTX 3060; it's not even close to the RTX 3080, and I mean the mobile RTX GPU. An RTX is a GAMING GPU, so when they made these claims, people obviously thought about its gaming performance. They should have compared it to a professional GPU like a Quadro instead of being either brave or stupid enough to compare it to an RTX.
@@foxley95 Yeah, I'll go tell my research lab to shut down our datacenter with hundreds of 3080s, because some kid on YouTube said these GPUs are for games only and not general compute. The comments are full of children who have never touched anything outside Minecraft but have an opinion on everything, hahah.
Oh wow, I was watching this video and couldn't help but wonder if I'm the only one shocked by this review! I respect and admire Anthony, but I believe the software used for these tests was cherry-picked and doesn't show the full potential of the MacBook Pro! I came down to the comments and was shocked that not a lot of people are talking about this, and then I saw someone mentioning the channel Max Tech discussing this review. I went to watch it, and indeed I agree with them.
@@truthseeker6804 You don't sound very objective yourself. If you actually looked around at other channels, the ones that are "PC fanboys" also did more objective tests with results similar to Max Tech's.
@@solidsn2011 I've watched some videos from other channels, and the results were similar to the ones here. Can you post an objective video you think differs?
@@truthseeker6804 I found this review lacking and leaning, specifically in real world function. Bropants, actually hook up all the monitors, play the games that currently work, put it against more PCs, more price ranges, yadda. I thought they did better on previous M Chip reviews. Meh, I can’t judge. I recently followed this channel because I thought it was actually called “Linux Tech Tips”. *I know.*
I'm primarily interested in the M1 for ML workloads. While it offers potential for edge applications, most enterprise ML is on RHEL on the x86_64 architecture, so the Intel CPU Macs end up being the better choice for a development workstation (Windows just ain't it, and find me an IT department willing to deploy native *nix laptops. I'll wait). That dynamic will probably shift over time, but I'll probably treasure my Intel MacBooks for longer than I should. All this is to say, LTT: love your content, but I would appreciate reviews with a data-scientist perspective.
Sorry for such a straightforward question. My work is based on Java most of the time. Are compile times better on the M1? Right now I'm on a 10th-gen i7-10700K. Should I upgrade to Alder Lake or M1?
The CPU stuff on NumPy seems really promising. But training on the GPU seems rather meh; at least in computer vision, the performance improvement from tensor cores is just way too large. It seems like performance per watt is roughly on par with a full-power 3090, not exactly an efficient GPU. But IMHO any GPU-heavy workload is much better suited to remote servers than a laptop.
Your benchmarks are inconsistent. You include the 5950X/3090 in some of the tests, but not all. I'm particularly interested in the Resolve results for this. Could you include that in the 16" review?
@@harsht6583 Did you just call Blender fringe and unoptimised? I don't even work in or care about 3D modelling, rendering, or creative work in general, and even I know about Blender.
That's the first time I've heard of someone preferring the silver color. I also got a silver one and, looking around online, it seems like I'm way in the minority with that decision.
The M1 can throw a lot of weight around as a DAW host, especially running Logic and AS-native plugins. It's reportedly less well suited to realtime audio tasks (like recording live guitars through software amp sims at low latency in a busy session), but it absolutely pummels at mixing and composing tasks that don't require super-low RTL figures under load. The 32GB Max variant will benefit a serious composer who wants all of the orchestral libraries and soft synths loaded at once, although all that GPU will be drastically underutilized in the same scenario.
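For anyone wondering what "super-low RTL" means concretely: the floor is set by buffer size over sample rate, one buffer in each direction. A back-of-envelope sketch (this deliberately ignores converter and driver overhead, which add a few more milliseconds on top):

```swift
// Round-trip latency floor: one buffer of audio each way at the
// interface's sample rate. Real RTL adds converter/driver overhead.
let bufferSamples = 128.0
let sampleRate = 48_000.0
let oneWayMs = bufferSamples / sampleRate * 1_000 // ≈ 2.7 ms
print("RTL floor ≈ \(2 * oneWayMs) ms at a \(Int(bufferSamples))-sample buffer")
```

Mixing and composing tolerate big buffers (512 samples and up), which is why the M1 shines there even if tracking through amp sims at 32-64 samples is shakier.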
The progression of Anthony and how much better/more confident he has become on camera should be an inspiration for everyone to practice confidence in social settings (and it's even harder on camera, when you're staring into a lens instead of talking to people).
So, the short of it I'm getting is that Intel has just been intentionally selling bunk laptop CPUs until the M1 came out, and now they're like "Oh, you wanted a serious mobile chip?"
Could you guys test War Thunder when you do the 16-inch? It supports Metal, is usually HEAVILY single-core limited, and is quite interesting to test. Pro tip: the built-in benchmark gives different results than using Test Drive.
Seriously? So, so tendentious. Using only unsupported apps, unoptimized games, and some… uh, apps that nobody uses. Where are DaVinci Resolve (Mac vs. Windows), Premiere Pro, Final Cut, Photoshop? I love LTT content, but either you wanted to cause a $hitstorm to hype the YT algorithm or make some more on ads… you must be blind not to see how tendentious this "review" is.
DaVinci Resolve is at 5:29, Final Cut Pro at 4:40. Did you watch the video at all? It's funny you claim he's using unsupported apps; is Final Cut supported on Windows? Is it his fault that macOS is unoptimized for most games?
Anthony, you tested Rogue Squadron III on a build from just before MMU support was added to JitArm64 :( That means it's likely running a lot slower than intended.
It doesn't really matter, though. You can't even emulate those games at full speed all of the time on the most overkill PC setup money can buy. They just make very heavy use of pretty much every trick you can pull off with the GameCube's hardware, which is just not very comparable to current-day hardware at all.
@@MLWJ1993 No? Modern x86_64 CPUs have no problem emulating Rogue Squadron III. It's just that for ARM64 chips, the Dolphin devs hadn't implemented accelerated MMU emulation until very recently, which made any ARM64 chip very slow in that game, even the M1.
@@neutronpcxt372 Pretty sure you run into slowdowns in all those games during transitions. It's not unplayably slow, but definitely not full speed everywhere. That's why the forums are full of people asking whether what they see is expected behaviour on their overkill hardware, and of devs answering that no hardware is currently capable of running the game "smoothly" when that's the specific requirement.
Watching Anthony go from absolutely HATING being on camera to being so much more comfortable that he cracks me the eff up with an intro like that! Bravo Anthony! 👏 👏 👏 I almost spit out my coffee lol'ing at that. Great work.
And THAT should be a huge point, actually. We're talking about laptops here; they're meant to be at least partly used without being plugged in. And most Windows laptops are SO much slower on battery, while MacBooks run just the same and still last literally 3 times longer.
@@Raja995mh33 Yes, but when you have a huge workload you can use a workstation or a desktop in the PC world; in the M1 world that is not possible. Also, a desktop graphics card or CPU is more powerful than a laptop one and destroys the M1 Max even harder. Mac users have no alternative to what Apple gives them; PC users have many options. ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-rQJTkWWkc0g.html&ab_channel=TallyHoTech
@@inwedavid6919 Oh wow, dude, a desktop is more powerful than a laptop. No shit, Sherlock. We're talking about laptop vs. laptop here, NOT desktop vs. laptop. That is the whole point. And the point is also that you CAN do these heavy workloads on a MacBook without a problem; it will do them just as fast and doesn't take like 10 times longer, unlike many Windows laptops that like to throw around big numbers that only count while plugged in.
@@Raja995mh33 Is your country that starved of power outlets, though? I've literally never had a problem with laptops; mine even has enough performance & battery life unplugged for the small commutes between places that have a power outlet available to me 🤔
@@ezicarus8216 Just because you may not do that doesn't mean no one does. Many people just don't do it BECAUSE their Windows laptop becomes useless at these tasks when running on battery. I know a lot of people who work and do heavy workloads on the go, often on battery, and all of them use MacBooks for that. And even when you're at home or wherever, it's nice to know that you can just unplug your machine and work a while in the living room for at least 2-3 hours without any bottlenecks or throttling.
I'd really love a software dev take on this. For my use case, a fast CPU, good battery life, and 64GB of RAM are compelling, but my workloads are distinctly not video rendering.
Developer here. I wouldn't buy any of these besides the base-level MacBook non-Pro. You can literally code on a Raspberry Pi; unless you're compiling something crazy complex like an entire browser, you're not going to feel the difference, so why pay extra for literally nothing? A USB-A port would have been a compelling addition, but oh well.
Other developer here. I've never found myself desperate for a USB-A port while developing, but I have definitely found a use for a better CPU and more RAM. Not sure which serious developers are developing on trash hardware, tbh.
@@JackiePrime Web, for example. I don't develop on trash hardware, because I can afford better equipment, but if I still had my old FX-8320 it wouldn't slow me down in any way; peripherals are way more important at that point. Also, every single hardware debugger uses USB-A, and even if you just want hobbyist stuff, have fun hunting down a USB Mini-B (not Micro-B) to USB-C cable just because you can't use the included Mini-B-to-A one. But it does make sense: if you only develop for iOS (which is literally the only reason I've ever considered buying a Mac), then you won't run into any of those issues, and Xcode being a hot mess does necessitate a faster CPU and more RAM. But there's a lot more to development than just Apple's walled garden, and if you step out of it, it's a lot more important to be able to mess with any device you want to.
Also a developer here. The GPU on the Max is absolutely useless for my line of work, and 64GB of RAM is overkill. 32GB of RAM and the 10-core Pro is plenty; I plan to keep it for about 4 to 5 years.
Another developer here. I have the M1 Max with 64GB, the 32-core GPU, and a 1TB SSD. This setup is overkill, but first, I can afford it, and it feels good not having to worry about performance while working. On the technical side, running WebStorm and other IDEs, multiple Node apps, multiple Docker containers, Electron apps that suck like Slack, etc. takes a toll on any computer. If you can afford it, especially since software engineering is a well-paid job, plus the resale value down the line, why not?
@@ziomalisty yeah that's true, but if we wanted to compare and test the real GPU performance of the M1 Max it wouldn't make any sense to not run games that are actually native to the ARM architecture
Divinity 2 for M1 is a mobile port with touch controls and a heavily customised rendering backend just to get it running. You can't actually compare it to Divinity 2 on Windows, since that's not apples to apples.
@@aymen5133 WoW isn't exactly a modern game, so to speak. The M1 runs it well, though; sadly, so does any other PC in existence. At similar settings, any Windows laptop will run into a CPU bottleneck before the GPU even gets to stretch its legs, barring use of "RTX", but the M1 can't use that setting 🤔 It really just demonstrates the point that being limited to the Metal API isn't really in the M1's favour. The actual performance results aren't really important in those scenarios unless you just want to know whether it can run those programs. In that case, the answer is that it can; just don't compare it to hardware costing 1/3rd the price, because that just reveals it as terrible value for *just* gaming (and anyone buying an M1 should already be aware of that up front).
After rewatching this review, I went ahead and bought the base model of the 14-inch M1 Pro. I will be doing more CPU- than GPU-heavy work, but I didn't think the 2 extra cores were worth the money.
I know it's not the same as a real thorough test, but most benchmarks agree that the M1 (any variant) runs virtually identically both plugged and unplugged.
Absolutely. One of the main use cases for a laptop is unplugged. The first test they should do is performance testing fully charged and unplugged, then while charging, and then fully charged but plugged in. Large performance differences can show up across those situations, and testing only while plugged in (or only unplugged) can skew results toward the tester's desires. I think Apple's claims might even be correct IF the laptop they were comparing against did poorly while unplugged, so Apple's results would look more impressive.
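The simplest version of that protocol is just timing one fixed workload in each power state. A minimal sketch (my own illustration, not anything LTT uses):

```swift
import Foundation

// Time an identical, fixed CPU workload once on wall power and once on
// battery; a large gap means the machine throttles when unplugged.
func fixedWorkload() -> Double {
    var acc = 0.0
    for i in 1...50_000_000 { acc += sin(Double(i)) } // arbitrary busy loop
    return acc
}

let start = Date()
_ = fixedWorkload()
print("elapsed: \(Date().timeIntervalSince(start)) s")
```

Run it in each of the three states listed above and any throttling behaviour shows up immediately in the elapsed times.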
This review doesn't make sense at all. You clearly don't use a Mac. And you didn't even mention the fact that the MBP does all of this unplugged. I have a custom-built PC, a Mac Pro, and 2 MacBook Pros. These comparisons are weird af.
Apple always used video editing to showcase the M1 Macs; the problem is that much of that has nothing to do with the M1 architecture, but rather with the dedicated ProRes accelerators. Apple was happy to allow the misattribution in its marketing. Macs are, sadly, still poor value for 3D rendering. Something very similar happened with the older Intel Mac Pros.
For those who just want a quiet, cool-running portable solution primarily for video editing and/or music production, who need as much battery life as possible while taxing their systems unplugged, and whose codecs and plugins are supported and optimized, the M1 Pro/Max would be very good. For things like 3D rendering & gaming, better options and value can be found elsewhere.
Are you guys getting paid by Intel and AMD to make misleading tests? Wow, I'm a huge fan of LTT, but you guys really need to make an explanation video of all of these misleading tests.
@@truthseeker6804 No, it's not. I watched this video and was kind of disoriented the whole time. This is nothing like a proper comparison to benchmark the M1 Mac. There are so many other YouTubers who have shown how fast the M1 MacBooks are when using software that's optimized for BOTH systems. The M1 Max is on par with the 3080 in optimized tests, and the CPU is on par with comparable Intel and AMD CPUs (core count + thread count). So there is either bias or they are being incredibly lazy in their testing.
@@andremessado7659 The M1 series is optimized for video playback and ProRes exporting; that's about it. From most tests I've seen, they do best in video playback on an editing timeline but lose to many laptops and desktops in export times for non-ProRes files, which are what most people use anyway. Aside from video editing, it doesn't do as well in much else. It's not really comparable to any gaming CPU or GPU, simply because it's optimized for only a specific set of tasks.
@@truthseeker6804 Maybe you just don't listen. We have enough data from other channels, and this comparison really differs; the numbers are far from what we see everywhere else. ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-mNbitDYOezU.html
@@proschek I'm not going to waste my time on the Max Tech Apple fanboy channel. If you have another channel that is neutral and covers a wide range of products, you can post a link to it.
Very misleading review. I want to see real-world (not really old) results with software optimized for both Windows and Mac; in other words, I want to see a fair comparison between laptops to get a better idea of what the new MacBooks are capable of.
The 2020 M1/16GB has very similar performance in AE and Daz/Blender to our 2016 3.6GHz, 6-core i7, RTX 2070 PC. However, Premiere 2020 is blazingly awesome on the M1, although the PC exports in half the time.