Stable Diffusion - Mac vs RTX4090 vs RTX3060 vs Google Colab - how they perform.

Render Realm
3.2K subscribers · 29K views
Published: 4 Oct 2024

Comments: 66
@cmdr_stretchedguy · 2 months ago
My experience with my PCs: Win10, Fooocus 2.5.0 (SDXL 1.0), 1024x1024 image, base "realistic" preset, no other models or LoRAs, latest GPU drivers installed.
- Ryzen 5600X, 32GB DDR4-3200, 8GB RX 7600 with PRO 24.Q2 drivers, Fooocus AMD version: avg 261 seconds per image
- Same PC with the 24.7.1 gaming driver: 318 seconds per image
- Same PC with ComfyUI ZLUDA, PRO drivers, same model and settings: 340 seconds per image
- Ryzen 5600G, 16GB DDR4-3200, 8GB RTX 3050, standard Fooocus CUDA version: avg 33 seconds per image
Newegg has a refurbished 12GB 2060 for $170, so I have one on the way for additional testing this week.
@Kowalski301 · 6 months ago
Great video! Another great budget card these days is the 4060 Ti with 16 GB.
@Stvcloud · 4 months ago
The problem with that card is the memory bandwidth; apparently a 3060 Ti 12GB runs much better, and you could probably pair two of them for 24GB of VRAM, which might reduce the VRAM-size issue.
@ronuhz · 2 months ago
The Mac's speed can be improved by running macOS Sequoia and by converting the Stable Diffusion model from Safetensors (or any other format) to CoreML so it can run on the Neural Engine instead of the GPU or CPU. These can vastly improve performance.
@Gabriecielo · 6 months ago
The problem with the Mac is that, even now, PyTorch is still only partially supported, and speed is still as slow as you tested. I'm using an M2 Mac Studio with 32GB RAM.
@arturabizgeldin9890 · 9 months ago
Thanks for this comparison, really helpful!
@psynchro · 1 year ago
So helpful! This is exactly what I would have done had I set up a test to make a decision. I have the same MacBook as you and have only tried SD on it and on my RX 570 in a PC system using A1111. Since the 4070s went on sale, I have been asking myself if I should upgrade my PC. From Tom's Hardware benchmarks (which are not as helpful as yours) I can see the 4070 would probably land between the two Nvidia cards you tested. Very, very helpful, also doing SDXL and hires x2. Thanks!
@deneguil-1618 · 8 months ago
The 4070 is indeed between the 4090 and 3060 for AI, but much closer to the 4090. Official specs for these cards are 104 TOPS for the 3060, 1321 for the 4090, 466 for the 4070 and 568 for the 4070S. For comparison, a 2080 Ti has 114 TOPS and the 3090 285 TOPS, although the 3090 has much more memory than every card here other than the 4090.
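As a quick sanity check of the TOPS figures quoted above, the implied speed ratios can be computed directly. This is only a rough sketch: TOPS is a coarse proxy and real Stable Diffusion throughput also depends on VRAM, bandwidth, and software support.

```python
# Official tensor TOPS figures quoted in the comment above.
# TOPS is only a coarse proxy for actual SD image-generation speed.
tops = {"RTX 3060": 104, "RTX 2080 Ti": 114, "RTX 3090": 285,
        "RTX 4070": 466, "RTX 4070S": 568, "RTX 4090": 1321}

baseline = tops["RTX 3060"]
for card, t in sorted(tops.items(), key=lambda kv: kv[1]):
    print(f"{card}: {t} TOPS, {t / baseline:.1f}x a 3060")
```

By this crude measure the 4070 lands at roughly 4.5x a 3060 and the 4090 at roughly 12.7x, consistent with the "over 4x faster" claim below.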
@Slav4o911 · 7 months ago
For SDXL at 1024x1024, the 3060 doesn't have enough VRAM, which is why it performs badly. I use either 800x600 or 1024x768, so the model doesn't go above the VRAM limit and it performs quite well. With SD 1.5 models, the RTX 3060 doesn't have these resolution problems and doesn't go above the VRAM limit regardless of what you do.
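A back-of-envelope way to see why dropping the resolution helps: activation memory in diffusion models scales roughly with pixel count, so the resolutions mentioned above carry noticeably smaller footprints than 1024x1024. This is a rough sketch; actual VRAM use depends on the model, attention implementation, and batch size.

```python
# Rough proxy: activation memory scales approximately with pixel count,
# so lower resolutions leave headroom under a 12GB VRAM limit.
resolutions = [(1024, 1024), (1024, 768), (800, 600)]
base = 1024 * 1024
for w, h in resolutions:
    print(f"{w}x{h}: {w * h / base:.0%} of the 1024x1024 pixel count")
```

1024x768 is 75% of the full pixel count and 800x600 about 46%, which is why those sizes stay under the 3060's limit.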
@mrquicky · 6 months ago
I love the comparison and pretty much agree with your conclusion. Having seen a previous comparison, I'd go with the Mac M3 Max as the top tier choice. The 4090 could just barely keep up with it.
@mrbabyhugh · 5 months ago
7:57 I've been using Colab to familiarize myself with SD. I feel I know enough now and want to start generating locally. I don't want to invest too much initially, so I'm trying to figure out the best low-end card with good-enough performance, and I thought the 3060 would be a good start, nothing less. Your test suggests that. Now I'm trying to decide between the RTX 3060, RX 6600 and Arc 580. I'm next going to see if Arc can generate at all and, if so, with what results. A 5700 XT should be good too, but the power consumption will for SURE be crazy.
@relativityboy · 11 months ago
Perfect video. Thanks for the information!
@msEllegant · 3 months ago
Thank you.. exactly what I was after..😅
@WabiSabiVibes_ · 1 year ago
What is your opinion on the 3090 GPU? It is also very efficient and has a perfect price/performance ratio (especially used) and 24 GB onboard.
@wwk279 · 1 year ago
~$750 for a used 3090 is a pretty good deal. I was planning to get a second-hand RTX 3080 Ti for the best price/performance compared to the 3090, but changed my mind when I saw an SDXL GPU benchmark video: with images at 1024x1024 (the standard resolution for the best quality in SDXL), VRAM consumption can go up to 17.2 GB, which is insane. An RTX 3090 can handle that well for several years.
@JoernR · 10 months ago
@wwk279 - You did right. The 3080 Ti is a great overall performer, but its 12 GB of VRAM is a real bottleneck if you wish to use SDXL plus a refiner or other VRAM-heavy combinations.
@leonardommarques · 10 months ago
this is exactly what I needed thanks
@NeonPixels81 · 6 months ago
For some reason, I get much better aesthetics on my RTX3060 Ti than my RTX 3070, even with all of the exact same settings. Can't figure out why for the life of me.
@minhngoctran7271 · 1 year ago
Very nice video, this is what I need. Maybe do a comparison between AMD, Arc and Nvidia if you can. I thought of buying a 3060 12GB for SD; the 4060 16GB looks nice, but I will wait for the 5060 or better if they have more VRAM.
@klaustrussel · 1 year ago
I'm also wondering how the 4070 would perform in comparison to the 3060
@Ir-jq5kb · 1 year ago
4060ti 16gb has really bad price/performance ratio. Very power efficient though
@Steamrick · 9 months ago
@Ir-jq5kb If you only care about Stable Diffusion, the 4060 Ti is a good entry card because it gets you 16GB of VRAM on a relative budget. The other option is a used 3090. VRAM is incredibly important for Stable Diffusion because the moment it has to start pushing stuff to system RAM, performance tanks.
@theocramez · 8 months ago
I would have LOVED to see AMD and Intel in this mix as well, +1
@deneguil-1618 · 8 months ago
@klaustrussel Going by official specs, over 4x faster: the 3060 has 104 TOPS and the 4070 466. It seems like Nvidia put a lot of effort into upgrading the tensor cores for the 40 series, since even the 3090 Ti (3rd-gen tensor cores) has 320 TOPS and the 4060 Ti 353.
@skozmin · 3 months ago
thank you, very very much
@tripleheadedmonkey6613 · 1 year ago
Try running ComfyUI on your Mac. I've heard support is much better. Also, using the StableSwarm fork of ComfyUI you can use multiple PCs and GPUs for parallel processing.
@losing_interest_in_everything · 11 months ago
Multiple PCs? Could you please clarify your statement?
@JoernR · 10 months ago
The issue with ComfyUI is that it is even less comfy than A1111.
@Slav4o911 · 7 months ago
@@JoernR It should be called unComfyUI.
@DailyProg · 10 months ago
More of this please
@forphp8233 · 9 months ago
Thank you!
@SunnyWinterz · 10 months ago
Great video comparison; however, it's worth noting that Macs with M2 chips are significantly faster with Stable Diffusion, and I imagine the M3s will be insane. Some people have shared their M2 SD performance online, such as on Reddit. So yeah, don't get an M1, but the M2 or M3 are excellent for SD.
@why__die · 10 months ago
Amazing video, thank you. Would love to see a comparison with the new M3 chips!
@haris6523 · 5 months ago
I didn't understand the Colab part: does it have a limit on generated images per month?
@venue44 · 7 months ago
thanks for this!
@zedboiii · 1 year ago
Is generation on free Colab slower than with the $10 subscription? I've been using the free version for a while and still can't decide if I want to subscribe. I'm not a heavy user, just generating images as a hobby.
@ozstockman · 1 year ago
I run it with the same RTX 3060 12GB and it is good, but my processor is an AMD 7950X. BTW, an RTX 4090 costs less than 2000 euros and an RTX 3060 should be about 300 euros. The price you mention is way too high.
@-RenderRealm- · 1 year ago
Well, these were the prices for the whole system when I bought them: not just the GPUs, but also CPU, memory, motherboard, power supply, etc., so for the whole machine. And, unfortunately, Austria isn't a cheap place for buying hardware; I'm sure you could get it for less money in the US or other places.
@JoernR · 10 months ago
@-RenderRealm- - It's probably cheaper anywhere the EU plague isn't raging.
@cakrulgaming · 2 months ago
RTX 4090 vs RTX 3060: more than 5 times the price, but only about 3 times the performance.
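The diminishing-returns point can be put in numbers. Taking the commenter's rough figures (about 5x the price for about 3x the performance), the value per euro works out as:

```python
# Commenter's rough figures, not measured benchmarks.
price_ratio = 5.0  # 4090 costs ~5x a 3060
perf_ratio = 3.0   # 4090 delivers ~3x the SD performance

# Performance per unit of money, normalized to the 3060.
value = perf_ratio / price_ratio
print(f"4090 gives {value:.0%} of the 3060's performance per euro")
```

By these numbers the 4090 delivers only about 60% of the 3060's performance per euro, which is the usual trade-off for top-tier cards.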
@4kaSOSiso · 1 month ago
Going by other benchmarks, the RTX 4090 should be priced at $1500 instead.
@coloryvr · 1 year ago
Big Fanx!
@yesmilan8795 · 6 months ago
good job
@SyamsQbattar · 5 months ago
Which one is better, the Rtx 3060 12GB or 3060Ti 8GB?
@aphotographerspodcast · 4 months ago
The 12GB. VRAM matters more for this.
@zoulouad5706 · 8 months ago
Good morning, I'm just wondering how long the RTX 3060 took to generate a 1920x1080 image?
@Slav4o911 · 7 months ago
You don't generate high-resolution images directly in Stable Diffusion (it will output bad results, regardless of hardware); you use an upscaler. That works well with SD 1.5 at all its recommended resolutions. But for SDXL the 3060 doesn't have enough VRAM, so it works slower at resolutions higher than 1024x768 and 800x600, and with SDXL it goes above its VRAM limit when you use a ControlNet. I think for SDXL you'll need an RTX card with 16GB of VRAM, otherwise the model will slow down once you go above the VRAM limit; it'll still generate images, but more slowly. There are different techniques to upscale images, so the 3060 can upscale images to 8K resolution and above.
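The "generate low, then upscale" workflow described above replaces one big generation with a small generation plus a resize pass. As a minimal pure-Python illustration of just the resize step, here is a toy nearest-neighbor upscaler; real workflows use learned upscalers (ESRGAN, SD's hires fix), which add detail rather than merely duplicating pixels.

```python
def upscale_nearest(pixels, factor):
    """Nearest-neighbor upscale of a 2D pixel grid by an integer factor.
    Toy illustration only: learned upscalers synthesize new detail,
    this just repeats each pixel into a factor x factor block."""
    out = []
    for row in pixels:
        wide = [p for p in row for _ in range(factor)]
        out.extend([wide] * factor)
    return out

img = [[1, 2],
       [3, 4]]
big = upscale_nearest(img, 2)
print(big)  # each source pixel becomes a 2x2 block in a 4x4 grid
```

The point of the workflow is that the expensive diffusion pass runs at a VRAM-friendly resolution, and only the cheap (or tiled) upscaling step touches the full output size.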
@aniln7452 · 22 days ago
@Slav4o911 Hi, I'm a college student planning to do my ML project. What's your opinion on using a Ryzen 7600X processor, 16GB RAM and an RTX 3060 GPU for ML projects, particularly for a medium LLM? Can it handle it with its 12 GB of VRAM? Can I keep my PC intact without destroying it...
@gigend · 7 months ago
How about the 3060 12GB vs the 4060 8GB?
@FielValeryRTS · 3 months ago
The 3060 12GB is better for Stable Diffusion.
@wingofwinter888 · 1 year ago
Google Colab is not free anymore.
@fajn · 7 months ago
Are the Macs still so slow?
@gorovitz · 7 months ago
This Mac's price is the same as just one 4090, so it's OK.
@thiago8497 · 3 months ago
Hello, help me choose a video card for Stable Diffusion. I have a choice between an RTX 3060 12GB, Intel Arc A770 16GB and RTX 4060 8GB; what do you advise?
@ArayaRevilo-uf3sb · 2 months ago
The 3060 12GB; prioritize whatever has the most VRAM.
@bebert0712 · 10 months ago
M1 for artist XD
@havemoney · 7 months ago
3060 8GB or 12GB?
@FielValeryRTS · 3 months ago
12gb. Better.
@danielpedro2867 · 7 months ago
With the price of one 4090 you can buy four 3060s.
@x0vg5hs1 · 4 months ago
"smack lips" Mac is slow. I agree
@mick7727 · 5 months ago
This didn't help at all. Why not use the same amount of RAM with different GPUs? or do a test with varying RAM? You've got too many variables.
@FielValeryRTS · 3 months ago
Those variables didn't matter much. He used 32GB in two of the systems, 64GB for the 4090 since it might be useful with such a powerful GPU, and the last one used Google Colab, so 16GB is enough.
@WallkarLab · 8 months ago
Do not purse your lips.
@NextGenAge · 5 months ago
Mac overpriced !