Progressive Distillation for Fast Sampling of Diffusion Models (paper summary)

DataScienceCastnet · 4.6K subscribers
7K views

Having a bash at explaining arxiv.org/abs/2202.00512 (and the follow-on paper arxiv.org/abs/2210.03142).
I've been busy working on johnowhitaker.github.io/tglco... and hoping to record a bunch of videos for that soon - this informal video is mainly a way to test the recording flow and get me back in the zone for making videos :) Nonetheless, I hope it is interesting and useful! If you have questions, please leave them in the comments below.
PS: For the curious: η is a lowercase 'eta', but that kind of knowledge vanishes as soon as the camera is on me!
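For context, the core idea of the paper is a loop: train a student to match two steps of the teacher in a single step, then make the student the new teacher and repeat, halving the sampling step count each round. A toy illustration of that loop (my own simplification using a linear "denoising step", not the paper's actual training code, which fits a neural student by regression):

```python
# Toy sketch of progressive distillation, assuming a linear "denoising
# step" x -> a*x. With linear steps, the student that matches two teacher
# steps can be written in closed form (factor a**2).

def make_step(a):
    return lambda x: a * x

def distill_round(teacher_step):
    a = teacher_step(1.0)        # recover the linear factor
    return make_step(a * a)      # one student step == two teacher steps

def distillation_schedule(initial_steps, rounds):
    """Sampling step counts after each round: halved every time."""
    steps = [initial_steps]
    for _ in range(rounds):
        steps.append(steps[-1] // 2)
    return steps

teacher = make_step(0.9)
student = distill_round(teacher)
print(abs(student(5.0) - teacher(teacher(5.0))) < 1e-9)  # True
print(distillation_schedule(1024, 7))  # 1024 halved down to 8 steps
```

In the real method the "step" is a full denoising update of a diffusion sampler and the student is trained with a weighted regression loss, but the halving structure is exactly this.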

Games

Published: 10 Oct 2022

Comments: 12
@yonistoller1 · 1 year ago
Hi, thanks for this great video! Do you know where the weight of the loss function (w(lambda)) is derived/explained? I'd like to better understand it, and whether I should use it.
@Adreitz7 · 1 year ago
Thanks for the explanation. I have a question, though. After training the student model on two steps of the teacher, why do they make the student the new teacher rather than continuing to train the student on FOUR steps of the original teacher (and so on)? My naïve feeling is that training student 2 on two steps of student 1, and so on, could introduce performance loss relative to the original teacher.
@maxim_ml · 11 months ago
It does optimize the training, though.
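One way to see the trade-off this thread is discussing (my own framing, not an accounting from the paper): each training target for the student costs some number of teacher-network evaluations. Progressive distillation always composes just 2 steps of the latest teacher, whereas distilling round k directly against the original teacher would mean composing 2**k of its steps per target:

```python
# Hypothetical per-target cost in teacher evaluations at round k:
# progressive distillation vs. distilling every round directly
# against the original teacher.
rounds = list(range(1, 6))
progressive = [2 for _ in rounds]    # always 2 steps of the latest teacher
direct = [2 ** k for k in rounds]    # 2^k steps of the original teacher

print(progressive)  # [2, 2, 2, 2, 2]
print(direct)       # [2, 4, 8, 16, 32]
```

So the progressive scheme keeps each round's training cost flat, at the possible price of compounding student-to-student error, which is the concern raised above.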
@haikutechcenter3349 · 1 year ago
Thanks for a great paper overview. Do you know if anyone has applied progressive distillation to Stable Diffusion? Is this something Stability would have to do, retraining the entire model? Or could there be a way to do transfer learning on the existing released Stable Diffusion model?
@datasciencecastnet · 1 year ago
You could begin with the current model and apply this progressive distillation to it - I believe work is in progress on that :) There is also a WIP pull request for it in diffusers: github.com/huggingface/diffusers/pull/1010
@azureprophet · 1 year ago
Pretty certain that comes out in a week or so.
@datasciencecastnet · 1 year ago
@@azureprophet The paper has just been updated and now covers Stable Diffusion, so agreed - we will be using this very soon, I'm sure :)
@kornellewychan · 1 year ago
Great!
@rewixx69420 · 1 year ago
Can you explain the DDIM paper for someone in the comments?
@datasciencecastnet · 1 year ago
I'll take a look and see where I can slot it in :)
@rewixx69420 · 1 year ago
I don't understand it - the paper, the blogs, and the code implementations are all different stuff.
@idan957 · 10 months ago
11:33