But I can't share my secrets...! Unfortunately, very little processing is actually being done on the Mac I use for these videos. For my setup I'm using a remote Windows machine on the local network that has a GPU, so all the heavy lifting happens there. The Mac is just a portal to it all.
Hey - if I understand your question correctly, the best way I can explain it is with this GIF: giphy.com/gifs/CSXFKlBdguJZqMqbJc Basically, just drag and drop the saved JSON file onto the workspace. Let me know if this does the trick!
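For anyone curious what's actually inside that saved file: it's just the node graph serialized to JSON - a list of nodes plus the links wiring them together. Here's a heavily trimmed, illustrative sketch (node types and values are just examples, not a complete workflow):

```json
{
  "last_node_id": 2,
  "last_link_id": 1,
  "nodes": [
    {
      "id": 1,
      "type": "CheckpointLoaderSimple",
      "pos": [100, 200],
      "outputs": [{"name": "MODEL", "type": "MODEL", "links": [1]}],
      "widgets_values": ["sd15.safetensors"]
    },
    {
      "id": 2,
      "type": "KSampler",
      "pos": [400, 200],
      "inputs": [{"name": "model", "type": "MODEL", "link": 1}]
    }
  ],
  "links": [[1, 1, 0, 2, 0, "MODEL"]],
  "version": 0.4
}
```

That's why drag-and-drop works: the app just reads this graph back in and rebuilds the nodes on the canvas.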
Thanks for this! I was seeing some example videos where generated scenes "morph" into each other. Could you give some pointers on the best methods to achieve that? In A1111 it seems to be img2img with the FILM interpolation method.
Ah, well, a video on frame interpolation was on the list - you just helped me bump it up in the queue :-). I think what you mean by the morphing effect is essentially prompt traveling/prompt scheduling. I touch on it a bit in this video: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-KIOmQr8_BPE.htmlsi=HhHfn--jzh-v5dg4&t=280 If you fast-forward to the 10:36 mark, you can see the output of a prompt where the scenes blend into one another. For reference, I'm using the Fizz Nodes custom node to make this work (github.com/FizzleDorf/ComfyUI_FizzNodes). It would probably be best to do another video just on prompt scheduling techniques to get better, more controlled results. Hope this helps!
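The core idea behind prompt traveling is simple: you pin prompts to keyframes, and for every frame in between, the conditioning is blended between the two nearest keyframes. Here's a toy sketch of that interpolation in plain Python/NumPy - my own illustration of the concept, not FizzNodes' actual implementation (the function name and structure are mine; real prompt scheduling interpolates text-encoder embeddings, here stood in for by small arrays):

```python
import numpy as np

def interpolate_prompts(keyframes, total_frames):
    """Blend between keyframed prompt embeddings.

    keyframes: dict mapping frame index -> embedding (np.ndarray).
    Returns a list of one embedding per frame, linearly
    interpolated between the surrounding keyframes.
    """
    key_idx = sorted(keyframes)
    out = []
    for f in range(total_frames):
        # Nearest keyframe at or before f; hold the first key early on.
        prev = max((k for k in key_idx if k <= f), default=key_idx[0])
        # Nearest keyframe at or after f; hold the last key past the end.
        nxt = min((k for k in key_idx if k >= f), default=prev)
        if prev == nxt:
            out.append(keyframes[prev])
        else:
            t = (f - prev) / (nxt - prev)  # 0.0 at prev, 1.0 at nxt
            out.append((1 - t) * keyframes[prev] + t * keyframes[nxt])
    return out
```

So with a "forest" embedding at frame 0 and a "city" embedding at frame 10, frame 5 gets a 50/50 mix - which is exactly why the scenes appear to melt into each other instead of cutting.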