Alex Villabon
Hey there, fellow VFX enthusiasts!

I'm Alex, a compositing supervisor who's been in the trenches of film and commercial projects for well over a decade now.
This channel is all about sharing the tricks I've picked up along the way, plus some ideas and experiments I've been tinkering with. Hopefully what I share here helps you become a better artist.

Feel free to comment on any of my videos, I'm always eager to learn new ways of achieving the same effect. Thanks for watching!
Comments
@wix001HD 2 days ago
Nice one! What about animated skies? You could use a panoramic render to obtain a sequence of HDRIs (also not limited to 2K). Any good workflow for upscaling a sequence while preserving a consistent image?
@alexvillabon 2 days ago
@@wix001HD In theory you should be able to render an animated latlong, I don't see why not. The render itself would be temporally stable, but not the Comfy output. Depending on the scene you could cheat some interpolation to make it more or less work, but it won't be perfect.
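One hypothetical way to "cheat some interpolation" on an upscaled latlong sequence (a sketch, not a workflow from the video) is an exponential moving average across frames; it trades flicker for some ghosting on fast-moving clouds:

```python
import numpy as np

def temporally_smooth(frames, alpha=0.6):
    """Exponentially blend each upscaled frame with a running average.

    frames: iterable of float HDR images as (H, W, 3) arrays, e.g. from EXRs.
    alpha:  weight of the current frame; lower = smoother but more ghosting.
    """
    smoothed = []
    running = None
    for frame in frames:
        if running is None:
            running = frame.astype(np.float64)
        else:
            # Blend the new frame into the history to damp frame-to-frame flicker.
            running = alpha * frame + (1.0 - alpha) * running
        smoothed.append(running.copy())
    return smoothed
```

This only damps temporal chatter globally; anything moving fast in the sky will smear, which is why it can only "more or less work".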
@terriwynn4256 2 days ago
Excellent content! I've been exploring ComfyUI lately and can really see its potential, especially with the detail enhancement capabilities. I'm envisioning its use in CG animation - rendering basic scenes, adding details with ComfyUI, then using CopyCat to finalize the shot. While the setup is complex and there's a lot to learn, the possibilities are exciting. I hope you continue making videos on this topic.
@alexvillabon 2 days ago
@@terriwynn4256 absolutely! The things you can do constantly blow my mind. There is so much to learn though. I’ll share my findings as I go :)
@alexvillabon 2 days ago
I was having a conversation with someone on Reddit about this approach and they shared something that I think is a very good idea, so I figured I'd share it here. If you wanted to use this for some sort of lighting or light contribution, instead of match grading, which only takes you so far, you can use the original HDR from Unreal to light and the output from ComfyUI as your specular, to get those higher-detail reflections.
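That split can be sketched roughly like this (a hypothetical NumPy illustration, not the actual comp script; the term names are assumptions): diffuse lighting comes from the original low-res Unreal HDR, while reflections sample the ComfyUI upscale only.

```python
import numpy as np

def relight(albedo, irradiance, reflection, spec_mask, spec_gain=1.0):
    """Combine lighting terms for a relit beauty; all inputs are float (H, W, 3).

    albedo:     surface base colour
    irradiance: diffuse lighting sampled from the ORIGINAL low-res HDR
    reflection: reflections sampled from the ComfyUI-upscaled HDR
    spec_mask:  0-1 mask of specular surfaces
    """
    diffuse = albedo * irradiance          # broad lighting: low-res HDR is fine here
    specular = spec_mask * spec_gain * reflection  # sharp detail: use the upscale
    return diffuse + specular
```

The idea is that diffuse lighting is low-frequency anyway, so the original HDR is "correct enough" there, and the upscale only contributes where its extra detail is actually visible.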
@balajiudhaya 2 days ago
👍👍👍
@behrampatel3563 5 days ago
Love these videos. Subscribed. In Comfy, use the shortcut Ctrl+Enter to start processing. In UE you can bring in your image plane and attach it to the camera using the Image Plate plugin. Cheers, b
@alexvillabon 4 days ago
Hey, thanks! I knew about the ComfyUI shortcut but don't use it, so people know when I start processing a new image. I didn't know about the Image Plate plugin, thanks for the tip!
@behrampatel3563 5 days ago
Neat video. Lol, pardon the pun. I just saw your other video on normal maps as well. I'm curious, what is CopyCat doing differently that ComfyUI cannot? Cheers, b
@alexvillabon 5 days ago
Ha! Comfy doesn't produce temporally stable results. That's where CopyCat does its magic.
@AhmedKhaledFX 8 days ago
Really happy to see this! Cattery tools and CopyCat are really underrated, and the video title says it all :) Thanks Alex
@samueljrgensen417 9 days ago
fantastic tutorial as always mate :)
@Osvaldsson 9 days ago
You’re learning from a comp legend people! ❤
@ocdvfx 9 days ago
I love ur channel, keep up the great work
@ocdvfx 7 days ago
I must have installed something wrong, I am getting faces in my clouds and the edges are also turning into tree roots lolol. Or perhaps I just need to adjust my parameters?
@AhmedKhaledFX 9 days ago
Sweet! 👏 thanks for sharing this
@bellmannmedia 10 days ago
Thanks for that comparison. I was wondering how Nuke would solve the problem nowadays. I last used it ten years ago. 👍
@KrunoslavStifter 13 days ago
Cool approach you demonstrated. Thanks, it gives me ideas. In Fusion I tried their Relight AI normal map generator, which is very low resolution but stable over time, and Fusion's Create Normal Map from luminance, which gives details and is also stable. Then there is a macro to combine the two normals, and then the Shader node or Relight node in different settings for relighting. It's relatively fast, faster than this method, but not as good. Someone used SwitchLight AI from Beeble as a script in Fusion for relighting; again I was not too impressed, because often the problem is a lack of materials for skin etc., so you end up with either a plastic-looking skin effect or not much in the way of highlights where they need to be. I've also tried creating something that resembles a depth mask with frequency separation methods, then displacing that on a 3D plane and relighting it in 3D. Works well on some objects, but people are not that great. In the end, as you commented, I've found I get decent results by simply using various light wrap combinations. It's fastest and produces, relatively speaking, the best relighting results, rather than all the hoopla with AI. But what I am missing in Fusion is something like the CopyCat node, to train it over time to do a custom relight job. This is about where I am at the moment with my methods in Fusion: Flying Robot Relight Test In Resolve Fusion ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-iiE2f5rSy1I.html With the CopyCat node that Nuke has, I think it could be a lot better. Cheers!
@RafaelCasagrandeav 14 days ago
this is great, thank you for sharing
@reed4109 15 days ago
Thanks for taking the time to make these awesome videos
@ocdvfx 19 days ago
thank you kind sir
@madlookzvfx 22 days ago
Keep doing these tutorials. Really useful! Thanks!
@juancamilo908 22 days ago
Great video! Can we use the Cattery nodes as a pre-trained model and keep teaching it to get better results?
@alexvillabon 22 days ago
In theory, yes. However, I've yet to meet anyone who made it work. I read somewhere that a big Cattery update is coming soon with some new models for us to use; I hope that update touches on other custom pre-trained models. Keep in mind that CopyCat has some pre-trained models, which I'm going to assume are Cattery models. You can select from the ones available inside the node and keep training from there, so you could have your own "improved" one as an inference node.
@ss_websurfer 22 days ago
Hey, I am not able to see the "Manager" option in my ComfyUI. Any idea why?
@alexvillabon 22 days ago
You can get it from here: github.com/ltdrdata/ComfyUI-Manager
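Per that repo's README, the manual install boils down to cloning it into ComfyUI's `custom_nodes` folder. A sketch (the `COMFY_DIR` path is an assumption; adjust it to wherever your ComfyUI install actually lives):

```shell
# COMFY_DIR is a guess at your install location -- change it as needed.
COMFY_DIR="${COMFY_DIR:-$HOME/ComfyUI}"
mkdir -p "$COMFY_DIR/custom_nodes"
cd "$COMFY_DIR/custom_nodes"
# Requires git on your PATH; restart ComfyUI afterwards and the
# Manager button should appear in the menu.
git clone https://github.com/ltdrdata/ComfyUI-Manager || echo "clone skipped (already present or offline?)"
```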
@ss_websurfer 22 days ago
@@alexvillabon Hey, I got that now, but it's giving me an error that I'm missing a git extension.
@alexvillabon 22 days ago
I'd refer you to the ComfyUI subreddit - there are some really helpful people there. I'm sure if you search there you will find the answer. Alternatively, watch one of the countless ComfyUI installation videos on YouTube.
@Danielsam55 22 days ago
Very useful videos, congrats on the channel
@moenninger 23 days ago
Hi Alex, thanks for sharing. I use a different workaround, maybe you will find it interesting. If you use a Constant with a Transform as the BG input, the bbox from scaling the Constant via the Transform node survives the RayRender node and provides the overscan. Best, Tobi
@alexvillabon 23 days ago
That's a really neat trick! Thanks for sharing. This is what I love about comping/compers: there are so many ways of achieving the same result.
@gadass 23 days ago
New best YouTube channel. Thanks for doing the videos. It's really pleasing to watch :)
@madlookzvfx 28 days ago
Brilliant!
@simpernchong 29 days ago
Genius
@jtsanborn1324 29 days ago
The only downside with SUPIR is that it is really slow to produce the final output and takes a lot of VRAM, so if you don't have at least a 24 GB GPU you won't be able to use it. That makes it really unusable for video/image sequences. Other than that, what it gives you is pretty awesome.
@harrybardak516 1 month ago
Did you manage to retrieve the details of the normal map? I never could, even after a lot more epochs (I pushed it to 1 million).
@WindfallWealth 1 month ago
This is very interesting! I feel like this could also be used to fix flicker much better than DaVinci Resolve's deflicker. Say you have source footage but there is some jitter in your stylized footage between frames: could you train it in a similar fashion?
@alexvillabon 1 month ago
Potentially. I'm looking into relighting moving shots with Comfy (without normals) at the moment. But as always, temporal consistency is the killer. There are other ways of mitigating the chatter/noise along with CopyCat. If I get a decent result, I'll share my findings.
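One simple way to mitigate part of that chatter before (or alongside) a CopyCat pass, sketched here as a hypothetical NumPy helper, is to normalise each stylised frame's global exposure back to its source frame's:

```python
import numpy as np

def match_exposure(stylized, source, eps=1e-6):
    """Scale a stylised frame so its mean luminance matches the source frame.

    Removes global exposure flicker between frames; it does NOT fix
    local/spatial chatter, which is where a trained CopyCat node comes in.
    Both inputs are float (H, W, 3) arrays.
    """
    lum_w = np.array([0.2126, 0.7152, 0.0722])  # Rec.709 luma weights
    stylized_lum = float((stylized * lum_w).sum(axis=-1).mean())
    source_lum = float((source * lum_w).sum(axis=-1).mean())
    gain = source_lum / max(stylized_lum, eps)
    return stylized * gain
```

Run per frame over the sequence; since the gain is driven by the (stable) source footage, frame-to-frame brightness pumping in the stylised output gets cancelled.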
@johnloughlin6804 1 month ago
Going to follow closer! What Nuke add-on are you using to get those extra 'hotkeys'? (frame 20:24) Thanks
@alexvillabon 1 month ago
That's W_hotbox. Been using it for years, can't live without it. You can download it here: www.nukepedia.com/python/ui/w_hotbox
@BrandspankingFilm 1 month ago
Awesome stuff ❤
@Naruto_uzumaki7979 1 month ago
Keep posting this kind of advanced, informative video!