
Control Light in AI Images 

Sebastian Kamph
152K subscribers
104K views

Published: 31 Oct 2024

Comments: 298
@sebastiankamph (a year ago)
The Prompt styles I use here: www.patreon.com/posts/sebs-hilis-79649068
@Mocorn (a year ago)
For anyone coming back to this tutorial like me, a couple of months late to the party: you need to lower the denoise value to between 0.6 and 0.7 to get the desired effects now. ControlNet and A1111 have had updates, so the values have changed slightly. This still works really well though!
@zengrath (a year ago)
Never imagined being able to control light that well in Stable Diffusion. Honestly, I would have only expected that level of light control in 3D applications like Blender, not when manipulating 2D images. It's just nuts what AI is doing.
@christiandarkin (a year ago)
It'd be fairly easy to create some animated light patterns using After Effects or Blender, then save them out as frames. You should be able to use batch processing to automate the process of making the images.
@georgeaiken2965 (a year ago)
@@Warzak77 that subreddit is where this guy gets his video ideas from
@ColoNihilism (a year ago)
@@georgeaiken2965 wabalaba sub sub
@Nawaf511-t1z (a year ago)
This reply by AI agents: Let's get creative and whip up some dazzling light patterns using the powerful tools of After Effects or Blender! With just a few clicks, we can generate animated sequences that will leave your audience mesmerized. And don't worry about tedious or repetitive tasks - we can streamline the process with the Batch tool to save time and effort. So let's get to work and create some stunning visuals that will leave a lasting impression!
@ysy69 (a year ago)
All the tools being introduced with ControlNet are simply mind-blowing. Wow. Thanks so much for these videos.
@sebastiankamph (a year ago)
You bet, superstar!
@prymestudio (a year ago)
Hey man, me again, just want to say thank you. I got into Stable Diffusion because of you, and your process has greatly improved my art. Thank you.
@sebastiankamph (a year ago)
Happy to hear it! Good to have you here buddy 😊💯💫
@SwiteFilms (a year ago)
5:44 You can also animate that moving frame in video software and export it as a PNG sequence so you won't have to move it manually.
@sebastiankamph (a year ago)
Smart!
@juillotine (a year ago)
And you can also automate the generations with ComfyUI so you end up with a sequence of frames in the end.
@ToniCorvera (a year ago)
Mind blown by the amount of control we've got over outputs. Again. And again.
@jibcot8541 (a year ago)
This is amazing! I have struggled the most with lighting even with SDXL models.
@a_vin7644 (8 months ago)
Many thanks. Great idea. I quickly tested it in Playground AI. I can create images in Blender or Photoshop and use them as prompt input.
@aayush_dutt (a year ago)
Mind-blowing! We're just seeing the beginning of what ControlNet can do. A whole lot more insane features will keep pouring in!
@sebastiankamph (a year ago)
Yep, SD just took leaps beyond all others.
@mgkannan749 (a year ago)
Cheers for this awesome post. I finally got an excellent result with this lovely method when I tried it. I am enjoying it now.
@kernsanders3973 (a year ago)
Thank you, I've been trying to nail certain lighting; this will help a lot.
@juspar (a year ago)
Wow, this effect and your tutorial are great! I've watched lots of your stuff and this is one of the best!
@sebastiankamph (a year ago)
Wow, thanks! 😊🌟
@MarkArandjus (a year ago)
The video example shows that this isn't like Photoshop, where you put a bright element on an overlay or linear dodge blending mode on a top layer; it actually generates appropriate shadows.
@michaelsnow7440 (a year ago)
There are a lot of green screen video effects that are normally used in videos, but you can extract the frames and use them here. Things like moving lights, electric crackles, explosions, and so on.
@sebastiankamph (a year ago)
Clever!
@SteveWarner (a year ago)
Smart. Love this. It really shows how a creative implementation of SD + ControlNet can go way beyond just consistent poses. Keep up the great work!
@sebastiankamph (a year ago)
Thank you 😁
@shadowdemonaer (a year ago)
Now I just need a version for anime :') Honestly, this is really cool either way, because you can now use this on any image to change its lighting and get exactly the kind of lighting reference you need for your scene while drawing. I think that is super cool.
@devnull_ (a year ago)
Pretty much like relighting in 3D software, but with SD. Really neat.
@sebastiankamph (a year ago)
I'm sure we'll see more implementations from other apps soon.
@OriBengal (a year ago)
Ooh! Thanks! I was trying to prompt a starburst of light behind a character... This solves that!
@sebastiankamph (a year ago)
Glad I could help!
@m_art_ucci (a year ago)
You can use high contrast to make hard shadows, just like in photography. Or even something like a shadow in the shape you want, in front of or behind something else.
@lxic-bz8hf (a year ago)
Amazing, thank you for the light pack 🙏
@samueleroncaglia (a year ago)
Hi Sebastian, and thanks for all your videos and explanations. This is a very intriguing trick. A similar process I saw in a post-production facility used the normal map channel to relight the scene. It should give a more mathematical approach to calculating lighting than the depth channel (which is also fine for our purposes), and maybe open the way to a more PBR-like workflow. I'm doing some tests, but I don't see great improvements, even switching pre/post processing or using colored lights. To animate lights you can make a simple animation moving a gradient over a black solid in Photoshop (turn the animation panel on) or AfterFX, export a frame sequence, and use it as input for batch processing in Automatic1111. Let me know if you want to do some tests and if you find this advice interesting. Cheers!
@sebastiankamph (a year ago)
Thanks! Ah, very cool. If you decide to test it, please share your results in our Discord. I know Maui would be interested to test more in detail for sure, and me too.
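[Editor's sketch] If you'd rather script the gradient animation described a couple of comments up than build it in Photoshop or AfterFX, a minimal sketch with NumPy and Pillow could look like the following. The resolution, frame count, falloff radius, and output folder are arbitrary placeholder values, not settings from the video.

```python
# Sketch: render a sequence of "moving light" frames - a soft radial gradient
# sweeping left to right over a black background - for use as img2img inputs.
from pathlib import Path

import numpy as np
from PIL import Image

WIDTH, HEIGHT, FRAMES = 512, 768, 24   # placeholder resolution and frame count
RADIUS = 300                            # falloff radius of the light blob, in pixels
OUT_DIR = Path("light_frames")          # placeholder output folder
OUT_DIR.mkdir(exist_ok=True)

ys, xs = np.mgrid[0:HEIGHT, 0:WIDTH]    # per-pixel row/column coordinate grids

for i in range(FRAMES):
    # Move the light source from the left edge to the right edge over the sequence.
    cx = (i / (FRAMES - 1)) * WIDTH
    cy = HEIGHT * 0.35                  # keep the light slightly above center
    dist = np.sqrt((xs - cx) ** 2 + (ys - cy) ** 2)
    intensity = np.clip(1.0 - dist / RADIUS, 0.0, 1.0) ** 1.5   # soft falloff
    frame = (intensity * 255).astype(np.uint8)
    Image.fromarray(frame, mode="L").convert("RGB").save(OUT_DIR / f"light_{i:03d}.png")
```

The resulting PNG sequence can then be fed to the img2img batch tab (or scripted calls) exactly like the hand-made light pack images.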
@michealkinney6205 (a year ago)
I'm binging all your videos now, awesome work and great detailed explanation of the options available. Thanks, you definitely deserve my sub!
@sebastiankamph (a year ago)
Glad you like them! Good to have you on board 😊💫
@wolfai_ (a year ago)
Thank you for the tricks! I usually control light manually with prompts. This is going to be fun.
@sebastiankamph (a year ago)
I bet it will save you some time 😊
@Modioman69 (a year ago)
Amazing content. Once again you nailed it with a very good lighting guide utilizing the amazing ControlNet. I'll be using this for all my photos now, as lighting is a game changer when you can control it.
@ColoNihilism (a year ago)
Awesomeness! 10x for the vid + light pack!
@BabySharkSongs (a year ago)
Nice, but I do not have the pen tool to crop the light... where can I find that? Thanks
@voEovove (a year ago)
What's even more amazing is that the creator of ControlNet had a similar AI that did this years ago. Pretty amazing.
@ovalshrimp (a year ago)
Incredible. Can’t wait to play with this. Thanks Sebastian.
@sebastiankamph (a year ago)
Have fun! 😊💫
@Elwaves2925 (a year ago)
So it's kinda like an HDRI image but layered on top of the AI image instead of surrounding a 3D scene. Nice.
@joeyc666 (a year ago)
Great technique, Sebastian. And great jokes too ♥
@sebastiankamph (a year ago)
Glad you enjoyed it! 💫
@uk3dcom (a year ago)
Very neat. Now if we could just have subjects on layers for easy comping. You know, foreground/background with these types of control. Imagine the posabilities.
@sebastiankamph (a year ago)
Check my vid on Multi-ControlNet. It's aaalmost like that.
@imperium35769 (a year ago)
The joke in the beginning was honestly top tier
@notDreadful (a year ago)
Absolutely amazing
@sebastiankamph (a year ago)
ControlNet really changed the game!
@Dre2Dee2 (a year ago)
MY MIND IS BLOWN. GAME CHANGER
@sebastiankamph (a year ago)
😂 Glad you liked it!
@pokis50 (a year ago)
Hey! Some ideas for videos that are of interest to me, if you think they're worthwhile, maybe you'll cover them: 1. How do you pick your models? I constantly see new models being used; how do you track them, pick them, etc.? I have a lot, but it's always hard to know what the latest cutting-edge one is. 2. The different types of models that can be used. In the beginning it was overwhelming with ControlNet, LoRAs, scripts, etc., each requiring different setups and serving different purposes. A summary or some detailed intro would be really beneficial. Anyway, love your videos, keep them coming! Sending all the
@sebastiankamph (a year ago)
Thank you for your thoughtful comment and your suggestions, as well as your support 😘
@jonathaningram8157 (a year ago)
There are so many emergent techniques coming out for AI generation, it's so exciting. A bit complicated to understand everything, but it's like the future in the making.
@sebastiankamph (a year ago)
Yep! And when applications get more streamlined it will be easier for all 🤩
@Dallimar (a year ago)
Question: I'm trying to change the size of the selection as shown around 2:06, but it keeps cropping the source image instead of just resizing the selection grid. How do I fix that?
@jubb1984 (a year ago)
Another wonderful tutorial, thanks for the light pack as well! Shows well how to make my own now =)
@sebastiankamph (a year ago)
Happy to help! Hope you can create some amazing art now 😊
@jessedart9103 (a year ago)
Excellent tutorial. Thank you.
@sebastiankamph (a year ago)
Glad you enjoyed it!
@KDawg5000 (a year ago)
Here's video #2 (actually my first test). Note I used Blender to make the lights animated. ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-iM5KoIJ5HvE.html
@dwoodwoo65 (a year ago)
Your dad joke always gets a reaction from me. Don't stop, but don't quit your day job 🙂
@RobertWildling (8 months ago)
I love your jokes! Subscribed because of them!
@sebastiankamph (8 months ago)
Thank you, and welcome aboard! 😁
@coda514 (a year ago)
This is awesome. Thanks for sharing your knowledge.
@sebastiankamph (a year ago)
My pleasure! 😊
@adamvlux (a year ago)
Best pun so far in this one
@RayAtSSdR (a year ago)
Great trick! So simple.
@eugeniavorontsova2796 (a year ago)
Stunning technique!! Thank you for sharing!!!!!
@sebastiankamph (a year ago)
Thank you for being a part of this journey, good to have you here 😊
@ASeale74 (a year ago)
Wow, ControlNet is amazing!
@Octamed (a year ago)
I'd use Unreal or Unity, with volumetric lighting on. Animate your lights and render out. That would be very controllable.
@MrImmortal709 (a year ago)
Thank you a lot for these very helpful videos.
@KkommA88 (a year ago)
Freaking awesome, thanks!
@sebastiankamph (a year ago)
You bet!
@dovakinlink (a year ago)
Thank you for your amazing tutorial!
@sebastiankamph (a year ago)
You're very welcome!
@Naundob (a year ago)
Wow! Can't wrap my head around how the character stays that consistent down to the tiniest detail. How does that work? 🤯
@sebastiankamph (a year ago)
ControlNet. It is the absolute best extension to Stable Diffusion right now. And brand new. See my previous videos! 😊
@Naundob (a year ago)
@@sebastiankamph Right, it's mind-blowing. But when I pipe different light images into the process I would expect it wouldn't only alter the lighting of the result but also the shapes… even a tiny little bit. But in your video every hair strand seems to stay rock solid exactly how and where it was before. Simply astonishing.
@j.j.maverick9252 (a year ago)
Definitely check the other videos, but my understanding of how this works is that we're using the image in ControlNet and extracting a depth map from it. That image is always the same, so the depth locks down many of the details. The light image is being used as the img2img source, and that high denoise value lets Stable Diffusion apply strong changes to the light image… using both the depth map and the pixels from the image of the woman. With both the depth and the pixels, and just the additional light cues, SD is pretty much forced into consistency. Apologies if any of that is incorrect, I've been experimenting with all the settings but there's still a lot to learn!
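[Editor's sketch] For anyone who wants to script the workflow described above, here is a minimal sketch against the AUTOMATIC1111 web UI API (started with --api): the light image goes into img2img as the init image with a high denoising strength, and the original render goes into a ControlNet depth unit. The img2img endpoint and its core fields are standard, but the ControlNet unit field names ("input_image", module and model names) vary between extension versions, so treat those parts and all file names as assumptions to check against your own install.

```python
import base64
import io
from pathlib import Path

import requests
from PIL import Image

API = "http://127.0.0.1:7860"  # default local A1111 address; requires launching with --api


def b64(path: str) -> str:
    """Read a file and return its base64-encoded contents as a string."""
    return base64.b64encode(Path(path).read_bytes()).decode()


payload = {
    "prompt": "photo of a woman, studio portrait",   # reuse the prompt of the original image
    "seed": 12345,                                    # fixed seed helps keep the subject consistent
    "denoising_strength": 0.9,                        # high value so SD repaints from the light cues
    "steps": 25,
    "init_images": [b64("light_gradient.png")],       # the light image goes into img2img
    "alwayson_scripts": {
        "controlnet": {                               # ControlNet extension API; field names vary by version
            "args": [{
                "input_image": b64("original_render.png"),  # depth is extracted from the original image
                "module": "depth",                           # depth preprocessor
                "model": "control_sd15_depth",               # placeholder model name; match your install
                "weight": 1.0,
            }]
        }
    },
}

r = requests.post(f"{API}/sdapi/v1/img2img", json=payload, timeout=600)
r.raise_for_status()
result_b64 = r.json()["images"][0]
Image.open(io.BytesIO(base64.b64decode(result_b64))).save("relit.png")
```

Looping this call over a folder of light frames (like the ones from the gradient sketch earlier) gives the animated relighting sequence several commenters describe.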
@DarioToledo (a year ago)
I'm looking for a way to copy lighting from another image, similar to what Beeble does. Any hints on how to accomplish it?
@andrewstraeker4020 (a year ago)
Really interesting and useful tips 😺 Excellently explained. Thank You 👍 I can't wait to go and try 😸
@sebastiankamph (a year ago)
Glad you enjoyed it, you absolute rockstar you 🤩
@Finofinissimo (a year ago)
Amazing routine, dude.
@sebastiankamph (a year ago)
Glad you liked it! 😊
@허진석-t9p (a year ago)
감사합니다 (Thank you!)
@Hornet_Labs (a year ago)
Sebastian, thanks for this! Can I ask, how do you crop that way? For me, when I crop in img2img it crops immediately after I let go. Yours seems to let you crop in and out again.
@ulriksommer (a year ago)
Same for me. What am I missing?
@richarballesteros5442 (a year ago)
That was so creative 👏
@Nutronic (a year ago)
Brilliant vid. Thanks.
@swannschilling474 (a year ago)
That is a great little trick!! 🤩
@BlankFX (a year ago)
Thanks for this video, but I have quite some issues with it. Even though I re-checked against your guide multiple times, the composition of my image changes way too much from the original. I don't think this has much to do with my prompt not perfectly fitting the image I used, as it was created using inpainting in the process, does it?
@gregoryscaux (a year ago)
Hey, have you had any luck with that issue? I've encountered the same problem: my end result is nowhere near as close to the original image as it should be...
@resonantone3284 (a year ago)
Your dad jokes just kill me. Thanks once again for some awesome information!
@sebastiankamph (a year ago)
Hah, glad you're enjoying the vids! 😊
@Tigermania (a year ago)
Very informative, cheers :)
@jaharoni (a year ago)
Wow! This is really crazy. An artist by the name of Jeremy Cowart created a method of doing this in-camera. I need to work this into some edits and lose my mind.
@sebastiankamph (a year ago)
Go for it! It's wild 🤩💫
@FunwithBlender (a year ago)
great video bud
@kylewang6704 (a year ago)
Nice tutorial. Is there a way to change the lighting conditions in a real photo rather than an AI-generated one?
@sebastiankamph (a year ago)
Not that I know of. This is dependent on using a generated image.
@kpwkpw4989 (a year ago)
Amazing! Thank you for your videos. You are great!
@sebastiankamph (a year ago)
Aw, thank you for the kind words 😊
@sazarod (a year ago)
Great trick! Thanks
@sebastiankamph (a year ago)
Happy to help! 😊
@TheMrawesomest (a year ago)
Very interesting use. The rendering was pretty snappy. What graphics card are you using?
@sebastiankamph (a year ago)
RTX 3080. I speed some parts up.
@TheMaxvin (11 months ago)
In A1111 img2img, when I place a picture there is only a cross in the corner, but there should be a pencil icon to edit the focus grid. How do I fix that?
@JohnVanderbeck (a year ago)
The problem I have with this is that it basically redraws a completely new image, rather than just changing the lighting in an existing image. It's very frustrating to get a really solid image where you just don't like the lighting, and I can't find any good way to change that, at least not without falling back to traditional tools.
@ethnobeat-u4n (a year ago)
clever guy, thanx
@ofulgor (a year ago)
Wow.... Just wow.
@sebastiankamph (a year ago)
You said it best 💥
@benjamininkorea7016 (a year ago)
My holy grail is re-lighting real-life faces to match them with changes in lighting and environment. I think maybe I have to do a full-face detailed close-up HED or something. Have you achieved this yet?
@sebastiankamph (a year ago)
I'd love to see it when you get it done!
@fernandomasotto (a year ago)
Thank you very much!!!
@sebastiankamph (a year ago)
You're welcome! 😊
@Dewclaws (a year ago)
@Sebastian Nice tutorial video as usual, thanks for posting so much. Hoping you might be able to share the initial prompt you had for the character you worked on.
@sebastiankamph (a year ago)
Ask me in Discord
@mariusvandenberg4250 (a year ago)
Thank you Sebastian, fantastic video as always. My img2img resize/edit tool does not act the same as yours. When I select an area, that selection then becomes the new image that fills the img2img space. In your case the selected area remains in the same place and you can move the selection or tweak it.
@chove93 (a year ago)
I have the same problem, have you found a solution?
@chrisdixonstudios (a year ago)
Ok, sooo use light to animate depth maps around a subject posed as in photogrammetry. Rinse, lather, repeat, script and bake, voilà: 3D model movie!
@HikingWithCooper (a year ago)
This ControlNet changes AI art.
@jacintduduka9137 (a year ago)
Any idea how this can be implemented using ControlNet in ComfyUI?
@simile20 (a year ago)
It looks cool and promising. But how do we make this work on complex scenes? I tried it with a woman in a black dress and this method just brightened the outfit (or part of it).
@LOVEHONEY28 (a year ago)
Amazing! THX
@sebastiankamph (a year ago)
Most welcome!
@azmodel (a year ago)
This is genius
@xxxyz721 (a year ago)
Can you simply use a pre-existing image and then apply the lighting through ControlNet? Or do you need to use the ControlNet image as the "source" image?
@SunilWilliams (a year ago)
Maybe ControlNet has changed since this process was outlined. If I follow this process, the outcome is not an image with altered lighting, but a depth map of a combination of the original image and the lighting image. That is what we should expect anyway when using ControlNet to create depth maps. I don't see how this process should have worked in the first place, and it doesn't seem to work presently.
@sebastiankamph (a year ago)
The light goes in img2img, not ControlNet, hence it doesn't mess with the depth map. Still works.
@GunwoongMoon (a year ago)
I've tried a couple of times, but the people and backgrounds keep changing a bit, even when I fix the seeds and prompts. Is there something I need to do differently, like install more extensions?
@jeffreysabino6176 (a year ago)
I'm wondering this too
@seankwon3090 (a year ago)
Amazing work, thanks! However, I get a grey monotone image whenever I do this. It does a grey monotone pass at the end of the steps. Is there any way to fix this?
@charliejiang6457 (a year ago)
Is there any detail about Maui's consistent face relighting video? I'm rather interested in how to keep the exact face unchanged 🤣
@raijin5280 (10 months ago)
Hi, I am not seeing the edit button on my img2img. Is there an extension I need to download?
@JackBernard-y8l (a year ago)
Sebastian, can I ask whether you speed up the video when you render, or whether you're using a graphics card in your computer? If so, what card is it? I use Google Colab and it's not as fast as your renders by far... thanks
@sebastiankamph (a year ago)
RTX 3080, but I speed up the renders most of the time.
@gkoriski (a year ago)
Amazing 😶‍🌫
@sebastiankamph (a year ago)
Thanks 🔥
@kuromiLayfe (a year ago)
This is amazing... too bad ControlNet OOMs on my system for some reason, even with purged models and lowvram enabled... so I'm gonna have to wait till I upgrade my GPU to test this.
@filmmaster001 (a year ago)
Nice one @Seb! Any thoughts on updating this due to the new ControlNet and A1111 release?
@sebastiankamph (a year ago)
Thank you! Honestly, I haven't been using it for quite some time. If I get time I might revisit and see if the changes warrant an update. I got a few other ones first on the list.
@filmmaster001 (a year ago)
@@sebastiankamph The reason I ask is that the workflow above doesn't work with 1.1, and I am desperate to get the workflow running. Happy to pay a consulting fee.
@filmmaster001 (a year ago)
@@sebastiankamph Sent you an email :)
@Ekkivok (a year ago)
Not working, it destroys the original image even though I followed every step, and it creates a different one...
@herval (a year ago)
Surreal! Does this also work well with actual photos, for instance?
@sebastiankamph (a year ago)
I don't think so. It relies on the generation. You could save the composition from a photo though.
@KDawg5000 (a year ago)
@@sebastiankamph I got it to work with regular portrait photos of people I know, but I used a second ControlNet (HED). You lose some of the resemblance of the person, but if you overlay the original image at, say, 25% opacity, then it looks good again.
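[Editor's sketch] The 25% overlay mentioned above is easy to reproduce outside an image editor. Here is a minimal sketch with Pillow; the file names are placeholders, and the original photo is assumed to match (or be resized to) the relit output's dimensions.

```python
# Sketch: blend the original photo back over the relit result at 25% opacity
# to recover some of the resemblance lost during relighting.
from PIL import Image

relit = Image.open("relit_result.png").convert("RGB")
original = Image.open("original_photo.png").convert("RGB").resize(relit.size)

# Image.blend(a, b, alpha) returns a*(1-alpha) + b*alpha, so alpha=0.25 keeps
# 75% of the relit image and mixes in 25% of the original photo.
combined = Image.blend(relit, original, alpha=0.25)
combined.save("relit_with_overlay.png")
```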
@madshader (a year ago)
Do you have a tutorial on how you made your light pack images?
@gszeman (a year ago)
I love your videos! So informative and detailed. Thank you! PROBLEM: When I try to generate after selecting the lighting PNG, it starts generating, I can see my image and the lighting effects start to form, and then it just stops without actually generating an image? 🤔
@pastuh (a year ago)
I think a simple JavaScript script could help prepare the shadow and automate rendering.