
Make AI Ads in Flair.AI and also A1111 - Consistent Objects 

Olivio Sarikas
233K subscribers
42K views

Published: 1 Oct 2024

Comments: 66
@OlivioSarikas 1 year ago
#### Links from the Video ####
Join my Map Bashing Live Stream: ru-vid.commq8Z0CNJjP0
Join my Gaming Channel: www.youtube.com/@multi4g
Tweet: twitter.com/mickeyxfriedman/status/1666518047203401736
app.flair.ai/
Bottle Image: www.pexels.com/photo/close-up-shot-of-a-bottle-of-whiskey-11271794/
civitai.com/models/4201/realistic-vision-v20
@Archalternative 1 year ago
When are you going to talk about deepfake roops? BRAVO
@追光者-j4o 1 year ago
I would like to ask how to make the product integrate better into the environment, such as the light and shadow of the product, caustics and so on. This has troubled me for a long time.
@jhPampoo 10 months ago
Can you make a video on product photography where the talent interacts with the product, such as a bike or a bottle... It may be a challenge but worth it xD
@SeanieinLombok 10 months ago
this is exactly what i am looking for.
@lithium534 1 year ago
This is really good. I was trying different ways to create ad pictures, but your workflow is spot on. I use inpainting with the whole picture so that the AI uses the image as a reference for lighting, and I use an inpaint model.
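For anyone who wants to try that whole-picture inpainting idea outside the A1111 UI, here is a minimal sketch using the Hugging Face diffusers library; this is an assumption rather than the exact workflow from the video, and the checkpoint name and file paths are placeholders.

```python
# Minimal sketch, assuming the diffusers library: regenerate the background
# around the product while the whole image is passed as context, so the new
# background tends to pick up the product's lighting.
import torch
from PIL import Image
from diffusers import StableDiffusionInpaintPipeline

pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "runwayml/stable-diffusion-inpainting",   # placeholder SD 1.5 inpaint checkpoint
    torch_dtype=torch.float16,
).to("cuda")

image = Image.open("bottle_on_white.png").convert("RGB").resize((512, 512))
mask = Image.open("background_mask.png").convert("L").resize((512, 512))   # white = repaint

result = pipe(
    prompt="whisky bottle on a rustic wooden table, warm evening light, product photo",
    image=image,          # the full picture is visible to the model as reference
    mask_image=mask,      # only the white (background) area is regenerated
    num_inference_steps=30,
).images[0]
result.save("bottle_new_background.png")
```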
@OlivioSarikas 1 year ago
Glad it was helpful!
@zephilde 1 year ago
Hi Olivio! Do you have a technique (or have you searched for one) to also change the base object's reflections and speculars to fit the background? Maybe a second pass with img2img or inpaint, though... Imagine the product is a water bottle, for example: it would have some of the background showing through the transparency, ideally deformed, to be perfectly realistic... A bit like the way, in your examples, the bottle's reflection on the floor integrates the object nicely, but this time inside the masked area... The challenge here is of course not to change the product much (low denoising etc.).
@gohan2091 5 months ago
After you've generated a background that works, how do you then get the masked object to take on the colours/lighting of the background? Would I have to put it into img2img and set a denoise of 0.4+? Maybe restrict the object from changing too much with a ControlNet?
@tsmakrakis32 10 months ago
I am having some trouble rendering the final image and matching the perspective of my object with the environment around it. This is the part that you are not showing (unfortunately), and I'd love to see how you manage to end up with such a nice pairing of the object and the background at the end. Thank you!!!! 🤠
@CerebricTech 16 days ago
Can this run on an i7 with 32 GB RAM and 4 GB VRAM?
@steffen_r 1 year ago
Great idea. I just tried it and combined it with your latest video on creating SoftEdge maps for composition purposes, and placed a bottle on a table (composition-wise it's not optimal, since it is placed in the middle, but it works):
- created the bottle image / mask image as described
- created a table via text2image
- created a SoftEdge map from the table
- set up a Photoshop file (1024x768) for the map / positioning
- put the table in there, plus the bottle (just to position it right on the table)
- unfortunately the table was too small, so I did some Generative Fill magic to enlarge it and fill the whole space (works even with the black/white SoftEdge map)
- exported the map file and used it as a second ControlNet with SoftEdge for positioning
A few more steps, but with just two images and some AI magic you get really cool results.
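Roughly the same two-ControlNet stack can also be scripted with diffusers; a sketch under the assumption of a Canny unit for the bottle outline plus a SoftEdge unit for the composition map (model IDs, checkpoint, and file names are placeholders, not what the video or the comment used).

```python
# Sketch, assuming diffusers: two ControlNets in one generation, e.g. Canny for
# the bottle outline and SoftEdge for the table/composition map.
import torch
from PIL import Image
from diffusers import ControlNetModel, StableDiffusionControlNetPipeline

canny_cn = ControlNetModel.from_pretrained(
    "lllyasviel/sd-controlnet-canny", torch_dtype=torch.float16)
softedge_cn = ControlNetModel.from_pretrained(
    "lllyasviel/control_v11p_sd15_softedge", torch_dtype=torch.float16)

pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "SG161222/Realistic_Vision_V2.0",            # placeholder photorealistic checkpoint
    controlnet=[canny_cn, softedge_cn],
    torch_dtype=torch.float16,
).to("cuda")

canny_map = Image.open("bottle_canny.png")       # outline of the product
softedge_map = Image.open("table_softedge.png")  # composition / positioning map

image = pipe(
    prompt="whisky bottle standing on a wooden table, studio product photo",
    image=[canny_map, softedge_map],
    controlnet_conditioning_scale=[1.0, 0.6],    # product outline weighted higher
    num_inference_steps=30,
).images[0]
image.save("composed.png")
```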
@jamit63 20 days ago
Can you do the same in ComfyUI, please?
@chuckynorris616 9 months ago
How do you get rid of the white outline around the products?
@demischi3018 1 year ago
Great tutorial! But how do I influence the lighting? For example, if the lighting of my product does not match the lighting of the background, I need to adjust the product's lighting without destroying the picture.
@alecvallintine5435 1 year ago
I'm also looking for a way to handle this. I've watched several videos where they replace the environment by inpainting, but don't address the fact that the lighting is now inconsistent between the subject and environment. In the example with the bottle on the beach, I was pleased to see that it rendered the bottle's shadow on the sand, but the lighting on the bottle itself never changes to reflect the new environment. I've heard people suggest running the final composite through img2img with a low denoise value to "blend" the image together, but I haven't experimented with that yet so I don't know if that can remedy serious lighting inconsistencies.
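A sketch of that low-denoise "blend pass" idea (not something demonstrated in the video), again assuming the diffusers library; the checkpoint and file names are placeholders. A strength around 0.2 to 0.4 usually nudges lighting and colour without redrawing the label; if the label still drifts, a Canny ControlNet over the product outline can hold its shape during this pass.

```python
# Sketch, assuming diffusers: a low-denoise img2img pass over the finished
# composite to harmonise lighting and colour without redrawing the product.
import torch
from PIL import Image
from diffusers import StableDiffusionImg2ImgPipeline

pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "SG161222/Realistic_Vision_V2.0", torch_dtype=torch.float16   # placeholder checkpoint
).to("cuda")

composite = Image.open("bottle_on_beach_composite.png").convert("RGB")

blended = pipe(
    prompt="whisky bottle on a sunny beach, golden hour light, product photo",
    image=composite,
    strength=0.3,              # low denoise: keep the layout, nudge the lighting
    num_inference_steps=30,
).images[0]
blended.save("bottle_on_beach_blended.png")
```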
@vrynstudios 3 months ago
wow. Thanks for this wonderful video.
@RealityCheckVR 1 year ago
Let's play some games Olivio! 🎮🕹🙌
@jason-sk9oi 1 year ago
Tremendous workflow!! Thank you for sharing 🎉
@protasbox 1 year ago
I used to think about how to do the same. Great guide! Thanks
@TheMellado 1 year ago
It kicks me off the page ☹
@Gh0sty.14 1 year ago
Been trying this out, but it will only give me a result of the image I uploaded to inpaint. It completely ignores my prompt. I even tried turning off the ControlNet and it still just gives me the object. All my settings are the same as yours. I don't get it.
@brockpenner1 1 year ago
If Reddit goes the way of Digg later this month, where's everyone going for their SD stuff?
@NickDrake-f1t 1 year ago
Do you have a ComfyUI workflow to do something similar?
@ИльяДедов-и4ъ 1 year ago
Dear Olivio, could you tell me where and how you got the noise level settings in your Stable Diffusion? I installed it according to your guide, but it does not have such settings. Maybe it's some kind of extension? I hope answering won't take too much of your time.
@jeffreyhao1343 1 year ago
I know how to implement these, including the light and reflected light.
@gerritfellisch4309 1 year ago
Thank you for that video! I was really looking for something like that.
@diskomiks 1 year ago
Ever since the inpainting model has been out, I've been wondering when it will be possible to just click on an object and create a mask of it inside A1111, instead of having to use a brush or go to a third-party app like you did. Is that really that hard to implement? Could the ControlNet Segment model also be used for that?
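Click-to-mask segmentation does exist as a separate tool; here is a sketch with Meta's Segment Anything, which is an assumption on my part (the video does not use it), with a placeholder checkpoint path and click coordinates.

```python
# Sketch, assuming the segment-anything package: one positive click on the
# product produces a mask. Checkpoint path and coordinates are placeholders.
import numpy as np
from PIL import Image
from segment_anything import SamPredictor, sam_model_registry

sam = sam_model_registry["vit_b"](checkpoint="sam_vit_b_01ec64.pth")
predictor = SamPredictor(sam)

image = np.array(Image.open("bottle_photo.png").convert("RGB"))
predictor.set_image(image)

masks, scores, _ = predictor.predict(
    point_coords=np.array([[256, 300]]),   # placeholder click roughly on the product
    point_labels=np.array([1]),            # 1 = foreground click
    multimask_output=True,
)
best = masks[scores.argmax()]              # boolean HxW mask of the clicked object
Image.fromarray((best * 255).astype(np.uint8)).save("object_mask.png")
```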
@brandhuetv 1 year ago
Great tutorials as always. Do you have a tutorial on how to install Automatic1111? Thanks
@Romanetics 1 year ago
Olivio thank you very much for your content. You are my Stable Diffusion guru. The technique in another video to use img2img to upscale and add more details to the same picture is gold.
@diskomiks 1 year ago
Your workflow gives better results than Flair AI, since their backgrounds are almost completely unrelated to the subject in terms of lighting. I tried Canny for labels with typography and I got terrible, glitchy letters. What were your settings on the whisky bottle? This looks like a holy-grail scenario for AI product photography, as most labels contain at least some writing (unlike the Starbucks example).
@OlivioSarikas 1 year ago
The whiskey bottle isn't rendered by the AI at all. That's why I use the mask, so it renders everything but the bottle.
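One way to make that guarantee explicit in a script (a sketch, not a step shown in the video): after the background is generated, composite the original product pixels back in through the mask, so the bottle and its label can never be altered. File names are placeholders, and all three images are assumed to be the same size.

```python
# Sketch: composite the untouched product back over the generated background
# through the mask, so the bottle and its label are pixel-for-pixel original.
from PIL import Image

original = Image.open("bottle_original.png").convert("RGB")
generated = Image.open("bottle_new_background.png").convert("RGB")
product_mask = Image.open("product_mask.png").convert("L")   # white = product

final = Image.composite(original, generated, product_mask)   # take 'original' where the mask is white
final.save("final_ad.png")
```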
@joeyc666 1 year ago
Inpainting + control net for great results! Nice work Olivio :) All the best with your gaming channel :D
@fahabibfahabib2312 1 year ago
I got an error saying no image match. Please help me.
@김기선-j5t 1 year ago
Why do you use Canny instead of Reference Only?
@OlivioSarikas 1 year ago
good point. I need to cover that too
@omnius_eacc 1 year ago
Awesome vid Olivio, thanks!
@OlivioSarikas 1 year ago
you are welcome :)
@jragon355 1 year ago
I love your videos bro. Thanks for being awesome 🤝
@f4ust85 1 year ago
Classic Stable Diffusion - so much work and in the end you get a goofy lowres image you could have created in 2 minutes in Photoshop.
@Hexrun 1 year ago
very informative, thanks
@therookiesplaybook 1 year ago
Any tips on getting the item to match the scene better? Grabbing a random item off the web and bringing it into Photoshop with the mask, it looks stuck onto the scene rather than being part of it.
@mahamadkader488 1 year ago
I have the same problem. Maybe it's the different model that I am using, but his products look like they are part of the picture; mine look like I just cropped them into the picture hahah
@요즘생활취미생활 1 year ago
Thank you :)
@thanksfernuthin 1 year ago
Great stuff. I tried just giving backgrounds to characters I created and liked, and it would morph bits of them outside of themselves. Now I know how to stop that. Any time I try to use ADetailer and ControlNets it errors out. I think it's memory, but even if I drop the requested image size, no love. Anyone else have this problem?
@mikelaing8001 1 year ago
did you get a solution to this?
@thanksfernuthin 1 year ago
@@mikelaing8001 I did not. But I haven't played around with it either. Let me know if you figure it out. (Although finding an old comment is basically impossible. 🙂)
@mikelaing8001 1 year ago
@@thanksfernuthin will do!
@mikelaing8001 1 year ago
@@thanksfernuthin just needed a new checkpoint. I hadn't downloaded a photorealistic one and was just using what was preloaded. It's fine for me now.
@thanksfernuthin 1 year ago
@@mikelaing8001 Which checkpoint are you using? Oh, and how much VRAM do you have? I have 12GB.
@sabotage3d 1 year ago
Latest cool stuff as always!
@OlivioSarikas 1 year ago
thank you
@nic-ori 1 year ago
Thanks.
@OlivioSarikas 1 year ago
You're welcome
@mr.entezaee 1 year ago
CC off ?
@OlivioSarikas 1 year ago
seems to work now :)
@MarkDemarest 1 year ago
🎉
@OlivioSarikas 1 year ago
🔥
@MADVILLAIN669 1 year ago
This is great but can you explain how to make 'masked' versions of images quickly?
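One quick option (an assumption, not what the video uses) is the rembg package, which produces a cut-out and therefore a mask in one step; the file names below are placeholders.

```python
# Sketch, assuming the rembg package: remove the background, then turn the
# resulting alpha channel into a hard black/white mask.
from PIL import Image
from rembg import remove

photo = Image.open("product_photo.jpg")

cutout = remove(photo)                     # RGBA image with background removed
alpha = cutout.split()[-1]                 # alpha channel as a greyscale image
mask = alpha.point(lambda a: 255 if a > 128 else 0).convert("L")

cutout.save("product_cutout.png")
mask.save("product_mask.png")
```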
@김기선-j5t 1 year ago
How about just using Generative Fill in Photoshop? It seems that would save more time.
@김기선-j5t 1 year ago
Please, I really want to hear your opinion. I don't have any intention to offend you. I admire this video, I just want to know whether we can find a better way.
@templeofleila 1 year ago
My Playground AI account was deactivated today... I'm sad... I feel as if I pushed Stable Diffusion 1.5 further than it was expected to go (LoRA-free): completely photorealistic, impossible, apocalyptic ballerina zombie bikini photo shoots. I still haven't seen anything as graphic or realistic as I was able to generate. It was teetering on obscene, but I made it a mission to have an insanely graphic image produced from a very PG, family-friendly prompt and seed. I do know the difference between my prompts and the majority of others'. I want to share it, but the AI art higher-ups obviously don't want the masses to know the hack a couple of artists and I have been playing with for the last week and a half. I think I was the first of us to be kicked off. I hope they'll let me back on just to document my prompt evolution... Does anyone know if they reactivate your account? I'm on Mac and have to figure out how to run A1111 and do this locally without a GPU... Any suggestions?
@alecubudulecu 1 year ago
No idea about them. Why would they deactivate? You can’t run it locally?
@templeofleila 1 year ago
@@alecubudulecu The email from Playground AI said my work "teetered on obscenity". I began running it locally. I have a Mac Studio M1 Max and am more than pleased with its performance with A1111. Since then, Playground AI has reinstated my secondary account as long as I keep it family-friendly. I don't have access to the hundred or so awesome apocalyptic zombie ballerina bikini photos... but hey, I'm still in the game.
@blahblahdrugs 1 year ago
Any way to alter the lighting on the bottle?
@srij0n316 1 year ago
Is LyCORIS the same as LoRA?