Brilliant work Torin, I will share the news on your amazing integration and tools and grab them from your Patreon. It’s especially impressive that you made it so straightforward with the use of the API service and created an excellent way to produce a continuous animation between the generated frames. Using these as textures composited into the base color of a PBR material on 3D objects also generated by AI would be an interesting way for this to evolve. What an incredible holiday gift for the community! ❤
Thanks Rob! I'm glad you've been enjoying the tutorials! Yeah, I think it'd be really interesting to use these to generate HDRI maps, or as a texture map for a 3D model. I should make an example of applying the image output to a 3D model. Looks like Spline added that into their web editor ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-ma91lA51UJ8.html
Hey Torin, thanks so much for the tutorial and project file! I'm having a few issues, though, and was wondering if you could answer a question. When I open the project, the noise objects used for altering the img2img function don't have anything in them, and it seems like img2img is required for everything to work. How exactly do you get the noise populated with an image or noise data and running correctly? Thank you!
Torin, I was wondering if you could use this with computerenderer to generate images in HD or 4K resolution with the Midjourney, DALL·E 2, or Stable Diffusion models, and whether the environment could show a count of how many images have been generated, to keep track of the run cost as it accrues?
Hey, Torin! This is absolutely stunning... could this potentially be used in a live setting? For example could I get audio in from Ableton Live and then project the reactive visuals in real-time?
This is fantastic. Could the image generation be done live as well, so that prompts could be entered during a performance rather than pre-recorded? @blankensmithing
Amazing! Thank you for sharing this. I wonder if it would be possible to make the particles interactive through webcam or Kinect movement... going to try :)
Thank you so much for this amazing video and also for the link to the file. My API component is not working; could you please tell me if it's due to a version difference in TouchDesigner, and if so, which version you used for this?
Great tutorials, keep up the good work! Is there a way to generate images through this method but with live audio coming from an external device, like a turntable?
Thanks Anderr! Yeah totally, you can use an Audio Device In CHOP instead of an Audio File In CHOP. With that operator you can select your computer's built-in microphone, or if you're able to connect your turntable to your computer through an audio interface, you can select your audio interface.
You can just pass a Movie File In TOP into the component. It's not going to convert the frames in real time, since generation takes some time to process, but every time you generate a new image it'll snag the current frame from that TOP.