This video demonstrates how to apply SDXL-Lightning to the "Make Tile SEGS" upscale and how to improve the upscale context using IPAdapter.
You can download ComfyUI from here: github.com/comfyanonymous/ComfyUI
The ComfyUI Impact Pack extension from here: github.com/ltdrdata/ComfyUI-Impact-Pack
And the ComfyUI Inspire Pack extension from here: github.com/ltdrdata/ComfyUI-Inspire-Pack
Workflow (you can drag & drop): github.com/ltdrdata/ComfyUI-extension-tutorials/blob/Main/ComfyUI-Impact-Pack/workflow/MakeTileSEGS_upscale_sdxl-lightning_ipadapter.png
@@chillsoft No, MeshGraphormer works: it outputs a depth map. You're talking about the ControlNet depth model, which is only for SD1.5 (for now). But the output of MeshGraphormer is model-agnostic, so we can use it with a regular ControlNet (depth model).
I'm waiting for "Tile for Video". When I apply a video to the person detector and create tile SEGS, the segments don't move along with the person. Does anyone have a solution for making the segments track the person?
Good afternoon. Excuse me, may I ask you something? Next time, could you please put what you've written in the workflow into the video description? Not everyone's English is good, and it would be great to be able to understand what's going on and how. Unfortunately, the links under the video return 404. Thank you for your understanding.
It is important to know at which stage and in what manner the stop occurred, and what is displayed in the terminal. If what you're describing is ComfyUI itself stopping along with a pause, then you ran out of RAM.
@@drltdata When connecting to the "ControlNetApplySEGS" node, "segs_preprocessor" returns this error:

'ControlNetAdvancedWrapper' object has no attribute 'doit_ipadapter'
File "C:\ComfyUI_windows_portable\ComfyUI\execution.py", line 152, in recursive_execute
  output_data, output_ui = get_output_data(obj, input_data_all)
File "C:\ComfyUI_windows_portable\ComfyUI\execution.py", line 82, in get_output_data
  return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
File "C:\ComfyUI_windows_portable\ComfyUI\execution.py", line 75, in map_node_over_list
  results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
File "C:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\impact_pack.py", line 1440, in doit
  DetailerForEach.do_detail(image, segs, model, clip, vae, guide_size, guide_size_for, max_size, seed, steps, cfg,
File "C:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\impact_pack.py", line 249, in do_detail
  enhanced_image, cnet_pils = core.enhance_detail(cropped_image, model, clip, vae, guide_size, guide_size_for_bbox, max_size,
File "C:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\core.py", line 228, in enhance_detail
  model, cnet_pils2 = control_net_wrapper.doit_ipadapter(model)

And the ControlLLLite models produce an error in any case, even without using "segs_preprocessor":

"Type 'ControlLLLiteAdvanced' must be used with Apply Advanced ControlNet 🛂🅐🅒🅝 node (with model_optional passed in); otherwise, it will not work."

But "ControlNetApplySEGS" has no "model" input.