Commit Graph

1773 Commits

Author SHA1 Message Date
Jairo Correa 006b24cc32 Prevent image cache 2023-11-11 15:56:14 -03:00
comfyanonymous 248aa3e563 Fix bug. 2023-11-11 12:20:16 -05:00
comfyanonymous 4a8a839b40 Add option to use in place weight updating in ModelPatcher. 2023-11-11 01:11:12 -05:00
comfyanonymous 412d3ff57d Refactor. 2023-11-11 01:11:06 -05:00
comfyanonymous ca2812bae0 Fix RescaleCFG for batch size > 1. 2023-11-10 22:05:25 -05:00
comfyanonymous 58d5d71a93 Working RescaleCFG node.
This was broken because of recent changes, so I fixed it and moved it from
the experiments repo.
2023-11-10 20:52:10 -05:00
comfyanonymous 3e0033ef30 Fix model merge bug.
Unload models before getting weights for model patching.
2023-11-10 03:19:05 -05:00
comfyanonymous 002aefa382 Support lcm models.
Use the "lcm" sampler to sample them; you also have to use the
ModelSamplingDiscrete node to set them as lcm models to use them properly.
2023-11-09 18:30:22 -05:00
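
The setup described in the commit above can be sketched as an API-format workflow fragment. This is a hypothetical illustration only: the node ids, input names, and checkpoint filename are assumptions for the sketch, not taken from the commit.

```python
# Hypothetical ComfyUI API-format workflow fragment: route the model through
# ModelSamplingDiscrete (set to lcm) and sample with the "lcm" sampler.
# Node ids, input names, and the checkpoint filename are illustrative.
workflow = {
    "1": {"class_type": "CheckpointLoaderSimple",
          "inputs": {"ckpt_name": "lcm_model.safetensors"}},
    "2": {"class_type": "ModelSamplingDiscrete",
          "inputs": {"model": ["1", 0], "sampling": "lcm"}},
    "3": {"class_type": "KSampler",
          "inputs": {"model": ["2", 0], "sampler_name": "lcm"}},
}
```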
comfyanonymous ca71e542d2 Lower cfg step to 0.1 in sampler nodes. 2023-11-09 17:35:17 -05:00
pythongosssss 72e3feb573 Load API JSON (#1932)
* added loading api json

* revert async change

* reorder
2023-11-09 13:33:43 -05:00
comfyanonymous cd6df8b323 Fix sanitize node name removing the "/" character. 2023-11-09 13:10:19 -05:00
comfyanonymous ec12000136 Add support for full diff lora keys. 2023-11-08 22:05:31 -05:00
comfyanonymous 064d7583eb Add a CONDConstant for passing non tensor conds to unet. 2023-11-08 01:59:09 -05:00
comfyanonymous 794dd2064d Fix typo. 2023-11-07 23:41:55 -05:00
comfyanonymous 0a6fd49a3e Print leftover keys when using the UNETLoader. 2023-11-07 22:15:55 -05:00
comfyanonymous fe40109b57 Fix issue with object patches not being copied with patcher. 2023-11-07 22:15:15 -05:00
comfyanonymous a527d0c795 Code refactor. 2023-11-07 19:33:40 -05:00
comfyanonymous 2a23ba0b8c Fix unet ops not entirely on GPU. 2023-11-07 04:30:37 -05:00
comfyanonymous 844dbf97a7 Add: advanced->model->ModelSamplingDiscrete node.
This allows changing the sampling parameters of the model (eps or vpred)
or setting the model to use zsnr.
2023-11-07 03:28:53 -05:00
comfyanonymous d07cd44272 Merge branch 'master' of https://github.com/cubiq/ComfyUI 2023-11-07 01:52:13 -05:00
comfyanonymous 656c0b5d90 CLIP code refactor and improvements.
More generic clip model class that can be used on more types of text
encoders.

Don't apply the weighting algorithm when the weight is 1.0.

Don't compute an empty token output when it's not needed.
2023-11-06 14:17:41 -05:00
comfyanonymous b3fcd64c6c Make SDTokenizer class work with more types of tokenizers. 2023-11-06 01:09:18 -05:00
matt3o 4acfc11a80 add difference blend mode 2023-11-05 19:00:23 +01:00
comfyanonymous a6c83b3cd0 Merge branch 'fix_unet_wrapper_function_name' of https://github.com/gameltb/ComfyUI 2023-11-05 12:41:38 -05:00
comfyanonymous 02f062b5b7 Sanitize unknown node types on load to prevent XSS. 2023-11-05 12:29:28 -05:00
gameltb 7e455adc07 fix unet_wrapper_function name in ModelPatcher 2023-11-05 17:11:44 +08:00
comfyanonymous 1ffa8858e7 Move model sampling code to comfy/model_sampling.py 2023-11-04 01:32:23 -04:00
comfyanonymous ae2acfc21b Don't convert NaN to zero.
Converting NaN to zero is a bad idea because it makes it hard to tell when
something went wrong.
2023-11-03 13:13:15 -04:00
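
The rationale in the commit above can be shown with a minimal sketch (plain Python floats standing in for tensors): a NaN that propagates stays detectable, while one converted to zero silently hides the failure.

```python
import math

values = [1.0, float("nan"), 3.0]

# A propagated NaN is still detectable at the point of use...
assert any(math.isnan(v) for v in values)

# ...but once converted to zero, downstream checks can no longer
# tell that something went wrong upstream.
masked = [0.0 if math.isnan(v) else v for v in values]
assert not any(math.isnan(v) for v in masked)
```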
comfyanonymous ee74ef5c9e Increase maximum batch size in LatentRebatch. 2023-11-02 13:07:41 -04:00
Matteo Spinelli 6e84a01ecc Refactor the template manager (#1878)
* add drag-drop to node template manager

* better dnd, save field on change

* actually save templates

---------

Co-authored-by: matt3o <matt3o@gmail.com>
2023-11-02 12:29:57 -04:00
comfyanonymous dd116abfc4 Merge branch 'quantize-dither' of https://github.com/tsone/ComfyUI 2023-11-02 00:57:00 -04:00
comfyanonymous d2e27b48f1 sampler_cfg_function now gets the noisy output as argument again.
This should make things that use sampler_cfg_function behave like before.

Added an input argument for those that want the denoised output.

This means you can calculate the x0 prediction of the model by doing:
(input - cond) for example.
2023-11-01 21:24:08 -04:00
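
A minimal sketch of what a sampler_cfg_function might look like after this change. The dict-style args and key names ("input", "cond", "uncond", "cond_scale") are assumptions for illustration, and plain floats stand in for tensors.

```python
def x0_prediction(args):
    # Per the commit message above: with the noisy input available,
    # the x0 prediction can be computed as (input - cond).
    return args["input"] - args["cond"]

def my_cfg_function(args):
    # Standard classifier-free guidance combination of the two passes.
    cond, uncond = args["cond"], args["uncond"]
    return uncond + args["cond_scale"] * (cond - uncond)

# Example call with scalar stand-ins for tensors:
args = {"input": 3.0, "cond": 1.0, "uncond": 0.5, "cond_scale": 2.0}
```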
comfyanonymous 2455aaed8a Allow model or clip to be None in load_lora_for_models. 2023-11-01 20:27:20 -04:00
comfyanonymous 45a3df1cde Merge branch 'filter-widgets-crash-fix' of https://github.com/Jantolick/ComfyUI 2023-11-01 20:17:25 -04:00
comfyanonymous ecb80abb58 Allow ModelSamplingDiscrete to be instantiated without a model config. 2023-11-01 19:13:03 -04:00
Joseph Antolick 88410ace9b fix: handle null case for currentNode widgets to prevent scroll error 2023-11-01 16:52:51 -04:00
comfyanonymous e73ec8c4da Not used anymore. 2023-11-01 00:01:30 -04:00
comfyanonymous 111f1b5255 Fix some issues with sampling precision. 2023-10-31 23:49:29 -04:00
comfyanonymous 7c0f255de1 Clean up percent start/end and make controlnets work with sigmas. 2023-10-31 22:14:32 -04:00
comfyanonymous a268a574fa Remove a bunch of useless code.
DDIM is the same as euler with a small difference in the inpaint code.
DDIM uses randn_like but I set a fixed seed instead.

I'm keeping it in because I'm sure if I remove it people are going to
complain.
2023-10-31 18:11:29 -04:00
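
The fixed-seed idea mentioned above can be sketched like this, using Python's random module in place of torch's randn_like; this illustrates the idea of reproducible inpaint noise, not ComfyUI's actual code.

```python
import random

def inpaint_noise(n, seed=0):
    # Seeding the generator makes the noise reproducible across calls,
    # unlike drawing fresh noise with randn_like each time.
    rng = random.Random(seed)
    return [rng.gauss(0.0, 1.0) for _ in range(n)]
```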
comfyanonymous 1777b54d02 Sampling code changes.
apply_model in model_base now returns the denoised output.

This means that sampling_function now computes things on the denoised
output instead of the model output. This should make things more consistent
across current and future models.
2023-10-31 17:33:43 -04:00
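
The relationship between the raw model output and the denoised output can be illustrated for an eps-prediction model, using the standard k-diffusion convention; this is a sketch of the math, not ComfyUI's exact code.

```python
def denoised_from_eps(x, sigma, eps):
    # An eps-prediction model predicts the noise added to the sample;
    # the denoised (x0) estimate is recovered as x - sigma * eps.
    return x - sigma * eps
```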
tsone 23c5d17837 Added Bayer dithering to Quantize node. 2023-10-31 22:22:40 +01:00
comfyanonymous c837a173fa Fix some memory issues in sub quad attention. 2023-10-30 15:30:49 -04:00
comfyanonymous 125b03eead Fix some OOM issues with split attention. 2023-10-30 13:14:11 -04:00
Jedrzej Kosinski 41b07ff8d7 Fix TAESD preview to only decode first latent, instead of all 2023-10-29 13:30:23 -05:00
comfyanonymous a12cc05323 Add --max-upload-size argument, the default is 100MB. 2023-10-29 03:55:46 -04:00
comfyanonymous aac8fc99d6 Cleanup webp import code a bit. 2023-10-28 12:24:50 -04:00
comfyanonymous 2a134bfab9 Fix checkpoint loader with config. 2023-10-27 22:13:55 -04:00
comfyanonymous e60ca6929a SD1 and SD2 clip and tokenizer code is now more similar to the SDXL one. 2023-10-27 15:54:04 -04:00
comfyanonymous 6ec3f12c6e Support SSD1B model and make it easier to support asymmetric unets. 2023-10-27 14:45:15 -04:00