Jianqi Pan
f2e49b1d57
fix: adaptation to older versions of pytorch
2023-11-14 14:32:05 +09:00
comfyanonymous
94cc718e9c
Add a way to add patches to the input block.
2023-11-14 00:08:12 -05:00
comfyanonymous
8509bd58b4
Reorganize custom_sampling nodes.
2023-11-13 21:45:23 -05:00
comfyanonymous
61112c81b9
Add a node to flip the sigmas for unsampling.
2023-11-13 21:45:08 -05:00
comfyanonymous
eb0407e806
Update litegraph to latest.
2023-11-13 16:26:28 -05:00
comfyanonymous
7339479b10
Disable xformers when it can't load properly.
2023-11-13 12:31:10 -05:00
comfyanonymous
f12ec55983
Allow boolean widgets to have no options dict.
2023-11-13 00:42:34 -05:00
pythongosssss
4aeef781a3
Support number/text ids when importing API JSON (#1952)
...
* support numeric/text ids
2023-11-12 14:49:23 -05:00
comfyanonymous
4781819a85
Make memory estimation aware of model dtype.
2023-11-12 04:28:26 -05:00
comfyanonymous
dd4ba68b6e
Allow different models to estimate memory usage differently.
2023-11-12 04:03:52 -05:00
comfyanonymous
2c9dba8dc0
sampling_function now has the model object as the argument.
2023-11-12 03:45:10 -05:00
comfyanonymous
8d80584f6a
Remove useless argument from uni_pc sampler.
2023-11-12 01:25:33 -05:00
Jairo Correa
006b24cc32
Prevent image cache
2023-11-11 15:56:14 -03:00
comfyanonymous
248aa3e563
Fix bug.
2023-11-11 12:20:16 -05:00
comfyanonymous
4a8a839b40
Add option to use in place weight updating in ModelPatcher.
2023-11-11 01:11:12 -05:00
comfyanonymous
412d3ff57d
Refactor.
2023-11-11 01:11:06 -05:00
comfyanonymous
ca2812bae0
Fix RescaleCFG for batch size > 1.
2023-11-10 22:05:25 -05:00
comfyanonymous
58d5d71a93
Working RescaleCFG node.
...
This was broken because of recent changes so I fixed it and moved it from
the experiments repo.
2023-11-10 20:52:10 -05:00
comfyanonymous
3e0033ef30
Fix model merge bug.
...
Unload models before getting weights for model patching.
2023-11-10 03:19:05 -05:00
comfyanonymous
002aefa382
Support lcm models.
...
Use the "lcm" sampler to sample them; you also have to use the
ModelSamplingDiscrete node to set them as lcm models so they work properly.
2023-11-09 18:30:22 -05:00
comfyanonymous
ca71e542d2
Lower cfg step to 0.1 in sampler nodes.
2023-11-09 17:35:17 -05:00
pythongosssss
72e3feb573
Load API JSON (#1932)
...
* added loading api json
* revert async change
* reorder
2023-11-09 13:33:43 -05:00
comfyanonymous
cd6df8b323
Fix node name sanitization removing the "/" character.
2023-11-09 13:10:19 -05:00
comfyanonymous
ec12000136
Add support for full diff lora keys.
2023-11-08 22:05:31 -05:00
comfyanonymous
064d7583eb
Add a CONDConstant for passing non-tensor conds to unet.
2023-11-08 01:59:09 -05:00
comfyanonymous
794dd2064d
Fix typo.
2023-11-07 23:41:55 -05:00
comfyanonymous
0a6fd49a3e
Print leftover keys when using the UNETLoader.
2023-11-07 22:15:55 -05:00
comfyanonymous
fe40109b57
Fix issue with object patches not being copied with patcher.
2023-11-07 22:15:15 -05:00
comfyanonymous
a527d0c795
Code refactor.
2023-11-07 19:33:40 -05:00
comfyanonymous
2a23ba0b8c
Fix unet ops not entirely on GPU.
2023-11-07 04:30:37 -05:00
comfyanonymous
844dbf97a7
Add: advanced->model->ModelSamplingDiscrete node.
...
This allows changing the sampling parameters of the model (eps or vpred)
or setting the model to use zsnr.
2023-11-07 03:28:53 -05:00
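The eps/vpred distinction above refers to the model's prediction parameterization. A minimal sketch of the standard relationship between the two, in illustrative DDPM-style notation (not ComfyUI's actual implementation):

```python
def v_to_eps(v, x_t, alpha, sigma):
    """Convert a v-prediction to an eps-prediction.

    Uses the standard identities v = alpha * eps - sigma * x0 and
    x_t = alpha * x0 + sigma * eps, assuming alpha**2 + sigma**2 == 1.
    Names and signature here are illustrative, not ComfyUI's API.
    """
    # alpha * v + sigma * x_t = (alpha**2 + sigma**2) * eps = eps
    return alpha * v + sigma * x_t
```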
comfyanonymous
d07cd44272
Merge branch 'master' of https://github.com/cubiq/ComfyUI
2023-11-07 01:52:13 -05:00
comfyanonymous
656c0b5d90
CLIP code refactor and improvements.
...
More generic clip model class that can be used on more types of text
encoders.
Don't apply weighting algorithm when weight is 1.0
Don't compute an empty token output when it's not needed.
2023-11-06 14:17:41 -05:00
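The "don't apply weighting when weight is 1.0" optimization above can be sketched roughly as follows; this is a hypothetical helper, not the actual CLIP code:

```python
def apply_token_weights(embeddings, weights):
    # When every weight is exactly 1.0, weighting is a no-op, so the
    # whole pass can be skipped (mirrors the optimization described above).
    if all(w == 1.0 for w in weights):
        return embeddings
    return [e * w for e, w in zip(embeddings, weights)]
```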
comfyanonymous
b3fcd64c6c
Make SDTokenizer class work with more types of tokenizers.
2023-11-06 01:09:18 -05:00
matt3o
4acfc11a80
add difference blend mode
2023-11-05 19:00:23 +01:00
comfyanonymous
a6c83b3cd0
Merge branch 'fix_unet_wrapper_function_name' of https://github.com/gameltb/ComfyUI
2023-11-05 12:41:38 -05:00
comfyanonymous
02f062b5b7
Sanitize unknown node types on load to prevent XSS.
2023-11-05 12:29:28 -05:00
gameltb
7e455adc07
fix unet_wrapper_function name in ModelPatcher
2023-11-05 17:11:44 +08:00
comfyanonymous
1ffa8858e7
Move model sampling code to comfy/model_sampling.py
2023-11-04 01:32:23 -04:00
comfyanonymous
ae2acfc21b
Don't convert NaN to zero.
...
Converting NaN to zero is a bad idea because it makes it hard to tell when
something went wrong.
2023-11-03 13:13:15 -04:00
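The rationale above, failing loudly rather than masking NaNs, might look like this in practice; a hypothetical check, not the actual change:

```python
import math

def assert_finite(values):
    # Raising on NaN surfaces the failure at its source, whereas silently
    # converting NaN to zero hides that something upstream went wrong.
    if any(math.isnan(v) for v in values):
        raise ValueError("NaN encountered; upstream computation is broken")
    return values
```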
comfyanonymous
ee74ef5c9e
Increase maximum batch size in LatentRebatch.
2023-11-02 13:07:41 -04:00
Matteo Spinelli
6e84a01ecc
Refactor the template manager (#1878)
...
* add drag-drop to node template manager
* better dnd, save field on change
* actually save templates
---------
Co-authored-by: matt3o <matt3o@gmail.com>
2023-11-02 12:29:57 -04:00
comfyanonymous
dd116abfc4
Merge branch 'quantize-dither' of https://github.com/tsone/ComfyUI
2023-11-02 00:57:00 -04:00
comfyanonymous
d2e27b48f1
sampler_cfg_function now gets the noisy output as argument again.
...
This should make things that use sampler_cfg_function behave like before.
Added an input argument for those that want the denoised output.
This means you can calculate the x0 prediction of the model by doing:
(input - cond) for example.
2023-11-01 21:24:08 -04:00
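A minimal sketch of such a CFG mixing function; the args-dict keys here are assumptions for illustration, not the exact ComfyUI interface:

```python
def sampler_cfg_function(args):
    # Classic classifier-free guidance mix, operating on the noisy-space
    # outputs the commit above says are passed in again.
    cond = args["cond"]        # conditional model output (assumed key)
    uncond = args["uncond"]    # unconditional model output (assumed key)
    scale = args["cond_scale"] # guidance scale (assumed key)
    return uncond + scale * (cond - uncond)
```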
comfyanonymous
2455aaed8a
Allow model or clip to be None in load_lora_for_models.
2023-11-01 20:27:20 -04:00
comfyanonymous
45a3df1cde
Merge branch 'filter-widgets-crash-fix' of https://github.com/Jantolick/ComfyUI
2023-11-01 20:17:25 -04:00
comfyanonymous
ecb80abb58
Allow ModelSamplingDiscrete to be instantiated without a model config.
2023-11-01 19:13:03 -04:00
Joseph Antolick
88410ace9b
fix: handle null case for currentNode widgets to prevent scroll error
2023-11-01 16:52:51 -04:00
comfyanonymous
e73ec8c4da
Not used anymore.
2023-11-01 00:01:30 -04:00
comfyanonymous
111f1b5255
Fix some issues with sampling precision.
2023-10-31 23:49:29 -04:00