comfyanonymous
d66b631d74
Merge branch 'fix-collapsed-clip' of https://github.com/pythongosssss/ComfyUI
2023-11-21 13:26:26 -05:00
comfyanonymous
cd4fc77d5f
Add taesd and taesdxl to VAELoader node.
...
They will show up if both the taesd_encoder and taesd_decoder model files
(or their taesdxl equivalents) are present in the models/vae_approx directory.
2023-11-21 12:54:19 -05:00
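A rough sketch of the expected layout (the taesdxl file names and the .pth extension here are assumptions, not taken from the commit):

models/vae_approx/
    taesd_encoder.pth
    taesd_decoder.pth
    taesdxl_encoder.pth
    taesdxl_decoder.pth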
pythongosssss
89e31abc46
Fix clipping of collapsed nodes
2023-11-21 17:54:01 +00:00
pythongosssss
6ff06fa796
Animated image output support ( #2008 )
...
* Refactor multiline widget into generic DOM widget
* wip webp preview
* webp support
* fix check
* fix sizing
* show image when zoomed out
* Swap webp check to generic animated image flag
* remove duplicate
* Fix falsy check
2023-11-21 01:33:58 -05:00
comfyanonymous
ce67dcbcda
Make it easy for models to process the unet state dict on load.
2023-11-20 23:17:53 -05:00
comfyanonymous
2dd5b4dd78
Only show last 200 elements in the UI history tab.
2023-11-20 16:56:29 -05:00
comfyanonymous
a03dde190e
Cap maximum history size at 10000. Delete oldest entry when reached.
2023-11-20 16:38:39 -05:00
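A minimal sketch of the capping behavior described above, assuming the history is kept in an insertion-ordered dict (illustrative only, not the actual implementation):

MAXIMUM_HISTORY_SIZE = 10000  # cap named in the commit above

def add_history_entry(history, prompt_id, entry):
    # When the cap is reached, delete the oldest entry before adding a new one.
    if len(history) >= MAXIMUM_HISTORY_SIZE:
        oldest = next(iter(history))  # dicts preserve insertion order
        history.pop(oldest)
    history[prompt_id] = entry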
comfyanonymous
31c5ea7b2c
Add LatentInterpolate to interpolate between latents.
2023-11-20 03:55:51 -05:00
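A minimal sketch of plain linear interpolation between two latent tensors (illustrative; the actual node may normalize the blend differently):

def interpolate_latents(latent_a, latent_b, ratio):
    # latent_a and latent_b are torch tensors of the same shape.
    # ratio = 0.0 returns latent_a, ratio = 1.0 returns latent_b.
    return latent_a * (1.0 - ratio) + latent_b * ratio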
comfyanonymous
dba4f3b4fc
Add a RepeatImageBatch node.
2023-11-19 06:09:01 -05:00
comfyanonymous
d9d8702d8d
percent_to_sigma now returns a float instead of a tensor.
2023-11-18 23:20:29 -05:00
comfyanonymous
8a451234b3
Add ImageCrop node.
2023-11-18 04:44:17 -05:00
comfyanonymous
0cf4e86939
Add some command line arguments to store text encoder weights in fp8.
...
PyTorch supports two variants of fp8:
--fp8_e4m3fn-text-enc (the one that seems to give better results)
--fp8_e5m2-text-enc
2023-11-17 02:56:59 -05:00
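Example invocation (assuming the standard main.py entry point; pick one of the two flags):

python main.py --fp8_e4m3fn-text-enc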
comfyanonymous
107e78b1cb
Add support for loading SSD1B diffusers unet version.
...
Improve diffusers model detection.
2023-11-16 23:12:55 -05:00
comfyanonymous
7e3fe3ad28
Make deep shrink behave like it should.
2023-11-16 15:26:28 -05:00
comfyanonymous
9f00a18095
Fix potential issues.
2023-11-16 14:59:54 -05:00
comfyanonymous
bd07ad1861
Add PatchModelAddDownscale (Kohya Deep Shrink) node.
...
By adding a downscale to the unet in the first timesteps, this node lets
you generate images at higher resolutions with fewer consistency issues.
2023-11-16 13:25:46 -05:00
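A rough sketch of the idea, shrinking the latent during the early part of sampling and leaving later steps at full size (illustrative only; the parameter names and default values below are assumptions, and the real node patches the model's input block rather than the latent directly):

import torch.nn.functional as F

def maybe_downscale(latent, current_percent, end_percent=0.35, downscale_factor=2):
    # Only shrink during the first portion of sampling; later steps run at full resolution.
    if current_percent < end_percent:
        h, w = latent.shape[-2:]
        return F.interpolate(latent, size=(h // downscale_factor, w // downscale_factor), mode="bicubic")
    return latent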
comfyanonymous
7ea6bb038c
Print warning when controlnet can't be applied instead of crashing.
2023-11-16 12:57:12 -05:00
comfyanonymous
dcec1047e6
Invert the start and end percentages in the code.
...
This doesn't affect how percentages behave in the frontend, but it breaks
things if you relied on them in the backend.
percent_to_sigma goes from 0 to 1.0 instead of 1.0 to 0 for less confusion.
Make percent 0 return an extremely large sigma and percent 1.0 return a sigma
of zero to fix imprecision.
2023-11-16 04:23:44 -05:00
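In other words, backend code that converts schedule percentages now reads along these lines (sketch, assuming a model_sampling object that exposes percent_to_sigma):

# percent runs from 0.0 (start of sampling, very large sigma) to 1.0 (end, sigma 0.0)
sigma_start = model_sampling.percent_to_sigma(0.0)  # extremely large sigma
sigma_end = model_sampling.percent_to_sigma(1.0)    # 0.0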
comfyanonymous
7114cfec0e
Always clone graph data when loading to fix some load issues.
2023-11-15 15:55:02 -05:00
comfyanonymous
629e4c552c
Merge branch 'master' of https://github.com/42lux/ComfyUI
2023-11-15 01:47:21 -05:00
comfyanonymous
57eea0efbb
heunpp2 sampler.
2023-11-14 23:50:55 -05:00
42lux
7b87c825a3
Added colorschemes: Arc, North and Github.
2023-11-15 02:37:35 +01:00
comfyanonymous
728613bb3e
Fix last PR.
2023-11-14 14:41:31 -05:00
comfyanonymous
ec3d0ab432
Merge branch 'master' of https://github.com/Jannchie/ComfyUI
2023-11-14 14:38:07 -05:00
comfyanonymous
c962884a5c
Make bislerp work on GPU.
2023-11-14 11:38:36 -05:00
comfyanonymous
420beeeb05
Clean up and refactor sampler code.
...
This should make it much easier to write custom nodes with kdiffusion-type
samplers.
2023-11-14 00:39:34 -05:00
Jianqi Pan
f2e49b1d57
fix: adaptation to older versions of pytorch
2023-11-14 14:32:05 +09:00
comfyanonymous
94cc718e9c
Add a way to add patches to the input block.
2023-11-14 00:08:12 -05:00
comfyanonymous
8509bd58b4
Reorganize custom_sampling nodes.
2023-11-13 21:45:23 -05:00
comfyanonymous
61112c81b9
Add a node to flip the sigmas for unsampling.
2023-11-13 21:45:08 -05:00
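Conceptually the flip is just a reversal of the sigma schedule (minimal sketch; the real node may also guard against a zero starting sigma):

import torch

def flip_sigmas(sigmas):
    # Reverse the schedule so it runs from low noise to high noise (unsampling).
    return torch.flip(sigmas, (0,))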
comfyanonymous
eb0407e806
Update litegraph to latest.
2023-11-13 16:26:28 -05:00
comfyanonymous
7339479b10
Disable xformers when it can't load properly.
2023-11-13 12:31:10 -05:00
comfyanonymous
f12ec55983
Allow boolean widgets to have no options dict.
2023-11-13 00:42:34 -05:00
pythongosssss
4aeef781a3
Support number/text ids when importing API JSON ( #1952 )
...
* support numeric/text ids
2023-11-12 14:49:23 -05:00
comfyanonymous
4781819a85
Make memory estimation aware of model dtype.
2023-11-12 04:28:26 -05:00
comfyanonymous
dd4ba68b6e
Allow different models to estimate memory usage differently.
2023-11-12 04:03:52 -05:00
comfyanonymous
2c9dba8dc0
sampling_function now has the model object as the argument.
2023-11-12 03:45:10 -05:00
comfyanonymous
8d80584f6a
Remove useless argument from uni_pc sampler.
2023-11-12 01:25:33 -05:00
Jairo Correa
006b24cc32
Prevent image cache
2023-11-11 15:56:14 -03:00
comfyanonymous
248aa3e563
Fix bug.
2023-11-11 12:20:16 -05:00
comfyanonymous
4a8a839b40
Add option to use in-place weight updating in ModelPatcher.
2023-11-11 01:11:12 -05:00
comfyanonymous
412d3ff57d
Refactor.
2023-11-11 01:11:06 -05:00
comfyanonymous
ca2812bae0
Fix RescaleCFG for batch size > 1.
2023-11-10 22:05:25 -05:00
comfyanonymous
58d5d71a93
Working RescaleCFG node.
...
This was broken because of recent changes, so I fixed it and moved it from
the experiments repo.
2023-11-10 20:52:10 -05:00
comfyanonymous
3e0033ef30
Fix model merge bug.
...
Unload models before getting weights for model patching.
2023-11-10 03:19:05 -05:00
comfyanonymous
002aefa382
Support lcm models.
...
Use the "lcm" sampler to sample them, you also have to use the
ModelSamplingDiscrete node to set them as lcm models to use them properly.
2023-11-09 18:30:22 -05:00
comfyanonymous
ca71e542d2
Lower cfg step to 0.1 in sampler nodes.
2023-11-09 17:35:17 -05:00
pythongosssss
72e3feb573
Load API JSON ( #1932 )
...
* added loading api json
* revert async change
* reorder
2023-11-09 13:33:43 -05:00
comfyanonymous
cd6df8b323
Fix node name sanitization so it no longer removes the "/" character.
2023-11-09 13:10:19 -05:00
comfyanonymous
ec12000136
Add support for full diff lora keys.
2023-11-08 22:05:31 -05:00