comfyanonymous
7c0f255de1
Clean up percent start/end and make controlnets work with sigmas.
2023-10-31 22:14:32 -04:00
comfyanonymous
a268a574fa
Remove a bunch of useless code.
...
DDIM is the same as euler except for a small difference in the inpaint code:
DDIM uses randn_like, but I set a fixed seed instead.
I'm keeping it in because I'm sure that if I remove it, people will
complain.
2023-10-31 18:11:29 -04:00
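The fixed-seed change described above can be sketched as follows. This is a minimal numpy illustration (the actual code uses torch and randn_like), and both function names are hypothetical:

```python
import numpy as np

def fresh_noise(shape):
    # randn_like-style behavior: different noise on every call,
    # so inpaint results vary between runs
    return np.random.default_rng().standard_normal(shape)

def seeded_noise(shape, seed=0):
    # fixed-seed variant: identical noise on every call,
    # so inpaint results are reproducible
    return np.random.default_rng(seed).standard_normal(shape)

# two identical calls produce identical noise
a = seeded_noise((4,), seed=0)
b = seeded_noise((4,), seed=0)
```

With the fixed seed, the only remaining difference between DDIM and euler in the inpaint path disappears for a given seed.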
comfyanonymous
1777b54d02
Sampling code changes.
...
apply_model in model_base now returns the denoised output.
This means that sampling_function now operates on the denoised output
instead of the raw model output, which should make things more consistent
across current and future models.
2023-10-31 17:33:43 -04:00
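As a rough sketch of what this means for an eps-prediction model, the denoised estimate can be recovered as x - sigma * eps before returning. The names and signatures below are illustrative, not the repo's actual API:

```python
import numpy as np

def apply_model(x, sigma, eps_model):
    # hypothetical eps-prediction wrapper: the raw model output
    # (predicted noise) is converted to a denoised (x0) estimate before
    # returning, so sampling code never needs to know the model's
    # parameterization
    eps = eps_model(x, sigma)
    return x - sigma * eps

# toy model that predicts zero noise: the denoised output equals the input
denoised = apply_model(np.ones(3), 0.5, lambda x, s: np.zeros_like(x))
```

Moving this conversion into the model object is what lets sampling_function stay agnostic to how a particular model parameterizes its output.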
comfyanonymous
c837a173fa
Fix some memory issues in sub quad attention.
2023-10-30 15:30:49 -04:00
comfyanonymous
125b03eead
Fix some OOM issues with split attention.
2023-10-30 13:14:11 -04:00
comfyanonymous
a12cc05323
Add a --max-upload-size argument; the default is 100MB.
2023-10-29 03:55:46 -04:00
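A minimal argparse sketch of such a flag, assuming the size is given in MB. Only the flag name and default come from the commit; everything else is illustrative:

```python
import argparse

parser = argparse.ArgumentParser()
# size is interpreted as megabytes; 100MB default as stated above
parser.add_argument("--max-upload-size", type=float, default=100.0,
                    help="Set the maximum upload size in MB.")

args = parser.parse_args([])  # no CLI args given: use the default
max_upload_bytes = int(args.max_upload_size * 1024 * 1024)
```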
comfyanonymous
2a134bfab9
Fix checkpoint loader with config.
2023-10-27 22:13:55 -04:00
comfyanonymous
e60ca6929a
SD1 and SD2 clip and tokenizer code is now more similar to the SDXL one.
2023-10-27 15:54:04 -04:00
comfyanonymous
6ec3f12c6e
Support SSD1B model and make it easier to support asymmetric unets.
2023-10-27 14:45:15 -04:00
comfyanonymous
434ce25ec0
Restrict loading embeddings from embedding folders.
2023-10-27 02:54:13 -04:00
comfyanonymous
723847f6b3
Faster clip image processing.
2023-10-26 01:53:01 -04:00
comfyanonymous
a373367b0c
Fix some OOM issues with split and sub quad attention.
2023-10-25 20:17:28 -04:00
comfyanonymous
7fbb217d3a
Fix uni_pc returning a noisy image when steps <= 3.
2023-10-25 16:08:30 -04:00
Jedrzej Kosinski
3783cb8bfd
Change 'c_adm' to 'y' in ControlNet.get_control.
2023-10-25 08:24:32 -05:00
comfyanonymous
d1d2fea806
Pass extra conds directly to unet.
2023-10-25 00:07:53 -04:00
comfyanonymous
036f88c621
Refactor to make it easier to add custom conds to models.
2023-10-24 23:31:12 -04:00
comfyanonymous
3fce8881ca
Sampling code refactor to make it easier to add more conds.
2023-10-24 03:38:41 -04:00
comfyanonymous
8594c8be4d
Empty the torch cache when it takes up more than 25% of free memory.
2023-10-22 13:58:12 -04:00
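The heuristic can be sketched as below; the function name and the way memory figures are obtained are assumptions, and only the 25% threshold comes from the commit:

```python
def should_empty_cache(cached_bytes, free_bytes, threshold=0.25):
    # hypothetical version of the check: flush the allocator's cache once
    # cached memory exceeds the given fraction of currently free memory
    return cached_bytes > threshold * free_bytes

GiB = 1024 ** 3
# 3 GiB cached against 8 GiB free exceeds 25%, so the cache gets emptied
flush = should_empty_cache(3 * GiB, 8 * GiB)
```

Tying the flush to free memory rather than a fixed size keeps the cache useful on large cards while still freeing memory under pressure.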
comfyanonymous
8b65f5de54
attention_basic now works with hypertile.
2023-10-22 03:59:53 -04:00
comfyanonymous
e6bc42df46
Make sub_quad and split work with hypertile.
2023-10-22 03:51:29 -04:00
comfyanonymous
a0690f9df9
Fix t2i adapter issue.
2023-10-21 20:31:24 -04:00
comfyanonymous
9906e3efe3
Make xformers work with hypertile.
2023-10-21 13:23:03 -04:00
comfyanonymous
4185324a1d
Fix uni_pc sampler math. This changes the images this sampler produces.
2023-10-20 04:16:53 -04:00
comfyanonymous
e6962120c6
Make sure cond_concat is on the right device.
2023-10-19 01:14:25 -04:00
comfyanonymous
45c972aba8
Refactor cond_concat into conditioning.
2023-10-18 20:36:58 -04:00
comfyanonymous
430a8334c5
Fix some potential issues.
2023-10-18 19:48:36 -04:00
comfyanonymous
782a24fce6
Refactor cond_concat into model object.
2023-10-18 16:48:37 -04:00
comfyanonymous
0d45a565da
Fix memory issue related to control loras.
...
The cleanup function was not getting called.
2023-10-18 02:43:01 -04:00
comfyanonymous
d44a2de49f
Make VAE code closer to sgm.
2023-10-17 15:18:51 -04:00
comfyanonymous
23680a9155
Refactor the attention stuff in the VAE.
2023-10-17 03:19:29 -04:00
comfyanonymous
c8013f73e5
Add some Quadro cards to the list of cards with broken fp16.
2023-10-16 16:48:46 -04:00
comfyanonymous
bb064c9796
Add a separate optimized_attention_masked function.
2023-10-16 02:31:24 -04:00
comfyanonymous
fd4c5f07e7
Add a --bf16-unet argument to test running the unet in bf16.
2023-10-13 14:51:10 -04:00
comfyanonymous
9a55dadb4c
Refactor code so model can be a dtype other than fp32 or fp16.
2023-10-13 14:41:17 -04:00
comfyanonymous
88733c997f
pytorch_attention_enabled can now return True when xformers is enabled.
2023-10-11 21:30:57 -04:00
comfyanonymous
20d3852aa1
Pull some small changes from the other repo.
2023-10-11 20:38:48 -04:00
comfyanonymous
ac7d8cfa87
Allow attn_mask in attention_pytorch.
2023-10-11 20:38:48 -04:00
comfyanonymous
1a4bd9e9a6
Refactor the attention functions.
...
There's no reason to repeat the whole CrossAttention object when only
the operation in the middle changes.
2023-10-11 20:38:48 -04:00
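The idea of the refactor, one shared wrapper with a pluggable core operation, can be sketched as follows (a numpy stand-in with hypothetical names; the real variants are torch-based and differ only in their memory trade-offs):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_basic(q, k, v):
    # the interchangeable middle operation; split, sub_quad, and xformers
    # variants compute the same result with different memory strategies
    return softmax(q @ k.T / np.sqrt(q.shape[-1])) @ v

def cross_attention(x, context, to_q, to_k, to_v, core=attention_basic):
    # one shared wrapper instead of a repeated CrossAttention object:
    # only the core attention operation is swapped out
    return core(to_q(x), to_k(context), to_v(context))

# with identity projections and constant inputs, attention averages v,
# so the output is all ones here
out = cross_attention(np.ones((3, 4)), np.ones((5, 4)),
                      lambda t: t, lambda t: t, lambda t: t)
```

Factoring the wrapper this way is what makes the earlier hypertile and attn_mask changes in this log apply to every backend at once.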
comfyanonymous
8cc75c64ff
Let unet wrapper functions have .to attributes.
2023-10-11 01:34:38 -04:00
comfyanonymous
5e885bd9c8
Cleanup.
2023-10-10 21:46:53 -04:00
comfyanonymous
851bb87ca9
Merge branch 'taesd_safetensors' of https://github.com/mochiya98/ComfyUI
2023-10-10 21:42:35 -04:00
Yukimasa Funaoka
9eb621c95a
Support TAESD models in safetensors format.
2023-10-10 13:21:44 +09:00
comfyanonymous
d1a0abd40b
Merge branch 'input-directory' of https://github.com/jn-jairo/ComfyUI
2023-10-09 01:53:29 -04:00
comfyanonymous
72188dffc3
load_checkpoint_guess_config can now optionally output the model.
2023-10-06 13:48:18 -04:00
Jairo Correa
63e5fd1790
Add an option to set the input directory.
2023-10-04 19:45:15 -03:00
City
9bfec2bdbf
Fix quality loss due to low precision.
2023-10-04 15:40:59 +02:00
badayvedat
0f17993d05
fix: typo in extra sampler
2023-09-29 06:09:59 +03:00
comfyanonymous
66756de100
Add SamplerDPMPP_2M_SDE node.
2023-09-28 21:56:23 -04:00
comfyanonymous
71713888c4
Print missing VAE keys.
2023-09-28 00:54:57 -04:00
comfyanonymous
d234ca558a
Add missing samplers to KSamplerSelect.
2023-09-28 00:17:03 -04:00