comfyanonymous
c50208a703
Refactor more code to sample.py
2023-04-24 23:25:51 -04:00
comfyanonymous
7983b3a975
This is cleaner this way.
2023-04-24 22:45:35 -04:00
BlenderNeko
0b07b2cc0f
gligen tuple
2023-04-24 21:47:57 +02:00
pythongosssss
c8c9926eeb
Add progress to vae decode tiled
2023-04-24 11:55:44 +01:00
BlenderNeko
d9b1595f85
made sample functions more explicit
2023-04-24 12:53:10 +02:00
BlenderNeko
5818539743
add docstrings
2023-04-23 20:09:09 +02:00
BlenderNeko
8d2de420d3
Merge branch 'master' of https://github.com/BlenderNeko/ComfyUI
2023-04-23 20:02:18 +02:00
BlenderNeko
2a09e2aa27
refactor/split various bits of code for sampling
2023-04-23 20:02:08 +02:00
comfyanonymous
5282f56434
Implement Linear hypernetworks.
...
Add a HypernetworkLoader node to use hypernetworks.
2023-04-23 12:35:25 -04:00
comfyanonymous
6908f9c949
This makes pytorch2.0 attention perform a bit faster.
2023-04-22 14:30:39 -04:00
comfyanonymous
907010e082
Remove some useless code.
2023-04-20 23:58:25 -04:00
comfyanonymous
96b57a9ad6
Don't pass adm to model when it doesn't support it.
2023-04-19 21:11:38 -04:00
comfyanonymous
3696d1699a
Add support for GLIGEN textbox model.
2023-04-19 11:06:32 -04:00
comfyanonymous
884ea653c8
Add a way for nodes to set a custom CFG function.
2023-04-17 11:05:15 -04:00
comfyanonymous
73c3e11e83
Fix model_management import so it doesn't get executed twice.
2023-04-15 19:04:33 -04:00
comfyanonymous
81d1f00df3
Some refactoring: from_tokens -> encode_from_tokens
2023-04-15 18:46:58 -04:00
comfyanonymous
719c26c3c9
Merge branch 'master' of https://github.com/BlenderNeko/ComfyUI
2023-04-15 14:16:50 -04:00
BlenderNeko
d0b1b6c6bf
fixed improper padding
2023-04-15 19:38:21 +02:00
comfyanonymous
deb2b93e79
Move code to empty gpu cache to model_management.py
2023-04-15 11:19:07 -04:00
comfyanonymous
04d9bc13af
Safely load pickled embeds that don't load with weights_only=True.
2023-04-14 15:33:43 -04:00
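The commit above describes the standard defense for legacy pickled embedding files: restrict which classes the unpickler may construct. A minimal stdlib sketch of that idea (the class names in `ALLOWED` and the helper name `safe_load` are illustrative, not ComfyUI's actual code; the real loader works on `torch.load` payloads):

```python
import io
import pickle


class SafeUnpickler(pickle.Unpickler):
    """Unpickler that only constructs classes on an explicit allowlist."""

    # Hypothetical allowlist; a real loader would permit the tensor-related
    # classes an embedding file legitimately contains.
    ALLOWED = {("collections", "OrderedDict")}

    def find_class(self, module, name):
        if (module, name) in self.ALLOWED:
            return super().find_class(module, name)
        raise pickle.UnpicklingError(f"blocked: {module}.{name}")


def safe_load(data: bytes):
    """Load pickled bytes, refusing anything outside the allowlist."""
    return SafeUnpickler(io.BytesIO(data)).load()
```

Anything referencing a module/class pair outside the allowlist raises `UnpicklingError` instead of executing attacker-controlled constructors, which is why this is safer than a bare `pickle.load` for files that fail `weights_only=True`.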
BlenderNeko
da115bd78d
ensure backwards compat with optional args
2023-04-14 21:16:55 +02:00
BlenderNeko
752f7a162b
align behavior with old tokenize function
2023-04-14 21:02:45 +02:00
comfyanonymous
334aab05e5
Don't stop workflow if loading embedding fails.
2023-04-14 13:54:00 -04:00
BlenderNeko
73175cf58c
split tokenizer from encoder
2023-04-13 22:06:50 +02:00
BlenderNeko
8489cba140
add unique ID per word/embedding for tokenizer
2023-04-13 22:01:01 +02:00
comfyanonymous
92eca60ec9
Fix for new transformers version.
2023-04-09 15:55:21 -04:00
comfyanonymous
1e1875f674
Print xformers version and warning about 0.0.18
2023-04-09 01:31:47 -04:00
comfyanonymous
7e254d2f69
Clarify what --windows-standalone-build does.
2023-04-07 15:52:56 -04:00
comfyanonymous
44fea05064
Cleanup.
2023-04-07 02:31:46 -04:00
comfyanonymous
58ed0f2da4
Fix loading SD1.5 diffusers checkpoint.
2023-04-07 01:30:33 -04:00
comfyanonymous
8b9ac8fedb
Merge branch 'master' of https://github.com/sALTaccount/ComfyUI
2023-04-07 01:03:43 -04:00
comfyanonymous
64557d6781
Add a --force-fp32 argument to force fp32 for debugging.
2023-04-07 00:27:54 -04:00
comfyanonymous
bceccca0e5
Small refactor.
2023-04-06 23:53:54 -04:00
comfyanonymous
28a7205739
Merge branch 'ipex' of https://github.com/kwaa/ComfyUI-IPEX
2023-04-06 23:45:29 -04:00
藍+85CD
05eeaa2de5
Merge branch 'master' into ipex
2023-04-07 09:11:30 +08:00
EllangoK
28fff5d1db
fixes lack of support for multiple configs
...
also adds some metavars to argparse
2023-04-06 19:06:39 -04:00
comfyanonymous
f84f2508cc
Rename the cors parameter to something more verbose.
2023-04-06 15:24:55 -04:00
EllangoK
48efae1608
makes cors a cli parameter
2023-04-06 15:06:22 -04:00
EllangoK
01c1fc669f
set listen flag to listen on all interfaces if specified
2023-04-06 13:19:00 -04:00
藍+85CD
3e2608e12b
Fix auto lowvram detection on CUDA
2023-04-06 15:44:05 +08:00
sALTaccount
60127a8304
diffusers loader
2023-04-05 23:57:31 -07:00
藍+85CD
7cb924f684
Use separate variables instead of `vram_state`
2023-04-06 14:24:47 +08:00
藍+85CD
84b9c0ac2f
Import intel_extension_for_pytorch as ipex
2023-04-06 12:27:22 +08:00
EllangoK
e5e587b1c0
separates out arg parser and imports args
2023-04-05 23:41:23 -04:00
藍+85CD
37713e3b0a
Add basic XPU device support
...
closed #387
2023-04-05 21:22:14 +08:00
comfyanonymous
e46b1c3034
Disable xformers in VAE when xformers == 0.0.18
2023-04-04 22:22:02 -04:00
comfyanonymous
1718730e80
Ignore embeddings when sizes don't match and print a WARNING.
2023-04-04 11:49:29 -04:00
comfyanonymous
23524ad8c5
Remove print.
2023-04-03 22:58:54 -04:00
comfyanonymous
539ff487a8
Pull latest tomesd code from upstream.
2023-04-03 15:49:28 -04:00
comfyanonymous
f50b1fec69
Add noise augmentation setting to unCLIPConditioning.
2023-04-03 13:50:29 -04:00
comfyanonymous
809bcc8ceb
Add support for unCLIP SD2.x models.
...
See _for_testing/unclip in the UI for the new nodes.
unCLIPCheckpointLoader is used to load them.
unCLIPConditioning is used to add the image cond and takes as input a
CLIPVisionEncode output which has been moved to the conditioning section.
2023-04-01 23:19:15 -04:00
comfyanonymous
0d972b85e6
This seems to give better quality in tome.
2023-03-31 18:36:18 -04:00
comfyanonymous
18a6c1db33
Add a TomePatchModel node to the _for_testing section.
...
Tome increases sampling speed at the expense of quality.
2023-03-31 17:19:58 -04:00
comfyanonymous
61ec3c9d5d
Add a way to pass options to the transformers blocks.
2023-03-31 13:04:39 -04:00
comfyanonymous
afd65d3819
Fix noise mask not working with > 1 batch size on ksamplers.
2023-03-30 03:50:12 -04:00
comfyanonymous
b2554bc4dd
Split VAE decode batches depending on free memory.
2023-03-29 02:24:37 -04:00
comfyanonymous
0d65cb17b7
Fix ddim_uniform crashing with 37 steps.
2023-03-28 16:29:35 -04:00
Francesco Yoshi Gobbo
f55755f0d2
code cleanup
2023-03-27 06:48:09 +02:00
Francesco Yoshi Gobbo
cf0098d539
no lowvram state if cpu only
2023-03-27 04:51:18 +02:00
comfyanonymous
f5365c9c81
Fix ddim for Mac: #264
2023-03-26 00:36:54 -04:00
comfyanonymous
4adcea7228
I don't think controlnets were being handled correctly by MPS.
2023-03-24 14:33:16 -04:00
comfyanonymous
3c6ff8821c
Merge branch 'master' of https://github.com/GaidamakUA/ComfyUI
2023-03-24 13:56:43 -04:00
Yurii Mazurevich
fc71e7ea08
Fixed typo
2023-03-24 19:39:55 +02:00
comfyanonymous
7f0fd99b5d
Make ddim work with --cpu
2023-03-24 11:39:51 -04:00
Yurii Mazurevich
4b943d2b60
Removed unnecessary comment
2023-03-24 14:15:30 +02:00
Yurii Mazurevich
89fd5ed574
Added MPS device support
2023-03-24 14:12:56 +02:00
comfyanonymous
dd095efc2c
Support loha that use cp decomposition.
2023-03-23 04:32:25 -04:00
comfyanonymous
94a7c895f4
Add loha support.
2023-03-23 03:40:12 -04:00
comfyanonymous
3ed4a4e4e6
Try again with vae tiled decoding if regular fails because of OOM.
2023-03-22 14:49:00 -04:00
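The fallback pattern in the commit above (full decode first, tiled decode only on out-of-memory) can be sketched as follows. The helper name is hypothetical, and `MemoryError` stands in for the CUDA out-of-memory exception a real implementation would catch:

```python
def vae_decode_with_fallback(decode, decode_tiled, samples):
    """Try a regular VAE decode; if it runs out of memory,
    retry with tiled decoding (slower, but needs far less VRAM)."""
    try:
        return decode(samples)
    except MemoryError:  # stand-in for torch.cuda.OutOfMemoryError
        return decode_tiled(samples)
```

The design choice is to keep the fast path unconditional: users with enough VRAM never pay the tiling overhead, and the tiled path only runs when the full decode has actually failed.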
comfyanonymous
4039616ca6
Less seams in tiled outputs at the cost of more processing.
2023-03-22 03:29:09 -04:00
comfyanonymous
c692509c2b
Try to improve VAEEncode memory usage a bit.
2023-03-22 02:45:18 -04:00
comfyanonymous
9d0665c8d0
Add laptop quadro cards to fp32 list.
2023-03-21 16:57:35 -04:00
comfyanonymous
cc309568e1
Add support for locon mid weights.
2023-03-21 14:51:51 -04:00
comfyanonymous
edfc4ca663
Try to fix a vram issue with controlnets.
2023-03-19 10:50:38 -04:00
comfyanonymous
b4b21be707
Fix area composition feathering not working properly.
2023-03-19 02:00:52 -04:00
comfyanonymous
50099bcd96
Support multiple paths for embeddings.
2023-03-18 03:08:43 -04:00
comfyanonymous
2e73367f45
Merge T2IAdapterLoader and ControlNetLoader.
...
Workflows will be auto updated.
2023-03-17 18:17:59 -04:00
comfyanonymous
ee46bef03a
Make --cpu have priority over everything else.
2023-03-13 21:30:01 -04:00
comfyanonymous
0e836d525e
use half() on fp16 models loaded with config.
2023-03-13 21:12:48 -04:00
comfyanonymous
986dd820dc
Use half() function on model when loading in fp16.
2023-03-13 20:58:09 -04:00
comfyanonymous
54dbfaf2ec
Remove omegaconf dependency and some ci changes.
2023-03-13 14:49:18 -04:00
comfyanonymous
83f23f82b8
Add pytorch attention support to VAE.
2023-03-13 12:45:54 -04:00
comfyanonymous
a256a2abde
--disable-xformers should not even try to import xformers.
2023-03-13 11:36:48 -04:00
comfyanonymous
0f3ba7482f
Xformers is now properly disabled when --cpu used.
...
Added --windows-standalone-build option; currently it only makes the code
open ComfyUI in the browser.
2023-03-12 15:44:16 -04:00
comfyanonymous
e33dc2b33b
Add a VAEEncodeTiled node.
2023-03-11 15:28:15 -05:00
comfyanonymous
1de86851b1
Try to fix memory issue.
2023-03-11 15:15:13 -05:00
comfyanonymous
2b1fce2943
Make tiled_scale work for downscaling.
2023-03-11 14:58:55 -05:00
comfyanonymous
9db2e97b47
Tiled upscaling with the upscale models.
2023-03-11 14:04:13 -05:00
comfyanonymous
cd64111c83
Add locon support.
2023-03-09 21:41:24 -05:00
comfyanonymous
c70f0ac64b
SD2.x controlnets now work.
2023-03-08 01:13:38 -05:00
comfyanonymous
19415c3ace
Relative imports to test something.
2023-03-07 11:00:35 -05:00
edikius
165be5828a
Fixed import ( #44 )
...
* fixed import error
I had an
ImportError: cannot import name 'Protocol' from 'typing'
while trying to update, so I fixed it so the app starts
* Update main.py
* deleted example files
2023-03-06 11:41:40 -05:00
comfyanonymous
501f19eec6
Fix clip_skip no longer being loaded from yaml file.
2023-03-06 11:34:02 -05:00
comfyanonymous
afff30fc0a
Add --cpu to use the cpu for inference.
2023-03-06 10:50:50 -05:00
comfyanonymous
47acb3d73e
Implement support for t2i style model.
...
It needs the CLIPVision model so I added CLIPVisionLoader and CLIPVisionEncode.
Put the clip vision model in models/clip_vision
Put the t2i style model in models/style_models
StyleModelLoader to load it, StyleModelApply to apply it
ConditioningAppend to append the conditioning it outputs to a positive one.
2023-03-05 18:39:25 -05:00
comfyanonymous
cc8baf1080
Make VAE use common function to get free memory.
2023-03-05 14:20:07 -05:00
comfyanonymous
798c90e1c0
Fix pytorch 2.0 cross attention not working.
2023-03-05 14:14:54 -05:00
comfyanonymous
16130c7546
Add support for new colour T2I adapter model.
2023-03-03 19:13:07 -05:00
comfyanonymous
9d00235b41
Update T2I adapter code to latest.
2023-03-03 18:46:49 -05:00
comfyanonymous
ebfcf0a9c9
Fix issue.
2023-03-03 13:18:01 -05:00