Commit Graph

168 Commits

Author SHA1 Message Date
comfyanonymous 5282f56434 Implement Linear hypernetworks.
Add a HypernetworkLoader node to use hypernetworks.
2023-04-23 12:35:25 -04:00
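A linear hypernetwork of the kind this commit describes is, roughly, a small pair of linear layers whose output is added to the context that feeds the attention keys and values. A minimal pure-Python sketch of that idea (the shapes, names, and residual form are assumptions for illustration, not ComfyUI's actual code):

```python
def matvec(m, v):
    # Multiply matrix m (a list of rows) by vector v.
    return [sum(w * x for w, x in zip(row, v)) for row in m]

def hypernetwork_apply(context, w1, w2):
    # Two stacked linear layers; the result is added residually, so a
    # zero-initialized second layer leaves the base model unchanged.
    hidden = matvec(w1, context)
    delta = matvec(w2, hidden)
    return [c + d for c, d in zip(context, delta)]
```

The residual form is why a freshly initialized hypernetwork (second layer at zero) is a no-op on the base model.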
comfyanonymous 6908f9c949 This makes pytorch2.0 attention perform a bit faster. 2023-04-22 14:30:39 -04:00
comfyanonymous 907010e082 Remove some useless code. 2023-04-20 23:58:25 -04:00
comfyanonymous 96b57a9ad6 Don't pass adm to model when it doesn't support it. 2023-04-19 21:11:38 -04:00
comfyanonymous 3696d1699a Add support for GLIGEN textbox model. 2023-04-19 11:06:32 -04:00
comfyanonymous 884ea653c8 Add a way for nodes to set a custom CFG function. 2023-04-17 11:05:15 -04:00
comfyanonymous 73c3e11e83 Fix model_management import so it doesn't get executed twice. 2023-04-15 19:04:33 -04:00
comfyanonymous 81d1f00df3 Some refactoring: from_tokens -> encode_from_tokens 2023-04-15 18:46:58 -04:00
comfyanonymous 719c26c3c9 Merge branch 'master' of https://github.com/BlenderNeko/ComfyUI 2023-04-15 14:16:50 -04:00
BlenderNeko d0b1b6c6bf fixed improper padding 2023-04-15 19:38:21 +02:00
comfyanonymous deb2b93e79 Move code to empty gpu cache to model_management.py 2023-04-15 11:19:07 -04:00
comfyanonymous 04d9bc13af Safely load pickled embeds that don't load with weights_only=True. 2023-04-14 15:33:43 -04:00
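The usual pattern for loading a pickle that `weights_only=True` rejects, without allowing arbitrary code execution, is a restricted unpickler that only resolves an allowlist of globals. A generic stdlib sketch of that pattern (the allowlist here is illustrative, not ComfyUI's):

```python
import io
import pickle

class RestrictedUnpickler(pickle.Unpickler):
    # Only these (module, name) pairs may be resolved during unpickling.
    ALLOWED = {("collections", "OrderedDict")}

    def find_class(self, module, name):
        if (module, name) in self.ALLOWED:
            return super().find_class(module, name)
        raise pickle.UnpicklingError(f"blocked global: {module}.{name}")

def safe_loads(data: bytes):
    return RestrictedUnpickler(io.BytesIO(data)).load()
```

Anything outside the allowlist makes the load fail loudly instead of importing attacker-chosen code.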
BlenderNeko da115bd78d ensure backwards compat with optional args 2023-04-14 21:16:55 +02:00
BlenderNeko 752f7a162b align behavior with old tokenize function 2023-04-14 21:02:45 +02:00
comfyanonymous 334aab05e5 Don't stop workflow if loading embedding fails. 2023-04-14 13:54:00 -04:00
BlenderNeko 73175cf58c split tokenizer from encoder 2023-04-13 22:06:50 +02:00
BlenderNeko 8489cba140 add unique ID per word/embedding for tokenizer 2023-04-13 22:01:01 +02:00
comfyanonymous 92eca60ec9 Fix for new transformers version. 2023-04-09 15:55:21 -04:00
comfyanonymous 1e1875f674 Print xformers version and warning about 0.0.18 2023-04-09 01:31:47 -04:00
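Version gating like this usually reduces to parsing the installed version into a tuple and comparing it against the known-bad release. A minimal sketch (the warning text is illustrative; only "0.0.18 is the release being warned about" comes from the commit):

```python
import re

def parse_version(v: str):
    # "0.0.18+cu118" -> (0, 0, 18); stop at the first non-numeric segment.
    parts = []
    for p in v.split("."):
        m = re.match(r"\d+", p)
        if not m:
            break
        parts.append(int(m.group()))
    return tuple(parts)

def xformers_warning(version: str):
    if parse_version(version)[:3] == (0, 0, 18):
        return "WARNING: xformers 0.0.18 is known to cause issues; consider a different version."
    return None
```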
comfyanonymous 7e254d2f69 Clarify what --windows-standalone-build does. 2023-04-07 15:52:56 -04:00
comfyanonymous 44fea05064 Cleanup. 2023-04-07 02:31:46 -04:00
comfyanonymous 58ed0f2da4 Fix loading SD1.5 diffusers checkpoint. 2023-04-07 01:30:33 -04:00
comfyanonymous 8b9ac8fedb Merge branch 'master' of https://github.com/sALTaccount/ComfyUI 2023-04-07 01:03:43 -04:00
comfyanonymous 64557d6781 Add a --force-fp32 argument to force fp32 for debugging. 2023-04-07 00:27:54 -04:00
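A debugging flag like this is a plain `argparse` boolean switch. A generic sketch of the pattern (the parser wiring is illustrative, not ComfyUI's actual CLI module):

```python
import argparse

parser = argparse.ArgumentParser(description="Sketch of a precision-override flag.")
parser.add_argument("--force-fp32", action="store_true",
                    help="Force fp32 everywhere (useful when debugging precision issues).")

args = parser.parse_args(["--force-fp32"])
dtype = "fp32" if args.force_fp32 else "auto"
```

`store_true` defaults to `False`, so the flag only changes behavior when explicitly passed.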
comfyanonymous bceccca0e5 Small refactor. 2023-04-06 23:53:54 -04:00
comfyanonymous 28a7205739 Merge branch 'ipex' of https://github.com/kwaa/ComfyUI-IPEX 2023-04-06 23:45:29 -04:00
藍+85CD 05eeaa2de5 Merge branch 'master' into ipex 2023-04-07 09:11:30 +08:00
EllangoK 28fff5d1db fixes lack of support for multiple configs
also adds some metavars to argparse
2023-04-06 19:06:39 -04:00
comfyanonymous f84f2508cc Rename the cors parameter to something more verbose. 2023-04-06 15:24:55 -04:00
EllangoK 48efae1608 makes cors a cli parameter 2023-04-06 15:06:22 -04:00
EllangoK 01c1fc669f set listen flag to listen on all if specified 2023-04-06 13:19:00 -04:00
藍+85CD 3e2608e12b Fix auto lowvram detection on CUDA 2023-04-06 15:44:05 +08:00
sALTaccount 60127a8304 diffusers loader 2023-04-05 23:57:31 -07:00
藍+85CD 7cb924f684 Use separate variables instead of `vram_state` 2023-04-06 14:24:47 +08:00
藍+85CD 84b9c0ac2f Import intel_extension_for_pytorch as ipex 2023-04-06 12:27:22 +08:00
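Optional accelerator backends like IPEX are typically probed with a guarded import so machines without the package fall back cleanly. A sketch of the pattern (`pick_device` is a hypothetical helper, not part of ComfyUI):

```python
try:
    import intel_extension_for_pytorch as ipex  # noqa: F401
    xpu_available = True
except ImportError:
    xpu_available = False

def pick_device():
    # Hypothetical helper: prefer XPU only when the extension imported cleanly.
    return "xpu" if xpu_available else "cpu"
```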
EllangoK e5e587b1c0 separates out arg parser and imports args 2023-04-05 23:41:23 -04:00
藍+85CD 37713e3b0a Add basic XPU device support
closed #387
2023-04-05 21:22:14 +08:00
comfyanonymous e46b1c3034 Disable xformers in VAE when xformers == 0.0.18 2023-04-04 22:22:02 -04:00
comfyanonymous 1718730e80 Ignore embeddings when sizes don't match and print a WARNING. 2023-04-04 11:49:29 -04:00
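The guard this commit describes amounts to comparing the embedding's vector width against the model's and skipping the embedding on mismatch instead of raising. A pure-Python sketch (the function name and return-`None` convention are assumptions):

```python
def validate_embedding(embed_vectors, model_dim):
    # embed_vectors: list of per-token vectors loaded from an embedding file.
    if not embed_vectors or len(embed_vectors[0]) != model_dim:
        got = len(embed_vectors[0]) if embed_vectors else 0
        print(f"WARNING: embedding dim {got} != model dim {model_dim}; "
              "ignoring embedding.")
        return None
    return embed_vectors
```

Returning `None` instead of raising is what lets the rest of the workflow keep running.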
comfyanonymous 23524ad8c5 Remove print. 2023-04-03 22:58:54 -04:00
comfyanonymous 539ff487a8 Pull latest tomesd code from upstream. 2023-04-03 15:49:28 -04:00
comfyanonymous f50b1fec69 Add noise augmentation setting to unCLIPConditioning. 2023-04-03 13:50:29 -04:00
comfyanonymous 809bcc8ceb Add support for unCLIP SD2.x models.
See _for_testing/unclip in the UI for the new nodes.

unCLIPCheckpointLoader is used to load them.

unCLIPConditioning is used to add the image cond and takes as input a
CLIPVisionEncode output which has been moved to the conditioning section.
2023-04-01 23:19:15 -04:00
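ComfyUI passes conditioning around as a list of `[tensor, options]` pairs, so a node like unCLIPConditioning can attach the CLIPVisionEncode output to each entry's options without mutating its input. A simplified sketch (the option key and field names are assumptions based on this description, not the exact implementation):

```python
def unclip_conditioning(conditioning, clip_vision_output,
                        strength=1.0, noise_augmentation=0.0):
    out = []
    for cond, opts in conditioning:
        new_opts = dict(opts)  # copy so the input conditioning is untouched
        entries = list(new_opts.get("unclip_conditioning", []))
        entries.append({"clip_vision_output": clip_vision_output,
                        "strength": strength,
                        "noise_augmentation": noise_augmentation})
        new_opts["unclip_conditioning"] = entries
        out.append([cond, new_opts])
    return out
```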
comfyanonymous 0d972b85e6 This seems to give better quality in tome. 2023-03-31 18:36:18 -04:00
comfyanonymous 18a6c1db33 Add a TomePatchModel node to the _for_testing section.
Tome increases sampling speed at the expense of quality.
2023-03-31 17:19:58 -04:00
comfyanonymous 61ec3c9d5d Add a way to pass options to the transformers blocks. 2023-03-31 13:04:39 -04:00
comfyanonymous afd65d3819 Fix noise mask not working with > 1 batch size on ksamplers. 2023-03-30 03:50:12 -04:00
comfyanonymous b2554bc4dd Split VAE decode batches depending on free memory. 2023-03-29 02:24:37 -04:00
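Splitting a VAE decode by free memory reduces to computing how many images fit at once and chunking the batch accordingly. A sketch with an assumed per-image memory cost (the accounting is illustrative, not ComfyUI's actual estimate):

```python
def split_batches(total_images, free_memory, per_image_memory):
    # How many images fit at once; always at least one so we still attempt it.
    batch = max(1, min(total_images, free_memory // per_image_memory))
    return [min(batch, total_images - i) for i in range(0, total_images, batch)]
```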
comfyanonymous 0d65cb17b7 Fix ddim_uniform crashing with 37 steps. 2023-03-28 16:29:35 -04:00
Francesco Yoshi Gobbo f55755f0d2 code cleanup 2023-03-27 06:48:09 +02:00