comfyanonymous
754597c8a9
Clean up some controlnet code.
...
Remove self.device which was useless.
2024-10-23 14:19:05 -04:00
comfyanonymous
915fdb5745
Fix lowvram edge case.
2024-10-22 16:34:50 -04:00
contentis
5a8a48931a
remove attention abstraction ( #5324 )
2024-10-22 14:02:38 -04:00
comfyanonymous
8ce2a1052c
Optimizations to --fast and scaled fp8.
2024-10-22 02:12:28 -04:00
comfyanonymous
f82314fcfc
Fix duplicate sigmas on beta scheduler.
2024-10-21 20:19:45 -04:00
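A minimal sketch of the kind of deduplication this fix implies, assuming a 1-D tensor of sigmas where neighbouring steps can collapse to the same value (the helper name dedupe_sigmas is made up):

```python
import torch

def dedupe_sigmas(sigmas: torch.Tensor) -> torch.Tensor:
    # Drop consecutive duplicate values so no sampling step has zero length.
    keep = torch.ones_like(sigmas, dtype=torch.bool)
    keep[1:] = sigmas[1:] != sigmas[:-1]
    return sigmas[keep]

# e.g. a beta schedule that quantized two neighbouring steps to the same value:
dedupe_sigmas(torch.tensor([14.6, 7.3, 7.3, 3.1, 0.0]))  # -> tensor([14.6, 7.3, 3.1, 0.0])
```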
comfyanonymous
0075c6d096
Mixed precision diffusion models with scaled fp8.
...
This change adds support for diffusion models where all the linears are
scaled fp8 while the other weights stay in their original precision.
2024-10-21 18:12:51 -04:00
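A rough sketch of the idea, assuming each scaled-fp8 linear keeps an fp8 weight plus a per-tensor scale and dequantizes on the fly while everything that isn't a linear stays in its original dtype; the class and attribute names are illustrative, not ComfyUI's actual implementation:

```python
import torch
import torch.nn.functional as F

class ScaledFP8Linear(torch.nn.Module):
    """Illustrative only: weight kept in fp8, dequantized with a per-tensor scale at call time."""
    def __init__(self, weight_fp8: torch.Tensor, scale: torch.Tensor, bias=None):
        super().__init__()
        self.register_buffer("weight_fp8", weight_fp8)  # dtype: torch.float8_e4m3fn
        self.register_buffer("scale", scale)            # float32 scalar
        self.bias = bias

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Upcast the stored fp8 weight to the activation dtype and undo the quantization scale;
        # non-linear weights (norms, embeddings, ...) keep their original precision elsewhere.
        w = self.weight_fp8.to(x.dtype) * self.scale.to(x.dtype)
        return F.linear(x, w, self.bias)
```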
comfyanonymous
83ca891118
Support scaled fp8 t5xxl model.
2024-10-20 22:27:00 -04:00
comfyanonymous
f9f9faface
Fixed model merging issue with scaled fp8.
2024-10-20 06:24:31 -04:00
comfyanonymous
471cd3eace
fp8 casting is fast on GPUs that support fp8 compute.
2024-10-20 00:54:47 -04:00
comfyanonymous
a68bbafddb
Support diffusion models with scaled fp8 weights.
2024-10-19 23:47:42 -04:00
comfyanonymous
73e3a9e676
Clamp output when rounding weight to prevent NaN.
2024-10-19 19:07:10 -04:00
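A hedged sketch of why the clamp matters: values outside the fp8 representable range turn into NaN when cast, so they are clamped to the dtype's finite limits first (the function name is made up):

```python
import torch

def round_weight_to_fp8(weight: torch.Tensor, dtype=torch.float8_e4m3fn) -> torch.Tensor:
    finfo = torch.finfo(dtype)
    # Values outside the fp8 representable range (±448 for e4m3fn) would cast to NaN,
    # so clamp to the dtype's finite limits before converting.
    return torch.clamp(weight, min=finfo.min, max=finfo.max).to(dtype)
```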
comfyanonymous
67158994a4
Use the lowvram cast_to function for everything.
2024-10-17 17:25:56 -04:00
comfyanonymous
0bedfb26af
Revert "Fix Transformers FutureWarning ( #5140 )"
...
This reverts commit 95b7cf9bbe.
2024-10-16 12:36:19 -04:00
comfyanonymous
f584758271
Cleanup some useless lines.
2024-10-14 21:02:39 -04:00
svdc
95b7cf9bbe
Fix Transformers FutureWarning ( #5140 )
...
* Update sd1_clip.py
Fix Transformers FutureWarning
* Update sd1_clip.py
Fix comment
2024-10-14 20:12:20 -04:00
comfyanonymous
3c60ecd7a8
Fix fp8 ops staying enabled.
2024-10-12 14:10:13 -04:00
comfyanonymous
7ae6626723
Remove useless argument.
2024-10-12 07:16:21 -04:00
comfyanonymous
6632365e16
model_options consistency between functions.
...
weight_dtype -> dtype
2024-10-11 20:51:19 -04:00
Kadir Nar
ad07796777
🐛 Add device to variable c ( #5210 )
2024-10-11 20:37:50 -04:00
Jedrzej Kosinski
1f8d9c040b
Fixed models not being unloaded properly due to a lingering current_patcher reference; the current ComfyUI model cleanup code requires that nothing else holds a reference to the ModelPatcher instances.
2024-10-11 06:50:55 -05:00
comfyanonymous
1b80895285
Make clip loader nodes support loading sd3 t5xxl in lower precision.
...
Add attention mask support in the SD3 text encoder code.
2024-10-10 15:06:15 -04:00
Dr.Lt.Data
5f9d5a244b
Hotfix for the division-by-zero that occurs when memory_used_encode is 0 ( #5121 )
...
https://github.com/comfyanonymous/ComfyUI/issues/5069#issuecomment-2382656368
2024-10-09 23:34:34 -04:00
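A minimal illustration of the guard such a hotfix implies, assuming memory_used_encode feeds a division somewhere in the VAE encode path; every name and number below is a placeholder:

```python
# Placeholder numbers and names; the real estimate comes from the VAE encode path.
memory_used_encode = 0
free_memory = 8 * 1024 ** 3
batch_count = free_memory // max(memory_used_encode, 1)  # guard against dividing by a zero estimate
```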
Jonathan Avila
4b2f0d9413
Increase maximum macOS version to 15.0.1 when forcing upcast attention ( #5191 )
2024-10-09 22:21:41 -04:00
comfyanonymous
e38c94228b
Add a weight_dtype fp8_e4m3fn_fast to the Diffusion Model Loader node.
...
This is used to load weights in fp8 and use fp8 matrix multiplication.
2024-10-09 19:43:17 -04:00
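A rough sketch of the gating a "fast" fp8 weight dtype implies: store the weights in fp8 and only run the matmul natively in fp8 on GPUs with fp8 tensor cores, falling back otherwise. The capability check below is an assumption for illustration, not ComfyUI's exact detection logic:

```python
import torch

def supports_fp8_compute(device: torch.device) -> bool:
    # Native fp8 matmul needs Ada (sm_89) or newer GPUs; everything else should upcast instead.
    if device.type != "cuda" or not torch.cuda.is_available():
        return False
    major, minor = torch.cuda.get_device_capability(device)
    return (major, minor) >= (8, 9)

device = torch.device("cuda")
weight_dtype = torch.float8_e4m3fn if supports_fp8_compute(device) else torch.float16
```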
comfyanonymous
203942c8b2
Fix flux doras with diffusers keys.
2024-10-08 19:03:40 -04:00
Jedrzej Kosinski
1e2777bab1
Added uuid to conds in CFGGuider and uuids to transformer_options to allow uniquely identifying conds in batches during sampling
2024-10-08 17:52:01 -05:00
comfyanonymous
8dfa0cc552
Make SD3 fast previews a little better.
2024-10-07 09:19:59 -04:00
Jedrzej Kosinski
4fdfe2f704
Merge branch 'master' into patch_hooks
2024-10-07 04:29:16 -05:00
comfyanonymous
e5ecdfdd2d
Make fast previews for SDXL a little better by adding a bias.
2024-10-06 19:27:04 -04:00
comfyanonymous
7d29fbf74b
Slightly improve the fast previews for flux by adding a bias.
2024-10-06 17:55:46 -04:00
comfyanonymous
7d2467e830
Some minor cleanups.
2024-10-05 13:22:39 -04:00
Jedrzej Kosinski
06fbdb03ef
Merge branch 'master' into patch_hooks
2024-10-05 06:43:43 -05:00
comfyanonymous
6f021d8aa0
Let --verbose have an argument for the log level.
2024-10-04 10:05:34 -04:00
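A hedged sketch of how an optional log-level argument on --verbose can be expressed with argparse; the exact flag behaviour here is an assumption, not the real ComfyUI CLI definition:

```python
import argparse
import logging

parser = argparse.ArgumentParser()
# Bare `--verbose` keeps the old behaviour (DEBUG); `--verbose WARNING` selects a specific level.
parser.add_argument("--verbose", nargs="?", const="DEBUG", default="INFO",
                    choices=["DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"])
args = parser.parse_args(["--verbose", "WARNING"])
logging.basicConfig(level=getattr(logging, args.verbose))
```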
comfyanonymous
d854ed0bcf
Allow using SD3 type te output on flux model.
2024-10-03 09:44:54 -04:00
comfyanonymous
abcd006b8c
Allow more permutations of clip/t5 in dual clip loader.
2024-10-03 09:26:11 -04:00
comfyanonymous
d985d1d7dc
CLIP Loader node now supports clip_l and clip_g only for SD3.
2024-10-02 04:25:17 -04:00
comfyanonymous
d1cdf51e1b
Refactor some of the TE detection code.
2024-10-01 07:08:41 -04:00
comfyanonymous
b4626ab93e
Add simpletuner lycoris format for SD unet.
2024-09-30 06:03:27 -04:00
comfyanonymous
a9e459c2a4
Use torch.nn.functional.linear in RGB preview code.
...
Add an optional bias to the latent RGB preview code.
2024-09-29 11:27:49 -04:00
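A minimal sketch of a latent-to-RGB preview as a plain linear projection with an optional bias, which is essentially what this change describes; shapes and normalization details are assumptions:

```python
import torch
import torch.nn.functional as F

def latent_to_rgb_preview(latent, rgb_factors, bias=None):
    # latent: [B, C, H, W]; rgb_factors: [3, C]; bias: [3] or None.
    # One linear projection per pixel maps latent channels to RGB.
    rgb = F.linear(latent.movedim(1, -1), rgb_factors, bias)  # [B, H, W, 3]
    return rgb.movedim(-1, 1)  # back to [B, 3, H, W]; range handling is omitted here
```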
comfyanonymous
3bb4dec720
Fix issue with loras, lowvram and --fast fp8.
2024-09-28 14:42:32 -04:00
City
8733191563
Flux torch.compile fix ( #5082 )
2024-09-27 22:07:51 -04:00
kosinkadink1@gmail.com
0c8bd63aa9
Added Combine versions of Cond/Cond Pair Set Props nodes, renamed Pair Cond to Cond Pair, fixed default conds never applying hooks (due to hooks key typo)
2024-09-27 14:42:50 +09:00
kosinkadink1@gmail.com
0f7d379d24
Refactored WrapperExecutor code to remove need for WrapperClassExecutor (now gone), added sampler.sample wrapper (pending review, will likely keep but will see what hacks this could currently let me get rid of in ACN/ADE)
2024-09-27 12:14:36 +09:00
kosinkadink1@gmail.com
09cbd69161
Added create_model_options_clone func, modified type annotations to use __future__ so that the newer type annotation syntax can be used
2024-09-25 20:29:49 +09:00
kosinkadink1@gmail.com
fd2d572447
Modified ControlNet/T2IAdapter get_control function to receive transformer_options as additional parameter, made the model_options stored in extra_args in inner_sample be a clone of the original model_options instead of same ref
2024-09-25 19:46:33 +09:00
comfyanonymous
bdd4a22a2e
Fix flux TE not loading t5 embeddings.
2024-09-24 22:57:22 -04:00
kosinkadink1@gmail.com
d3229cbba7
Implement basic MemoryCounter system for determining whether cached weights from hooks should be offloaded to hooks_backup
2024-09-24 17:28:18 +09:00
kosinkadink1@gmail.com
c422553b0b
Added get_attachment func on ModelPatcher
2024-09-24 16:20:53 +09:00
chaObserv
479a427a48
Add dpmpp_2m_cfg_pp ( #4992 )
2024-09-24 02:42:56 -04:00
kosinkadink1@gmail.com
da6c0455cc
Added forward_timestep_embed_patch type, added helper functions on ModelPatcher for emb_patch and forward_timestep_embed_patch, added helper functions for removing callbacks/wrappers/additional_models by key, added custom_should_register prop to hooks
2024-09-24 12:40:54 +09:00
comfyanonymous
3a0eeee320
Make --listen listen on both IPv4 and IPv6 at the same time by default.
2024-09-23 04:38:19 -04:00
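A hedged sketch of what listening on both stacks can look like at the socket level; the real server goes through its web framework, and whether it uses one dual-stack socket or two separate ones is an implementation detail not shown in the commit:

```python
import socket

# Assumption for illustration: one IPv6 socket with IPV6_V6ONLY disabled accepts both
# native IPv6 clients and IPv4 clients (as ::ffff:a.b.c.d) on the same port.
sock = socket.socket(socket.AF_INET6, socket.SOCK_STREAM)
sock.setsockopt(socket.IPPROTO_IPV6, socket.IPV6_V6ONLY, 0)
sock.bind(("::", 8188))
sock.listen()
```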
comfyanonymous
9c41bc8d10
Remove useless line.
2024-09-23 02:32:29 -04:00
kosinkadink1@gmail.com
7c86407619
Refactored callbacks+wrappers to allow storing lists by id
2024-09-22 16:36:40 +09:00
comfyanonymous
7a415f47a9
Add an optional VAE input to the ControlNetApplyAdvanced node.
...
Deprecate the other controlnet nodes.
2024-09-22 01:24:52 -04:00
kosinkadink1@gmail.com
a154d0df23
Merge branch 'master' into patch_hooks
2024-09-22 11:54:55 +09:00
kosinkadink1@gmail.com
5052a78be2
Added WrapperExecutor for non-classbound functions, added calc_cond_batch wrappers
2024-09-22 11:52:35 +09:00
kosinkadink1@gmail.com
298397d198
Updated clone_has_same_weights function to account for new ModelPatcher properties, improved AutoPatcherEjector usage in partially_load
2024-09-21 21:50:51 +09:00
comfyanonymous
dc96a1ae19
Load controlnet in fp8 if weights are in fp8.
2024-09-21 04:50:12 -04:00
kosinkadink1@gmail.com
f28d892c16
Fix skip_until_exit logic bug breaking injection after first run of model
2024-09-21 16:34:40 +09:00
comfyanonymous
2d810b081e
Add load_controlnet_state_dict function.
2024-09-21 01:51:51 -04:00
comfyanonymous
9f7e9f0547
Add an error message when a controlnet needs a VAE but none is given.
2024-09-21 01:33:18 -04:00
kosinkadink1@gmail.com
5f450d3351
Started scaffolding for other hook types, refactored get_hooks_from_cond to organize hooks by type
2024-09-21 10:37:18 +09:00
kosinkadink1@gmail.com
59d72b4050
Added wrappers to ModelPatcher to facilitate standardized function wrapping
2024-09-20 20:05:29 +09:00
comfyanonymous
70a708d726
Fix model merging issue.
2024-09-20 02:31:44 -04:00
yoinked
e7d4782736
add laplace scheduler [2407.03297] ( #4990 )
...
* add laplace scheduler [2407.03297]
* should be here instead lol
* better settings
2024-09-19 23:23:09 -04:00
kosinkadink1@gmail.com
55014293b1
Added injections support to ModelPatcher + necessary bookkeeping, added additional_models support in ModelPatcher, conds, and hooks
2024-09-19 21:43:58 +09:00
comfyanonymous
ad66f7c7d8
Add model_options to load_controlnet function.
2024-09-19 08:23:35 -04:00
Simon Lui
de8e8e3b0d
Stop the xpu PyTorch nightly build from calling optimize, which doesn't exist. ( #4978 )
2024-09-19 05:11:42 -04:00
kosinkadink1@gmail.com
e80dc96627
Fix incorrect ref to create_hook_patches_clone after moving function
2024-09-19 11:57:19 +09:00
kosinkadink1@gmail.com
787ef34842
Continued work on simpler Create Hook Model As LoRA node, started to implement ModelPatcher callbacks, attachments, and additional_models
2024-09-19 11:47:25 +09:00
pharmapsychotic
0b7dfa986d
Improve tiling calculations to reduce the number of tiles that need to be processed. ( #4944 )
2024-09-17 03:51:10 -04:00
comfyanonymous
d514bb38ee
Add some options to model_options for the text encoder.
...
load_device, offload_device and the initial_device can now be set.
2024-09-17 03:49:54 -04:00
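A sketch of what setting these options could look like; the three keys come straight from the commit message, while the device choices are placeholders:

```python
import torch

# The three keys named in the commit; the loader call itself is left out because its
# exact signature is not shown here.
model_options = {
    "load_device": torch.device("cuda:0"),   # device the text encoder runs on while encoding
    "offload_device": torch.device("cpu"),   # device it is moved to when idle
    "initial_device": torch.device("cpu"),   # device the weights are first created on
}
```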
kosinkadink1@gmail.com
6b14fc8795
Merge branch 'master' into patch_hooks
2024-09-17 15:31:03 +09:00
kosinkadink1@gmail.com
c29006e669
Initial work on adding 'model_as_lora' lora type to calculate_weight
2024-09-17 15:30:33 +09:00
comfyanonymous
0849c80e2a
get_key_patches now works without unloading the model.
2024-09-17 01:57:59 -04:00
kosinkadink1@gmail.com
cfb145187d
Made Set Clip Hooks node work with hooks from Create Hook nodes, began work on better Create Hook Model As LoRA node
2024-09-17 09:55:14 +09:00
kosinkadink1@gmail.com
4b472ba44c
Added support for adding weight hooks that aren't registered on the ModelPatcher at sampling time
2024-09-17 06:22:41 +09:00
kosinkadink1@gmail.com
f5c899f42a
Fixed MaxSpeed and default conds implementations
2024-09-15 21:00:45 +09:00
comfyanonymous
e813abbb2c
Long CLIP L support for SDXL, SD3 and Flux.
...
Use the *CLIPLoader nodes.
2024-09-15 07:59:38 -04:00
kosinkadink1@gmail.com
5a9aa5817c
Added initial hook scheduling nodes, small renaming/refactoring
2024-09-15 18:39:31 +09:00
kosinkadink1@gmail.com
a5034df6db
Made CLIP work with hook patches
2024-09-15 15:47:09 +09:00
kosinkadink1@gmail.com
9ded65a616
Added initial set of hook-related nodes, added code to register hooks for loras/model-as-loras, small renaming/refactoring
2024-09-15 08:33:17 +09:00
comfyanonymous
f48e390032
Support AliMama SD3 and Flux inpaint controlnets.
...
Use the ControlNetInpaintingAliMamaApply node.
2024-09-14 09:05:16 -04:00
kosinkadink1@gmail.com
f5abdc6f86
Merge branch 'master' into patch_hooks
2024-09-14 17:29:30 +09:00
kosinkadink1@gmail.com
5dadd97583
Added default_conds support in calc_cond_batch func
2024-09-14 17:21:50 +09:00
kosinkadink1@gmail.com
f160d46340
Added call to initialize_timesteps on hooks in process_conds func, and added a call to prepare the current keyframe on hooks in calc_cond_batch
2024-09-14 16:10:42 +09:00
kosinkadink1@gmail.com
1268d04295
Consolidated add_hook_patches_as_diffs into add_hook_patches func, fixed fp8 support for model-as-lora feature
2024-09-14 14:09:43 +09:00
comfyanonymous
cf80d28689
Support loading controlnets with different inputs.
2024-09-13 09:54:37 -04:00
kosinkadink1@gmail.com
9ae758175d
Added current_patcher property to BaseModel
2024-09-13 21:35:35 +09:00
kosinkadink1@gmail.com
3cbd40ada3
Initial changes to calc_cond_batch to eventually support hook_patches
2024-09-13 18:31:52 +09:00
kosinkadink1@gmail.com
069ec7a64b
Added hook_patches to ModelPatcher for weights (model)
2024-09-13 17:20:22 +09:00
Robin Huang
b962db9952
Add cli arg to override user directory ( #4856 )
...
* Override user directory.
* Use overridden user directory.
* Remove prints.
* Remove references to global user_files.
* Remove unused replace_folder function.
* Remove newline.
* Remove global during get_user_directory.
* Add validation.
2024-09-12 08:10:27 -04:00
comfyanonymous
9d720187f1
types -> comfy_types to fix import issue.
2024-09-12 03:57:46 -04:00
comfyanonymous
9f4daca9d9
It doesn't really make sense for the cfg_pp sampler to call the regular one.
2024-09-11 02:51:36 -04:00
yoinked
b5d0f2a908
Add CFG++ to DPM++ 2S Ancestral ( #3871 )
...
* Update sampling.py
* Update samplers.py
* my bad
* "fix" the sampler
* Update samplers.py
* i named it wrong
* minor sampling improvements
mainly using a dynamic rho value (hey this sounds a lot like smea!!!)
* revert rho change
rho? r? its just 1/2
2024-09-11 02:49:44 -04:00
comfyanonymous
9c5fca75f4
Fix lora issue.
2024-09-08 10:10:47 -04:00
comfyanonymous
32a60a7bac
Support onetrainer text encoder Flux lora.
2024-09-08 09:31:41 -04:00
Jim Winkens
bb52934ba4
Fix import issue ( #4815 )
2024-09-07 05:28:32 -04:00
comfyanonymous
ea77750759
Support a generic Comfy format for text encoder loras.
...
This is a format with keys like:
text_encoders.clip_l.transformer.text_model.encoder.layers.9.self_attn.v_proj.lora_up.weight
Instead of waiting for me to add support for specific lora formats, you can convert your text encoder loras to this format.
If you want to see an example, save a text encoder lora with the SaveLora node using the commit right after this one.
2024-09-07 02:20:39 -04:00
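A hedged sketch of what "converting" a text encoder lora to this layout might involve. The input key layout below is a hypothetical PEFT/diffusers-style format; only the target key shape is taken from the commit message:

```python
from safetensors.torch import load_file, save_file

def convert_te_lora(in_path: str, out_path: str):
    # Hypothetical input layout (PEFT/diffusers-style):
    #   "text_encoder.text_model.encoder.layers.9.self_attn.v_proj.lora_A.weight"
    # Target layout, as quoted in the commit message:
    #   "text_encoders.clip_l.transformer.text_model.encoder.layers.9.self_attn.v_proj.lora_up.weight"
    sd = load_file(in_path)
    out = {}
    for key, value in sd.items():
        key = key.replace("text_encoder.", "text_encoders.clip_l.transformer.", 1)
        key = key.replace(".lora_B.weight", ".lora_up.weight")    # B is the output-side low-rank matrix
        key = key.replace(".lora_A.weight", ".lora_down.weight")  # A is the input-side low-rank matrix
        out[key] = value
    save_file(out, out_path)
```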
comfyanonymous
c27ebeb1c2
Fix onnx export not working on flux.
2024-09-06 03:21:52 -04:00