Jedrzej Kosinski
638c4086a3
Fixed inconsistent results when schedule_clip is set to False, small renaming/typo fixes, added initial support for ControlNet extra_hooks to work in tandem with normal cond hooks, initial work on calc_cond_batch merging all subdicts in the returned transformer_options
2024-11-11 08:41:08 -06:00
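A minimal sketch of the subdict merge described above, assuming transformer_options maps keys to nested dicts or lists; the helper name is illustrative and this is not the actual calc_cond_batch code.

```python
# Hypothetical illustration of merging per-cond transformer_options subdicts.
# Assumes values are either nested dicts (merged key by key) or lists (concatenated);
# anything else is simply overwritten. Not ComfyUI's actual implementation.
def merge_transformer_options(base: dict, extra: dict) -> dict:
    merged = dict(base)
    for key, value in extra.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = merge_transformer_options(merged[key], value)
        elif isinstance(value, list) and isinstance(merged.get(key), list):
            merged[key] = merged[key] + value
        else:
            merged[key] = value
    return merged
```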
Jedrzej Kosinski
9dde713347
Refactored hooks in calc_cond_batch to be part of get_area_and_mult tuple, added extra_hooks to ControlBase to allow custom controlnets w/ hooks, small cleanup and renaming
2024-11-04 05:46:27 -06:00
Jedrzej Kosinski
0fbefb8428
Refactored code to store wrappers and callbacks in transformer_options, added apply_model and diffusion_model.forward wrappers
2024-11-03 06:22:48 -06:00
Jedrzej Kosinski
51e8d5554c
Moved WrappersMP/CallbacksMP/WrapperExecutor to patcher_extension.py
2024-11-02 22:21:16 -05:00
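The two entries above revolve around wrapping calls such as apply_model and diffusion_model.forward. As a rough illustration of that pattern, here is a generic nested-wrapper executor; the class and method names are assumptions and do not mirror patcher_extension.py.

```python
# Generic sketch of chaining wrappers around an inner callable. Each wrapper receives
# a callable for the rest of the chain plus the original arguments. Names are illustrative.
class ChainExecutor:
    def __init__(self, inner, wrappers):
        self.inner = inner          # the original function, e.g. a model's forward
        self.wrappers = wrappers    # callables of the form wrapper(call_next, *args, **kwargs)

    def execute(self, *args, **kwargs):
        return self._call(0, *args, **kwargs)

    def _call(self, index, *args, **kwargs):
        if index >= len(self.wrappers):
            return self.inner(*args, **kwargs)
        call_next = lambda *a, **kw: self._call(index + 1, *a, **kw)
        return self.wrappers[index](call_next, *args, **kwargs)

def logging_wrapper(call_next, x):
    print("before")          # runs before the wrapped call
    out = call_next(x)
    print("after")           # runs after the wrapped call
    return out

print(ChainExecutor(lambda x: x * 2, [logging_wrapper]).execute(5))  # before, after, 10
```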
Jedrzej Kosinski
e3c3722ec2
Merge branch 'improved_memory' into patch_hooks_improved_memory
2024-11-01 06:02:46 -05:00
comfyanonymous
bd5d8f150f
Prevent and detect some types of memory leaks.
2024-11-01 06:55:42 -04:00
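One common way to detect this class of leak (an object kept alive by a stray reference after it should have been freed) is to hold a weak reference and check it after a garbage-collection pass; the sketch below is a generic illustration, not the check this commit adds.

```python
import gc
import weakref

def watch(obj):
    """Take a weak reference to an object we expect to be freed later."""
    return weakref.ref(obj)

def warn_if_leaked(ref, label="model"):
    """After all strong references should be gone, force a GC pass and warn if the
    watched object is still alive. Generic leak-detection sketch, not ComfyUI's code."""
    gc.collect()
    target = ref()
    if target is not None:
        print(f"Possible memory leak: {label} is still alive; "
              f"{len(gc.get_referrers(target))} referrer(s) found.")
```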
Jedrzej Kosinski
89934a465a
Merge branch 'improved_memory' into patch_hooks_improved_memory
2024-11-01 03:59:43 -05:00
Jedrzej Kosinski
45f16c2dad
Merge branch 'improved_memory' into patch_hooks_improved_memory
2024-11-01 03:58:17 -05:00
comfyanonymous
975927cc79
Remove useless function.
2024-11-01 04:40:33 -04:00
comfyanonymous
1735d4fb01
Fix issue.
2024-11-01 04:25:27 -04:00
Jedrzej Kosinski
16735c98e0
Make encode_from_tokens_scheduled call cleaner, rollback change in model_patcher.py for hook_patches_backup dict
2024-11-01 02:01:45 -05:00
comfyanonymous
d8bd2a9baa
Less fragile memory management.
2024-11-01 02:41:51 -04:00
Jedrzej Kosinski
489846905e
Optimized CLIP hook scheduling to treat same strength as same keyframe
2024-10-31 19:28:16 -05:00
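A hedged sketch of the optimization described: consecutive scheduled keyframes with the same strength can be collapsed, so the text encoder is not re-run for a range that would produce an identical result. The data layout (start_percent, strength) is an assumption.

```python
# Illustrative only: keyframes as (start_percent, strength) tuples sorted by start.
# Consecutive entries with equal strength are merged, since they would yield the
# same encoded cond and can share a single keyframe.
def collapse_keyframes(keyframes):
    collapsed = []
    for start, strength in keyframes:
        if collapsed and collapsed[-1][1] == strength:
            continue  # same strength as the previous keyframe: nothing new to encode
        collapsed.append((start, strength))
    return collapsed

# collapse_keyframes([(0.0, 1.0), (0.25, 1.0), (0.5, 0.5)]) -> [(0.0, 1.0), (0.5, 0.5)]
```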
Jedrzej Kosinski
3bcbcce88d
Merge branch 'master' into patch_hooks
2024-10-31 18:34:05 -05:00
Aarni Koskela
1c8286a44b
Avoid SyntaxWarning in UniPC docstring (#5442)
2024-10-31 15:17:26 -04:00
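For context on the fix above: recent Python versions warn about unrecognized backslash escapes in ordinary string literals, docstrings included, and marking the docstring as a raw string avoids it. The snippet below is a generic illustration, not the UniPC code.

```python
def bad():
    """Contains \sigma in a plain docstring, which triggers an invalid-escape warning
    (a SyntaxWarning on newer Python versions)."""

def good():
    r"""Contains \sigma in a raw docstring, so the backslash is kept literally and no warning is emitted."""
```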
comfyanonymous
1af4a47fd1
Bump up mac version for attention upcast bug workaround.
2024-10-31 15:15:31 -04:00
comfyanonymous
daa1565b93
Fix diffusers flux controlnet regression.
2024-10-30 13:11:34 -04:00
Jedrzej Kosinski
7a4d2fe523
Fix range check in get_hooks_for_clip_schedule so that proper keyframes get assigned to corresponding ranges
2024-10-30 05:37:30 -05:00
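A generic sketch of the boundary check involved: each scheduled cond range should pick up the keyframe whose start percent is the latest one not exceeding the range's start. Names and data shapes are assumptions, not the get_hooks_for_clip_schedule code.

```python
# Illustrative: keyframes as (start_percent, payload) tuples sorted ascending by start,
# assumed non-empty. A range beginning at range_start uses the last keyframe whose
# start_percent is <= range_start, i.e. the keyframe active when the range begins.
def keyframe_for_range(keyframes, range_start):
    active = keyframes[0]
    for kf in keyframes:
        if kf[0] <= range_start:
            active = kf
        else:
            break
    return active

# keyframe_for_range([(0.0, "A"), (0.5, "B")], 0.49) -> (0.0, "A")
# keyframe_for_range([(0.0, "A"), (0.5, "B")], 0.50) -> (0.5, "B")
```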
Jedrzej Kosinski
d5169df808
Added initial support within the CLIP Text Encode (Prompt) node for scheduling weight hook CLIP strength via clip_start_percent/clip_end_percent on conds, added a schedule_clip toggle to the Set CLIP Hooks node, small cleanup/fixes
2024-10-30 04:56:09 -05:00
comfyanonymous
09fdb2b269
Support SD3.5 medium diffusers format weights and loras.
2024-10-30 04:24:00 -04:00
comfyanonymous
30c0c81351
Add a way to patch blocks in SD3.
2024-10-29 00:48:32 -04:00
comfyanonymous
13b0ff8a6f
Update SD3 code.
2024-10-28 21:58:52 -04:00
comfyanonymous
c320801187
Remove useless line.
2024-10-28 17:41:12 -04:00
comfyanonymous
669d9e4c67
Set default shift on mochi to 6.0
2024-10-27 22:21:04 -04:00
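The shift here is the sigma/timestep shift used by flow-matching models; a commonly used form, assumed here rather than copied from ComfyUI's model_sampling code, maps sigma to shift * sigma / (1 + (shift - 1) * sigma).

```python
def shift_sigma(sigma: float, shift: float = 6.0) -> float:
    """Apply a flow-model sigma shift; shift=6.0 mirrors the new Mochi default.
    The formula is the standard resolution-shift mapping, assumed for illustration."""
    return shift * sigma / (1.0 + (shift - 1.0) * sigma)

# shift_sigma(0.5, 6.0) ~= 0.857: mid-schedule noise levels are pushed higher.
```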
comfyanonymous
9ee0a6553a
float16 inference is a bit broken on mochi.
2024-10-27 04:56:40 -04:00
comfyanonymous
5cbb01bc2f
Basic Genmo Mochi video model support.
To use:
"Load CLIP" node with t5xxl + type mochi
"Load Diffusion Model" node with the mochi dit file.
"Load VAE" with the mochi vae file.
EmptyMochiLatentVideo node for the latent.
euler + linear_quadratic in the KSampler node.
2024-10-26 06:54:00 -04:00
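The node list above corresponds to a small graph; a hedged API-format prompt sketch follows. The node class names, input field names, default sizes, and file names are assumptions/placeholders, not taken from the commit.

```python
# Hypothetical API-format prompt fragment for the Mochi workflow described above.
# Class names, input names, and file names are placeholders/assumptions.
mochi_prompt = {
    "1": {"class_type": "CLIPLoader",
          "inputs": {"clip_name": "t5xxl_fp16.safetensors", "type": "mochi"}},
    "2": {"class_type": "UNETLoader",
          "inputs": {"unet_name": "mochi_dit.safetensors", "weight_dtype": "default"}},
    "3": {"class_type": "VAELoader",
          "inputs": {"vae_name": "mochi_vae.safetensors"}},
    "4": {"class_type": "EmptyMochiLatentVideo",
          "inputs": {"width": 848, "height": 480, "length": 25, "batch_size": 1}},
    "5": {"class_type": "CLIPTextEncode",
          "inputs": {"clip": ["1", 0], "text": "a calm ocean at sunset"}},
    "6": {"class_type": "CLIPTextEncode",
          "inputs": {"clip": ["1", 0], "text": ""}},  # negative prompt
    "7": {"class_type": "KSampler",
          "inputs": {"model": ["2", 0], "positive": ["5", 0], "negative": ["6", 0],
                     "latent_image": ["4", 0], "seed": 0, "steps": 30, "cfg": 4.5,
                     "sampler_name": "euler", "scheduler": "linear_quadratic",
                     "denoise": 1.0}},
    # A VAEDecode node taking ["7", 0] and ["3", 0] would follow to get frames out.
}
```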
comfyanonymous
c3ffbae067
Make LatentUpscale nodes work on 3d latents.
2024-10-26 01:50:51 -04:00
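A generic sketch of the usual approach for spatially upscaling a 5D video latent (batch, channels, frames, height, width): fold the frame axis into the batch axis, run 2D interpolation, and unfold again. This illustrates the technique, not the node's actual code.

```python
import torch
import torch.nn.functional as F

def upscale_video_latent(latent: torch.Tensor, width: int, height: int,
                         mode: str = "nearest") -> torch.Tensor:
    """Spatially upscale a [B, C, T, H, W] latent by folding frames into the batch
    dimension and interpolating each frame in 2D. Illustrative sketch only."""
    b, c, t, h, w = latent.shape
    flat = latent.permute(0, 2, 1, 3, 4).reshape(b * t, c, h, w)   # [B*T, C, H, W]
    flat = F.interpolate(flat, size=(height, width), mode=mode)
    return flat.reshape(b, t, c, height, width).permute(0, 2, 1, 3, 4)

# upscale_video_latent(torch.randn(1, 12, 25, 60, 106), 160, 90).shape
# -> torch.Size([1, 12, 25, 90, 160])
```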
Jedrzej Kosinski
2047bf211f
Changed CreateHookModelAsLoraTest to be the new CreateHookModelAsLora; renamed the old ones as 'direct', to be removed prior to merge
2024-10-25 19:58:38 -05:00
comfyanonymous
d605677b33
Make euler_ancestral work on flow models (credit: Ashen).
2024-10-25 19:53:44 -04:00
Jedrzej Kosinski
daeb2624a9
Fixed default conds not respecting hook keyframes, made keyframes not reset the cache when strength is unchanged, fixed Cond Set Default Combine throwing an error, fixed model-as-lora throwing an error during calculate_weight after a recent ComfyUI update, small refactoring/scaffolding changes for hooks
2024-10-25 18:32:22 -05:00
PsychoLogicAu
af8cf79a2d
Support SimpleTuner lycoris lora for SD3 (#5340)
2024-10-24 01:18:32 -04:00
Jedrzej Kosinski
4bbdf2bfe5
Merge branch 'master' into patch_hooks
2024-10-23 21:10:46 -05:00
comfyanonymous
66b0961a46
Fix ControlLora issue with last commit.
2024-10-23 17:02:40 -04:00
comfyanonymous
754597c8a9
Clean up some controlnet code.
Remove self.device, which was useless.
2024-10-23 14:19:05 -04:00
comfyanonymous
915fdb5745
Fix lowvram edge case.
2024-10-22 16:34:50 -04:00
contentis
5a8a48931a
Remove attention abstraction (#5324)
2024-10-22 14:02:38 -04:00
comfyanonymous
8ce2a1052c
Optimizations to --fast and scaled fp8.
2024-10-22 02:12:28 -04:00
comfyanonymous
f82314fcfc
Fix duplicate sigmas on beta scheduler.
2024-10-21 20:19:45 -04:00
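Duplicate adjacent sigmas produce a zero step size, which typically breaks samplers that divide by the step; a generic fix is to drop consecutive duplicates from the schedule. The sketch below illustrates that idea and is not the actual beta-scheduler patch.

```python
import torch

def remove_duplicate_sigmas(sigmas: torch.Tensor) -> torch.Tensor:
    """Drop consecutive duplicate values from a 1D sigma schedule, keeping order.
    Generic illustration of the fix, not ComfyUI's exact code."""
    keep = [0]
    for i in range(1, sigmas.shape[0]):
        if sigmas[i] != sigmas[keep[-1]]:
            keep.append(i)
    return sigmas[keep]

# remove_duplicate_sigmas(torch.tensor([14.6, 7.1, 7.1, 3.2, 0.0]))
# -> tensor([14.6000, 7.1000, 3.2000, 0.0000])
```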
comfyanonymous
0075c6d096
Mixed precision diffusion models with scaled fp8.
This change adds support for diffusion models where all the linear layers are scaled fp8 while the other weights stay in their original precision.
2024-10-21 18:12:51 -04:00
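"Scaled fp8" here means the linear weights are stored in an fp8 format together with a scale factor that restores their original range. Below is a minimal PyTorch sketch of the dequantize-on-forward idea; the class name, attribute layout, and dtypes are assumptions, not ComfyUI's ops code.

```python
import torch
import torch.nn.functional as F

class ScaledFP8Linear(torch.nn.Module):
    """Minimal sketch: weight stored as float8 plus a per-tensor scale; the weight is
    dequantized to the compute dtype on each forward. Illustrative only."""
    def __init__(self, in_features: int, out_features: int,
                 compute_dtype: torch.dtype = torch.bfloat16):
        super().__init__()
        self.compute_dtype = compute_dtype
        weight = torch.randn(out_features, in_features)  # stand-in for loaded weights
        scale = weight.abs().max() / torch.finfo(torch.float8_e4m3fn).max
        self.register_buffer("scale", scale)
        self.register_buffer("weight", (weight / scale).to(torch.float8_e4m3fn))
        self.register_buffer("bias", torch.zeros(out_features, dtype=compute_dtype))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        w = self.weight.to(self.compute_dtype) * self.scale.to(self.compute_dtype)
        return F.linear(x.to(self.compute_dtype), w, self.bias)

# layer = ScaledFP8Linear(64, 32); layer(torch.randn(1, 64)).shape -> torch.Size([1, 32])
```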
comfyanonymous
83ca891118
Support scaled fp8 t5xxl model.
2024-10-20 22:27:00 -04:00
comfyanonymous
f9f9faface
Fixed model merging issue with scaled fp8.
2024-10-20 06:24:31 -04:00
comfyanonymous
471cd3eace
fp8 casting is fast on GPUs that support fp8 compute.
2024-10-20 00:54:47 -04:00
comfyanonymous
a68bbafddb
Support diffusion models with scaled fp8 weights.
2024-10-19 23:47:42 -04:00
comfyanonymous
73e3a9e676
Clamp output when rounding weight to prevent NaN.
2024-10-19 19:07:10 -04:00
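Context for the fix above: casting a value outside the representable range of the target fp8 format can produce inf/NaN, so the weight is clamped to the format's finite range before the cast. A minimal sketch of that idea (assumed, not the exact patch):

```python
import torch

def round_to_fp8(weight: torch.Tensor, dtype: torch.dtype = torch.float8_e4m3fn) -> torch.Tensor:
    """Clamp to the target format's finite range before casting, so out-of-range values
    saturate instead of becoming inf/NaN. Illustrative sketch."""
    finfo = torch.finfo(dtype)
    return torch.clamp(weight, min=finfo.min, max=finfo.max).to(dtype)
```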
comfyanonymous
67158994a4
Use the lowvram cast_to function for everything.
2024-10-17 17:25:56 -04:00
comfyanonymous
0bedfb26af
Revert "Fix Transformers FutureWarning ( #5140 )"
...
This reverts commit 95b7cf9bbe
.
2024-10-16 12:36:19 -04:00
comfyanonymous
f584758271
Cleanup some useless lines.
2024-10-14 21:02:39 -04:00
svdc
95b7cf9bbe
Fix Transformers FutureWarning (#5140)
* Update sd1_clip.py: fix Transformers FutureWarning
* Update sd1_clip.py: fix comment
2024-10-14 20:12:20 -04:00
comfyanonymous
3c60ecd7a8
Fix fp8 ops staying enabled.
2024-10-12 14:10:13 -04:00
comfyanonymous
7ae6626723
Remove useless argument.
2024-10-12 07:16:21 -04:00