Commit Graph

1110 Commits

Author SHA1 Message Date
kosinkadink1@gmail.com 6b14fc8795 Merge branch 'master' into patch_hooks 2024-09-17 15:31:03 +09:00
kosinkadink1@gmail.com c29006e669 Initial work on adding 'model_as_lora' lora type to calculate_weight 2024-09-17 15:30:33 +09:00
comfyanonymous 0849c80e2a get_key_patches now works without unloading the model. 2024-09-17 01:57:59 -04:00
kosinkadink1@gmail.com cfb145187d Made Set Clip Hooks node work with hooks from Create Hook nodes, began work on better Create Hook Model As LoRA node 2024-09-17 09:55:14 +09:00
kosinkadink1@gmail.com 4b472ba44c Added support for adding weight hooks that aren't registered on the ModelPatcher at sampling time 2024-09-17 06:22:41 +09:00
kosinkadink1@gmail.com f5c899f42a Fixed MaxSpeed and default conds implementations 2024-09-15 21:00:45 +09:00
comfyanonymous e813abbb2c Long CLIP L support for SDXL, SD3 and Flux.
Use the *CLIPLoader nodes.
2024-09-15 07:59:38 -04:00
kosinkadink1@gmail.com 5a9aa5817c Added initial hook scheduling nodes, small renaming/refactoring 2024-09-15 18:39:31 +09:00
kosinkadink1@gmail.com a5034df6db Made CLIP work with hook patches 2024-09-15 15:47:09 +09:00
kosinkadink1@gmail.com 9ded65a616 Added initial set of hook-related nodes, added code to register hooks for loras/model-as-loras, small renaming/refactoring 2024-09-15 08:33:17 +09:00
comfyanonymous f48e390032 Support AliMama SD3 and Flux inpaint controlnets.
Use the ControlNetInpaintingAliMamaApply node.
2024-09-14 09:05:16 -04:00
kosinkadink1@gmail.com f5abdc6f86 Merge branch 'master' into patch_hooks 2024-09-14 17:29:30 +09:00
kosinkadink1@gmail.com 5dadd97583 Added default_conds support in calc_cond_batch func 2024-09-14 17:21:50 +09:00
kosinkadink1@gmail.com f160d46340 Added call to initialize_timesteps on hooks in process_conds func, and added a call to prepare the current keyframe on hooks in calc_cond_batch 2024-09-14 16:10:42 +09:00
kosinkadink1@gmail.com 1268d04295 Consolidated add_hook_patches_as_diffs into add_hook_patches func, fixed fp8 support for model-as-lora feature 2024-09-14 14:09:43 +09:00
comfyanonymous cf80d28689 Support loading controlnets with different input. 2024-09-13 09:54:37 -04:00
kosinkadink1@gmail.com 9ae758175d Added current_patcher property to BaseModel 2024-09-13 21:35:35 +09:00
kosinkadink1@gmail.com 3cbd40ada3 Initial changes to calc_cond_batch to eventually support hook_patches 2024-09-13 18:31:52 +09:00
kosinkadink1@gmail.com 069ec7a64b Added hook_patches to ModelPatcher for weights (model) 2024-09-13 17:20:22 +09:00
Robin Huang b962db9952 Add cli arg to override user directory (#4856)
* Override user directory.

* Use overridden user directory.

* Remove prints.

* Remove references to global user_files.

* Remove unused replace_folder function.

* Remove newline.

* Remove global during get_user_directory.

* Add validation.
2024-09-12 08:10:27 -04:00
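As a rough illustration of what a launch-time user directory override like the one in #4856 can look like, here is a minimal argparse sketch. The flag name `--user-directory`, the absolute-path validation, and the `get_user_directory` helper are assumptions for illustration, not the code from the PR.

```python
# Hypothetical sketch of a CLI override for the user directory (in the spirit
# of #4856); the flag name, validation, and helper are illustrative assumptions.
import argparse
import os

parser = argparse.ArgumentParser()
parser.add_argument(
    "--user-directory",
    type=str,
    default=None,
    help="Override the default user directory with an absolute path.",
)
args = parser.parse_args()

def get_user_directory(default_name="user"):
    # Prefer the CLI override when given, validating that it is an absolute path.
    if args.user_directory is not None:
        if not os.path.isabs(args.user_directory):
            raise ValueError("--user-directory must be an absolute path")
        return args.user_directory
    # Fall back to a directory next to this script.
    return os.path.join(os.path.dirname(os.path.abspath(__file__)), default_name)
```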
comfyanonymous 9d720187f1 types -> comfy_types to fix import issue. 2024-09-12 03:57:46 -04:00
comfyanonymous 9f4daca9d9 Doesn't really make sense for the cfg_pp sampler to call the regular one. 2024-09-11 02:51:36 -04:00
yoinked b5d0f2a908 Add CFG++ to DPM++ 2S Ancestral (#3871)
* Update sampling.py

* Update samplers.py

* my bad

* "fix" the sampler

* Update samplers.py

* I named it wrong

* minor sampling improvements

mainly using a dynamic rho value (hey this sounds a lot like smea!!!)

* revert rho change

rho? r? it's just 1/2
2024-09-11 02:49:44 -04:00
comfyanonymous 9c5fca75f4 Fix lora issue. 2024-09-08 10:10:47 -04:00
comfyanonymous 32a60a7bac Support onetrainer text encoder Flux lora. 2024-09-08 09:31:41 -04:00
Jim Winkens bb52934ba4 Fix import issue (#4815) 2024-09-07 05:28:32 -04:00
comfyanonymous ea77750759 Support a generic Comfy format for text encoder loras.
This is a format with keys like:
text_encoders.clip_l.transformer.text_model.encoder.layers.9.self_attn.v_proj.lora_up.weight

Instead of waiting for me to add support for specific lora formats, you can
convert your text encoder loras to this format instead.

If you want to see an example, save a text encoder lora with the SaveLora
node using the commit right after this one.
2024-09-07 02:20:39 -04:00
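As the commit message suggests, converting an existing text encoder lora into this generic format mostly means renaming its keys. Below is a minimal, hypothetical sketch of that renaming, assuming the source file already uses transformer-style module paths with lora_up/lora_down suffixes (e.g. text_model.encoder.layers.9.self_attn.v_proj.lora_up.weight); checkpoints from other trainers generally need an extra key-mapping step, and the file names here are placeholders.

```python
# Hypothetical sketch: prepend the generic Comfy text encoder prefix to lora keys.
# Assumes source keys already look like
# "text_model.encoder.layers.9.self_attn.v_proj.lora_up.weight".
import safetensors.torch

def to_comfy_text_encoder_format(state_dict, clip_name="clip_l"):
    prefix = f"text_encoders.{clip_name}.transformer."
    return {prefix + key: weight for key, weight in state_dict.items()}

source = safetensors.torch.load_file("te_lora.safetensors")          # placeholder input path
converted = to_comfy_text_encoder_format(source, clip_name="clip_l")
safetensors.torch.save_file(converted, "te_lora_comfy.safetensors")  # placeholder output path
```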
comfyanonymous c27ebeb1c2 Fix onnx export not working on flux. 2024-09-06 03:21:52 -04:00
comfyanonymous 5cbaa9e07c Mistoline flux controlnet support. 2024-09-05 00:05:17 -04:00
comfyanonymous c7427375ee Prioritize freeing partially offloaded models first. 2024-09-04 19:47:32 -04:00
Jedrzej Kosinski f04229b84d Add emb_patch support to UNetModel forward (#4779) 2024-09-04 14:35:15 -04:00
Silver f067ad15d1 Make live preview size a configurable launch argument (#4649)
* Make live preview size a configurable launch argument

* Remove import from testing phase

* Update cli_args.py
2024-09-03 19:16:38 -04:00
comfyanonymous 483004dd1d Support newer glora format. 2024-09-03 17:02:19 -04:00
comfyanonymous 00a5d08103 Lower fp8 lora memory usage. 2024-09-03 01:25:05 -04:00
comfyanonymous d043997d30 Flux onetrainer lora. 2024-09-02 08:22:15 -04:00
comfyanonymous 8d31a6632f Speed up inference on nvidia 10 series on Linux. 2024-09-01 17:29:31 -04:00
comfyanonymous b643eae08b Make minimum_inference_memory() depend on --reserve-vram. 2024-09-01 01:18:34 -04:00
comfyanonymous 935ae153e1 Cleanup. 2024-08-30 12:53:59 -04:00
Chenlei Hu e91662e784 Get logs endpoint & system_stats additions (#4690)
* Add route for getting output logs

* Include ComfyUI version

* Move to own function

* Changed to memory logger

* Unify logger setup logic

* Fix get version git fallback

---------

Co-authored-by: pythongosssss <125205205+pythongosssss@users.noreply.github.com>
2024-08-30 12:46:37 -04:00
comfyanonymous 63fafaef45 Fix potential issue with hydit controlnets. 2024-08-30 04:58:41 -04:00
comfyanonymous 6eb5d64522 Fix glora lowvram issue. 2024-08-29 19:07:23 -04:00
comfyanonymous 10a79e9898 Implement model part of flux union controlnet. 2024-08-29 18:41:22 -04:00
comfyanonymous ea3f39bd69 InstantX depth flux controlnet. 2024-08-29 02:14:19 -04:00
comfyanonymous b33cd61070 InstantX canny controlnet. 2024-08-28 19:02:50 -04:00
comfyanonymous d31e226650 Unify RMSNorm code. 2024-08-28 16:56:38 -04:00
comfyanonymous 38c22e631a Fix case where model was not properly unloaded in merging workflows. 2024-08-27 19:03:51 -04:00
Chenlei Hu 6bbdcd28ae Support weight padding on diff weight patch (#4576) 2024-08-27 13:55:37 -04:00
comfyanonymous ab130001a8 Do RMSNorm in native type. 2024-08-27 02:41:56 -04:00
comfyanonymous 2ca8f6e23d Make the stochastic fp8 rounding reproducible. 2024-08-26 15:12:06 -04:00
comfyanonymous 7985ff88b9 Use less memory in float8 lora patching by doing calculations in fp16. 2024-08-26 14:45:58 -04:00