kosinkadink1@gmail.com
f5abdc6f86
Merge branch 'master' into patch_hooks
2024-09-14 17:29:30 +09:00
kosinkadink1@gmail.com
5dadd97583
Added default_conds support in calc_cond_batch func
2024-09-14 17:21:50 +09:00
kosinkadink1@gmail.com
f160d46340
Added call to initialize_timesteps on hooks in process_conds func, and added call to prepare current keyframe on hooks in calc_cond_batch
2024-09-14 16:10:42 +09:00
kosinkadink1@gmail.com
1268d04295
Consolidated add_hook_patches_as_diffs into add_hook_patches func, fixed fp8 support for model-as-lora feature
2024-09-14 14:09:43 +09:00
comfyanonymous
cf80d28689
Support loading controlnets with different inputs.
2024-09-13 09:54:37 -04:00
kosinkadink1@gmail.com
9ae758175d
Added current_patcher property to BaseModel
2024-09-13 21:35:35 +09:00
kosinkadink1@gmail.com
3cbd40ada3
Initial changes to calc_cond_batch to eventually support hook_patches
2024-09-13 18:31:52 +09:00
kosinkadink1@gmail.com
069ec7a64b
Added hook_patches to ModelPatcher for weights (model)
2024-09-13 17:20:22 +09:00
Robin Huang
b962db9952
Add cli arg to override user directory ( #4856 )
* Override user directory.
* Use overridden user directory.
* Remove prints.
* Remove references to global user_files.
* Remove unused replace_folder function.
* Remove newline.
* Remove global during get_user_directory.
* Add validation.
2024-09-12 08:10:27 -04:00
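To make the change above concrete, here is a minimal sketch of one way a user-directory override could be wired: an optional CLI flag plus a validated accessor instead of a mutable global. The flag name `--user-directory`, the default path, and the helper name are assumptions for illustration, not necessarily what the PR implements.

```python
# Hypothetical sketch of a user-directory override: CLI flag + validated accessor.
# Flag name, default path, and function name are illustrative assumptions.
import argparse
import os

parser = argparse.ArgumentParser()
parser.add_argument("--user-directory", type=str, default=None,
                    help="Override the directory used for per-user files.")
args = parser.parse_args()

def get_user_directory(default: str = "user") -> str:
    # Prefer the CLI override, fall back to the default location.
    path = os.path.abspath(args.user_directory or default)
    # Validation: refuse paths that exist but are not directories.
    if os.path.exists(path) and not os.path.isdir(path):
        raise NotADirectoryError(f"--user-directory points at a file: {path}")
    os.makedirs(path, exist_ok=True)
    return path
```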
comfyanonymous
9d720187f1
types -> comfy_types to fix import issue.
2024-09-12 03:57:46 -04:00
comfyanonymous
9f4daca9d9
It doesn't really make sense for the cfg_pp sampler to call the regular one.
2024-09-11 02:51:36 -04:00
yoinked
b5d0f2a908
Add CFG++ to DPM++ 2S Ancestral ( #3871 )
* Update sampling.py
* Update samplers.py
* My bad
* "Fix" the sampler
* Update samplers.py
* I named it wrong
* Minor sampling improvements
mainly using a dynamic rho value (hey, this sounds a lot like SMEA!)
* Revert rho change
rho? r? It's just 1/2.
2024-09-11 02:49:44 -04:00
comfyanonymous
9c5fca75f4
Fix lora issue.
2024-09-08 10:10:47 -04:00
comfyanonymous
32a60a7bac
Support onetrainer text encoder Flux lora.
2024-09-08 09:31:41 -04:00
Jim Winkens
bb52934ba4
Fix import issue ( #4815 )
2024-09-07 05:28:32 -04:00
comfyanonymous
ea77750759
Support a generic Comfy format for text encoder loras.
This is a format with keys like:
text_encoders.clip_l.transformer.text_model.encoder.layers.9.self_attn.v_proj.lora_up.weight
Instead of waiting for me to add support for specific lora formats, you can convert your text encoder loras to this format.
If you want to see an example, save a text encoder lora with the SaveLora node using the commit right after this one.
2024-09-07 02:20:39 -04:00
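To make the key layout above concrete, here is a hedged sketch that renames kohya-style `lora_te1_*` text encoder keys into the generic Comfy format. The source prefix, the `clip_l` target, and the token list are assumptions for illustration; the SaveLora node mentioned above is the authoritative reference for the exact layout.

```python
# Sketch: rename kohya-style text encoder LoRA keys into the generic Comfy format
# described above. Source prefix ("lora_te1_") and the clip_l target are assumptions.
import re

def convert_te_lora_keys(state_dict: dict) -> dict:
    out = {}
    for key, value in state_dict.items():
        m = re.match(r"lora_te1_(.+)\.(lora_up|lora_down)\.weight$", key)
        if m is None:
            out[key] = value  # pass through anything that is not a te1 LoRA weight
            continue
        body = m.group(1).replace("_", ".")
        # Undo the dotting of tokens that genuinely contain underscores.
        for tok in ("text_model", "self_attn", "q_proj", "k_proj", "v_proj", "out_proj"):
            body = body.replace(tok.replace("_", "."), tok)
        out[f"text_encoders.clip_l.transformer.{body}.{m.group(2)}.weight"] = value
    return out

# Usage (hypothetical file names):
#   sd = safetensors.torch.load_file("te_lora.safetensors")
#   safetensors.torch.save_file(convert_te_lora_keys(sd), "te_lora_comfy.safetensors")
```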
comfyanonymous
c27ebeb1c2
Fix onnx export not working on flux.
2024-09-06 03:21:52 -04:00
comfyanonymous
5cbaa9e07c
Mistoline flux controlnet support.
2024-09-05 00:05:17 -04:00
comfyanonymous
c7427375ee
Prioritize freeing partially offloaded models first.
2024-09-04 19:47:32 -04:00
Jedrzej Kosinski
f04229b84d
Add emb_patch support to UNetModel forward ( #4779 )
2024-09-04 14:35:15 -04:00
Silver
f067ad15d1
Make live preview size a configurable launch argument ( #4649 )
* Make live preview size a configurable launch argument
* Remove import from testing phase
* Update cli_args.py
2024-09-03 19:16:38 -04:00
comfyanonymous
483004dd1d
Support newer glora format.
2024-09-03 17:02:19 -04:00
comfyanonymous
00a5d08103
Lower fp8 lora memory usage.
2024-09-03 01:25:05 -04:00
comfyanonymous
d043997d30
Flux onetrainer lora.
2024-09-02 08:22:15 -04:00
comfyanonymous
8d31a6632f
Speed up inference on nvidia 10 series on Linux.
2024-09-01 17:29:31 -04:00
comfyanonymous
b643eae08b
Make minimum_inference_memory() depend on --reserve-vram
2024-09-01 01:18:34 -04:00
comfyanonymous
935ae153e1
Cleanup.
2024-08-30 12:53:59 -04:00
Chenlei Hu
e91662e784
Get logs endpoint & system_stats additions ( #4690 )
* Add route for getting output logs
* Include ComfyUI version
* Move to own function
* Changed to memory logger
* Unify logger setup logic
* Fix get version git fallback
---------
Co-authored-by: pythongosssss <125205205+pythongosssss@users.noreply.github.com>
2024-08-30 12:46:37 -04:00
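A rough sketch of the pattern the PR above describes: a memory logger that keeps recent log lines in a bounded buffer, exposed through a GET route. The route path, buffer size, and response shape are assumptions for illustration, not the endpoint's actual contract.

```python
# Sketch of an in-memory log buffer served over HTTP. Route path, buffer size,
# and response shape are illustrative assumptions.
import logging
from collections import deque
from aiohttp import web

class MemoryLogHandler(logging.Handler):
    def __init__(self, capacity: int = 300):
        super().__init__()
        self.entries = deque(maxlen=capacity)  # oldest lines are dropped automatically

    def emit(self, record: logging.LogRecord) -> None:
        self.entries.append(self.format(record))

memory_handler = MemoryLogHandler()
memory_handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
logging.getLogger().addHandler(memory_handler)

routes = web.RouteTableDef()

@routes.get("/internal/logs")  # hypothetical path
async def get_logs(request: web.Request) -> web.Response:
    return web.json_response({"logs": list(memory_handler.entries)})

app = web.Application()
app.add_routes(routes)
# web.run_app(app, port=8188)  # port is an assumption
```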
comfyanonymous
63fafaef45
Fix potential issue with hydit controlnets.
2024-08-30 04:58:41 -04:00
comfyanonymous
6eb5d64522
Fix glora lowvram issue.
2024-08-29 19:07:23 -04:00
comfyanonymous
10a79e9898
Implement model part of flux union controlnet.
2024-08-29 18:41:22 -04:00
comfyanonymous
ea3f39bd69
InstantX depth flux controlnet.
2024-08-29 02:14:19 -04:00
comfyanonymous
b33cd61070
InstantX canny controlnet.
2024-08-28 19:02:50 -04:00
comfyanonymous
d31e226650
Unify RMSNorm code.
2024-08-28 16:56:38 -04:00
comfyanonymous
38c22e631a
Fix case where model was not properly unloaded in merging workflows.
2024-08-27 19:03:51 -04:00
Chenlei Hu
6bbdcd28ae
Support weight padding on diff weight patch ( #4576 )
2024-08-27 13:55:37 -04:00
comfyanonymous
ab130001a8
Do RMSNorm in native type.
2024-08-27 02:41:56 -04:00
comfyanonymous
2ca8f6e23d
Make the stochastic fp8 rounding reproducible.
2024-08-26 15:12:06 -04:00
comfyanonymous
7985ff88b9
Use less memory in float8 lora patching by doing calculations in fp16.
2024-08-26 14:45:58 -04:00
comfyanonymous
c6812947e9
Fix potential memory leak.
2024-08-26 02:07:32 -04:00
comfyanonymous
9230f65823
Fix some controlnets OOMing when loading.
2024-08-25 05:54:29 -04:00
comfyanonymous
8ae23d8e80
Fix onnx export.
2024-08-23 17:52:47 -04:00
comfyanonymous
7df42b9a23
Fix dora.
2024-08-23 04:58:59 -04:00
comfyanonymous
5d8bbb7281
Cleanup.
2024-08-23 04:06:27 -04:00
comfyanonymous
2c1d2375d6
Fix.
2024-08-23 04:04:55 -04:00
Simon Lui
64ccb3c7e3
Rework the IPEX check for the future inclusion of XPU into PyTorch upstream, and do a bit more optimization of ipex.optimize(). ( #4562 )
2024-08-23 03:59:57 -04:00
Scorpinaus
9465b23432
Added SD15_Inpaint_Diffusers model support to the unet_config_from_diffusers_unet function ( #4565 )
2024-08-23 03:57:08 -04:00
comfyanonymous
c0b0da264b
Missing imports.
2024-08-22 17:20:51 -04:00
comfyanonymous
c26ca27207
Move calculate function to comfy.lora
2024-08-22 17:12:00 -04:00
comfyanonymous
7c6bb84016
Code cleanups.
2024-08-22 17:05:12 -04:00