comfyanonymous
5c69cde037
Load TE model straight to vram if certain conditions are met.
2024-08-11 23:52:43 -04:00
comfyanonymous
e9589d6d92
Add a way to set model dtype and ops from load_checkpoint_guess_config.
2024-08-11 08:50:34 -04:00
comfyanonymous
0d82a798a5
Remove the ckpt_path from load_state_dict_guess_config.
2024-08-11 08:37:35 -04:00
ljleb
925fff26fd
alternative to `load_checkpoint_guess_config` that accepts a loaded state dict ( #4249 )
...
* make alternative fn
* add back ckpt path as 2nd argument?
2024-08-11 08:36:52 -04:00
comfyanonymous
75b9b55b22
Fix issues with #4302 and support loading diffusers format flux.
2024-08-10 21:28:24 -04:00
Jaret Burkett
1765f1c60c
FLUX: Added full diffusers mapping for FLUX.1 schnell and dev. Adds full LoRA support from diffusers LoRAs. ( #4302 )
2024-08-10 21:26:41 -04:00
comfyanonymous
1de69fe4d5
Fix some issues with inference slowing down.
2024-08-10 16:21:25 -04:00
comfyanonymous
ae197f651b
Speed up hunyuan dit inference a bit.
2024-08-10 07:36:27 -04:00
comfyanonymous
1b5b8ca81a
Fix regression.
2024-08-09 21:45:21 -04:00
comfyanonymous
6678d5cf65
Fix regression.
2024-08-09 14:02:38 -04:00
TTPlanetPig
e172564eea
Update controlnet.py to fix the default controlnet weight as constant ( #4285 )
2024-08-09 13:40:05 -04:00
comfyanonymous
a3cc326748
Better fix for lowvram issue.
2024-08-09 12:16:25 -04:00
comfyanonymous
86a97e91fc
Fix controlnet regression.
2024-08-09 12:08:58 -04:00
comfyanonymous
5acdadc9f3
Fix issue with some lowvram weights.
2024-08-09 03:58:28 -04:00
comfyanonymous
55ad9d5f8c
Fix regression.
2024-08-09 03:36:40 -04:00
comfyanonymous
a9f04edc58
Implement text encoder part of HunyuanDiT loras.
2024-08-09 03:21:10 -04:00
comfyanonymous
a475ec2300
Cleanup HunyuanDit controlnets.
...
Use the ControlNetApply SD3 and HunyuanDiT node.
2024-08-09 02:59:34 -04:00
来新璐
06eb9fb426
feat: add support for HunYuanDit ControlNet ( #4245 )
...
* add support for HunYuanDit ControlNet
* fix hunyuandit controlnet
* fix typo in hunyuandit controlnet
* fix typo in hunyuandit controlnet
* fix code format style
* add control_weight support for HunyuanDit Controlnet
* use control_weights in HunyuanDit Controlnet
* fix typo
2024-08-09 02:59:24 -04:00
comfyanonymous
413322645e
Raw torch is faster than einops?
2024-08-08 22:09:29 -04:00
comfyanonymous
11200de970
Cleaner code.
2024-08-08 20:07:09 -04:00
comfyanonymous
037c38eb0f
Try to improve inference speed on some machines.
2024-08-08 17:29:27 -04:00
comfyanonymous
1e11d2d1f5
Better prints.
2024-08-08 17:29:27 -04:00
Alex "mcmonkey" Goodwin
65ea6be38f
PullRequest CI Run: use pull_request_target to allow the CI Dashboard to work ( #4277 )
...
'_target' allows secrets to pass through; we're just using the secret that allows uploading to the dashboard, and we're manually vetting PRs before running this workflow anyway.
2024-08-08 17:20:48 -04:00
Alex "mcmonkey" Goodwin
5df6f57b5d
minor fix on copypasta action name ( #4276 )
...
my bad sorry
2024-08-08 16:30:59 -04:00
Alex "mcmonkey" Goodwin
6588bfdef9
add GitHub workflow for CI tests of PRs ( #4275 )
...
When the 'Run-CI-Test' label is added to a PR, it will be tested by the CI, on a small matrix of stable versions.
2024-08-08 16:24:49 -04:00
Alex "mcmonkey" Goodwin
50ed2879ef
Add full CI test matrix GitHub Workflow ( #4274 )
...
automatically runs a matrix of full GPU-enabled tests on all new commits to the ComfyUI master branch
2024-08-08 15:40:07 -04:00
comfyanonymous
66d4233210
Fix.
2024-08-08 15:16:51 -04:00
comfyanonymous
591010b7ef
Support diffusers text attention flux loras.
2024-08-08 14:45:52 -04:00
comfyanonymous
08f92d55e9
Partial model shift support.
2024-08-08 14:45:06 -04:00
comfyanonymous
8115d8cce9
Add Flux fp16 support hack.
2024-08-07 15:08:39 -04:00
comfyanonymous
6969fc9ba4
Make supported_dtypes a priority list.
2024-08-07 15:00:06 -04:00
comfyanonymous
cb7c4b4be3
Workaround for lora OOM on lowvram mode.
2024-08-07 14:30:54 -04:00
comfyanonymous
1208863eca
Fix "Comfy" lora keys.
...
They are in this format now:
diffusion_model.full.model.key.name.lora_up.weight
2024-08-07 13:49:31 -04:00
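The renamed key scheme from the commit body above can be sketched as follows. This is a minimal illustration, not ComfyUI's actual implementation; the helper name is hypothetical, and only the `lora_up` suffix shown in the commit message is assumed.

```python
# Hypothetical sketch: build a "Comfy"-format lora key from a diffusion model
# weight key, per the format described in the commit message:
#   diffusion_model.<full.model.key.name>.lora_up.weight
def comfy_lora_up_key(model_key: str) -> str:
    return "diffusion_model." + model_key + ".lora_up.weight"

print(comfy_lora_up_key("full.model.key.name"))
# diffusion_model.full.model.key.name.lora_up.weight
```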
comfyanonymous
e1c528196e
Fix bundled embed.
2024-08-07 13:30:45 -04:00
comfyanonymous
17030fd4c0
Support for "Comfy" lora format.
...
The keys are just: model.full.model.key.name.lora_up.weight
It is supported by all models ComfyUI supports.
Now people can just convert loras to this format instead of having to ask
me to implement them.
2024-08-07 13:18:32 -04:00
comfyanonymous
c19dcd362f
Controlnet code refactor.
2024-08-07 12:59:28 -04:00
comfyanonymous
1c08bf35b4
Support format for embeddings bundled in loras.
2024-08-07 03:45:25 -04:00
PhilWun
2a02546e20
Add type hints to folder_paths.py ( #4191 )
...
* add type hints to folder_paths.py
* replace deprecated standard collections type hints
* fix type error when using Python 3.8
2024-08-06 21:59:34 -04:00
comfyanonymous
b334605a66
Fix OOMs happening in some cases.
...
A cloned model patcher sometimes reported a model was loaded on a device
when it wasn't.
2024-08-06 13:36:04 -04:00
comfyanonymous
de17a9755e
Unload all models if there's an OOM error.
2024-08-06 03:30:28 -04:00
comfyanonymous
c14ac98fed
Unload models and load them back in lowvram mode if there's no free vram.
2024-08-06 03:22:39 -04:00
Robin Huang
2894511893
Clone taesd with depth of 1 to reduce download size. ( #4232 )
2024-08-06 01:46:09 -04:00
Silver
f3bc40223a
Add format metadata to CLIP save to make compatible with diffusers safetensors loading ( #4233 )
2024-08-06 01:45:24 -04:00
Chenlei Hu
841e74ac40
Change browser test CI python to 3.8 ( #4234 )
2024-08-06 01:27:28 -04:00
comfyanonymous
2d75df45e6
Flux tweak memory usage.
2024-08-05 21:58:28 -04:00
Robin Huang
1abc9c8703
Stable release uses cached dependencies ( #4231 )
...
* Release stable based on existing tag.
* Update default cuda to 12.1.
2024-08-05 20:07:16 -04:00
comfyanonymous
8edbcf5209
Improve performance on some lowend GPUs.
2024-08-05 16:24:04 -04:00
comfyanonymous
e545a636ba
This probably doesn't work anymore.
2024-08-05 12:31:42 -04:00
bymyself
33e5203a2a
Don't cache index.html ( #4211 )
2024-08-05 12:25:28 -04:00
a-One-Fan
a178e25912
Fix Flux FP64 math on XPU ( #4210 )
2024-08-05 01:26:20 -04:00