comfyanonymous
74e124f4d7
Fix some issues with TE being in lowvram mode.
2024-08-12 23:42:21 -04:00
comfyanonymous
a562c17e8a
load_unet -> load_diffusion_model with a model_options argument.
2024-08-12 23:20:57 -04:00
comfyanonymous
5942c17d55
Order of operations matters.
2024-08-12 21:56:18 -04:00
comfyanonymous
c032b11e07
xlabs Flux controlnet implementation. (#4260)
* xlabs Flux controlnet.
* Fix not working on old python.
* Remove comment.
2024-08-12 21:22:22 -04:00
comfyanonymous
b8ffb2937f
Memory tweaks.
2024-08-12 15:07:11 -04:00
Vladimir Semyonov
ce37c11164
add DS_Store to gitignore (#4324)
2024-08-12 12:32:34 -04:00
Alex "mcmonkey" Goodwin
b5c3906b38
Automatically link the Comfy CI page on PRs (#4326)
Also uses use_prior_commit so it doesn't get a janked merge commit instead of the real one.
2024-08-12 12:32:16 -04:00
comfyanonymous
5d43e75e5b
Fix some issues with the model sometimes not getting patched.
2024-08-12 12:27:54 -04:00
comfyanonymous
517f4a94e4
Fix some lora loading slowdowns.
2024-08-12 11:50:32 -04:00
comfyanonymous
52a471c5c7
Change name of log.
2024-08-12 10:35:06 -04:00
comfyanonymous
ad76574cb8
Fix some potential issues with the previous commits.
2024-08-12 00:23:29 -04:00
comfyanonymous
9acfe4df41
Support loading directly to VRAM with the CLIPLoader node.
2024-08-12 00:06:01 -04:00
comfyanonymous
9829b013ea
Fix mistake in last commit.
2024-08-12 00:00:17 -04:00
comfyanonymous
5c69cde037
Load the TE model straight to VRAM if certain conditions are met.
2024-08-11 23:52:43 -04:00
comfyanonymous
e9589d6d92
Add a way to set model dtype and ops from load_checkpoint_guess_config.
2024-08-11 08:50:34 -04:00
comfyanonymous
0d82a798a5
Remove the ckpt_path from load_state_dict_guess_config.
2024-08-11 08:37:35 -04:00
ljleb
925fff26fd
alternative to `load_checkpoint_guess_config` that accepts a loaded state dict (#4249)
* make alternative fn
* add back ckpt path as 2nd argument?
2024-08-11 08:36:52 -04:00
comfyanonymous
75b9b55b22
Fix issues with #4302 and support loading diffusers format flux.
2024-08-10 21:28:24 -04:00
Jaret Burkett
1765f1c60c
FLUX: Add full diffusers mapping for FLUX.1 schnell and dev, plus full LoRA support for diffusers LoRAs. (#4302)
2024-08-10 21:26:41 -04:00
comfyanonymous
1de69fe4d5
Fix some issues with inference slowing down.
2024-08-10 16:21:25 -04:00
comfyanonymous
ae197f651b
Speed up hunyuan dit inference a bit.
2024-08-10 07:36:27 -04:00
comfyanonymous
1b5b8ca81a
Fix regression.
2024-08-09 21:45:21 -04:00
comfyanonymous
6678d5cf65
Fix regression.
2024-08-09 14:02:38 -04:00
TTPlanetPig
e172564eea
Update controlnet.py to fix the default controlnet weight as constant (#4285)
2024-08-09 13:40:05 -04:00
comfyanonymous
a3cc326748
Better fix for lowvram issue.
2024-08-09 12:16:25 -04:00
comfyanonymous
86a97e91fc
Fix controlnet regression.
2024-08-09 12:08:58 -04:00
comfyanonymous
5acdadc9f3
Fix issue with some lowvram weights.
2024-08-09 03:58:28 -04:00
comfyanonymous
55ad9d5f8c
Fix regression.
2024-08-09 03:36:40 -04:00
comfyanonymous
a9f04edc58
Implement text encoder part of HunyuanDiT loras.
2024-08-09 03:21:10 -04:00
comfyanonymous
a475ec2300
Cleanup HunyuanDit controlnets.
Use the ControlNetApply SD3 and HunyuanDiT node.
2024-08-09 02:59:34 -04:00
来新璐
06eb9fb426
feat: add support for HunYuanDit ControlNet (#4245)
* add support for HunYuanDit ControlNet
* fix hunyuandit controlnet
* fix typo in hunyuandit controlnet
* fix typo in hunyuandit controlnet
* fix code format style
* add control_weight support for HunyuanDit Controlnet
* use control_weights in HunyuanDit Controlnet
* fix typo
2024-08-09 02:59:24 -04:00
comfyanonymous
413322645e
Raw torch is faster than einops?
2024-08-08 22:09:29 -04:00
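For context, this is the kind of rearrangement einops abstracts. Below is a minimal, purely illustrative NumPy sketch (the commit itself operates on torch tensors) of the pattern `b (h w) c -> b c h w` done with raw reshape/transpose instead of `einops.rearrange`:

```python
import numpy as np

# Equivalent of einops.rearrange(x, "b (h w) c -> b c h w", h=h)
# written with raw reshape + transpose. NumPy is used here only for a
# portable illustration; the commit applies the same idea with torch.
b, h, w, c = 2, 4, 5, 3
x = np.arange(b * h * w * c).reshape(b, h * w, c)

# split the (h w) axis, then move channels in front of the spatial axes
y = x.reshape(b, h, w, c).transpose(0, 3, 1, 2)
print(y.shape)  # (2, 3, 4, 5)
```

Avoiding the einops layer skips its pattern parsing and shape bookkeeping, which is one plausible reason the raw ops can be faster in a hot inference path.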
comfyanonymous
11200de970
Cleaner code.
2024-08-08 20:07:09 -04:00
comfyanonymous
037c38eb0f
Try to improve inference speed on some machines.
2024-08-08 17:29:27 -04:00
comfyanonymous
1e11d2d1f5
Better prints.
2024-08-08 17:29:27 -04:00
Alex "mcmonkey" Goodwin
65ea6be38f
PullRequest CI Run: use pull_request_target to allow the CI Dashboard to work (#4277)
'_target' allows secrets to pass through; we're only using the secret that allows uploading to the dashboard, and we manually vet PRs before running this workflow anyway.
2024-08-08 17:20:48 -04:00
Alex "mcmonkey" Goodwin
5df6f57b5d
minor fix on copypasta action name (#4276)
my bad sorry
2024-08-08 16:30:59 -04:00
Alex "mcmonkey" Goodwin
6588bfdef9
add GitHub workflow for CI tests of PRs (#4275)
When the 'Run-CI-Test' label is added to a PR, it will be tested by the CI on a small matrix of stable versions.
2024-08-08 16:24:49 -04:00
Alex "mcmonkey" Goodwin
50ed2879ef
Add full CI test matrix GitHub Workflow (#4274)
Automatically runs a matrix of full GPU-enabled tests on all new commits to the ComfyUI master branch.
2024-08-08 15:40:07 -04:00
comfyanonymous
66d4233210
Fix.
2024-08-08 15:16:51 -04:00
comfyanonymous
591010b7ef
Support diffusers text attention flux loras.
2024-08-08 14:45:52 -04:00
comfyanonymous
08f92d55e9
Partial model shift support.
2024-08-08 14:45:06 -04:00
comfyanonymous
8115d8cce9
Add Flux fp16 support hack.
2024-08-07 15:08:39 -04:00
comfyanonymous
6969fc9ba4
Make supported_dtypes a priority list.
2024-08-07 15:00:06 -04:00
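The idea of treating supported_dtypes as a priority list rather than an unordered set can be sketched as follows; `pick_dtype` and the string dtype names are hypothetical stand-ins for illustration, not ComfyUI's actual API:

```python
# Hypothetical sketch: with supported_dtypes as a priority list, the
# loader walks the list in order and takes the first dtype the device
# actually supports, so preference order is explicit.
def pick_dtype(priority, device_supports):
    for dt in priority:
        if dt in device_supports:
            return dt
    raise ValueError("no dtype in the priority list is supported")

# e.g. prefer bf16, fall back to fp16, then fp32
print(pick_dtype(["bf16", "fp16", "fp32"], {"fp16", "fp32"}))  # fp16
```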
comfyanonymous
cb7c4b4be3
Workaround for lora OOM on lowvram mode.
2024-08-07 14:30:54 -04:00
comfyanonymous
1208863eca
Fix "Comfy" lora keys.
They are in this format now:
diffusion_model.full.model.key.name.lora_up.weight
2024-08-07 13:49:31 -04:00
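A minimal sketch of building key names in this scheme; the commit only shows the general key shape, so the suffix handling and the `comfy_lora_keys` helper below are assumptions for illustration, not ComfyUI's actual code:

```python
# Hypothetical helper: derive "Comfy"-format lora key names from a
# diffusion model weight key, per the scheme described above
# (diffusion_model.<full.model.key.name>.lora_up.weight).
def comfy_lora_keys(model_key):
    # assumption: model weight keys end in ".weight" (requires Python 3.9+)
    base = model_key.removesuffix(".weight")
    return (f"diffusion_model.{base}.lora_up.weight",
            f"diffusion_model.{base}.lora_down.weight")

up, down = comfy_lora_keys("double_blocks.0.img_attn.qkv.weight")
print(up)  # diffusion_model.double_blocks.0.img_attn.qkv.lora_up.weight
```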
comfyanonymous
e1c528196e
Fix bundled embed.
2024-08-07 13:30:45 -04:00
comfyanonymous
17030fd4c0
Support for "Comfy" lora format.
The keys are just: model.full.model.key.name.lora_up.weight
It is supported by all models ComfyUI supports.
Now people can just convert loras to this format instead of having to ask me to implement them.
2024-08-07 13:18:32 -04:00
comfyanonymous
c19dcd362f
Controlnet code refactor.
2024-08-07 12:59:28 -04:00
comfyanonymous
1c08bf35b4
Support format for embeddings bundled in loras.
2024-08-07 03:45:25 -04:00