comfyanonymous | 8ce2a1052c | Optimizations to --fast and scaled fp8. | 2024-10-22 02:12:28 -04:00
comfyanonymous | f82314fcfc | Fix duplicate sigmas on beta scheduler. | 2024-10-21 20:19:45 -04:00
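A beta schedule can quantize two adjacent steps to the same sigma, making the sampler take a zero-length step. A consecutive-dedup pass of the kind this fix implies might look like this (a sketch; the function name and approach are assumptions, not the committed code):

```python
def dedup_consecutive_sigmas(sigmas):
    """Drop consecutive duplicate sigma values, which would otherwise
    make a sampler take zero-length steps. Keeps order and the final
    0.0 sentinel."""
    out = [sigmas[0]]
    for s in sigmas[1:]:
        if s != out[-1]:
            out.append(s)
    return out

# Example: a schedule where two steps collapsed to the same value.
deduped = dedup_consecutive_sigmas([14.6, 7.1, 7.1, 3.2, 0.0])
```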
comfyanonymous | 0075c6d096 | Mixed precision diffusion models with scaled fp8. This change adds support for diffusion models where all the linears are scaled fp8 while the other weights stay in the original precision. | 2024-10-21 18:12:51 -04:00
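The mixed-precision scheme in 0075c6d096, where only the linear layers carry scaled fp8 weights, can be sketched as follows (numpy, with float16 standing in for fp8 and all names illustrative; this is a simulation of the idea, not ComfyUI's code):

```python
import numpy as np

FP8_E4M3_MAX = 448.0  # largest finite value in the fp8 e4m3 format

def quantize_scaled(w):
    """Per-tensor 'scaled fp8': store w/scale in a narrow payload plus
    one fp32 scale (float16 stands in for fp8 in this sketch)."""
    scale = np.float32(np.abs(w).max() / FP8_E4M3_MAX)
    return (w / scale).astype(np.float16), scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

# Mixed precision: only the linear weights get the scaled fp8 treatment;
# norms, embeddings and biases stay in the original precision.
rng = np.random.default_rng(0)
weights = {
    "blocks.0.linear.weight": rng.standard_normal((8, 8)).astype(np.float32),
    "blocks.0.norm.weight": np.ones(8, dtype=np.float32),
}
stored = {name: quantize_scaled(w) if "linear" in name else w
          for name, w in weights.items()}

q, s = stored["blocks.0.linear.weight"]
max_err = np.abs(dequantize(q, s) - weights["blocks.0.linear.weight"]).max()
```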
comfyanonymous | 83ca891118 | Support scaled fp8 t5xxl model. | 2024-10-20 22:27:00 -04:00
comfyanonymous | f9f9faface | Fixed model merging issue with scaled fp8. | 2024-10-20 06:24:31 -04:00
comfyanonymous | 471cd3eace | fp8 casting is fast on GPUs that support fp8 compute. | 2024-10-20 00:54:47 -04:00
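Whether the fp8 cast is fast depends on hardware support; on NVIDIA this is commonly gated on compute capability 8.9 (Ada) and newer. A sketch of such a gate (this is an illustration of the check, not ComfyUI's actual code):

```python
def supports_fp8_compute(capability):
    """Native fp8 tensor-core math is generally an NVIDIA Ada (8.9) /
    Hopper (9.0) and newer feature; older GPUs have to emulate the
    cast. `capability` is a (major, minor) tuple such as the one
    returned by torch.cuda.get_device_capability()."""
    return tuple(capability) >= (8, 9)

fast_path = supports_fp8_compute((8, 9))  # e.g. an RTX 4090
```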
comfyanonymous | a68bbafddb | Support diffusion models with scaled fp8 weights. | 2024-10-19 23:47:42 -04:00
comfyanonymous | 73e3a9e676 | Clamp output when rounding weight to prevent NaN. | 2024-10-19 19:07:10 -04:00
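Overflow during the narrowing cast is what produces the NaNs: values beyond the target format's finite range become inf, and inf propagates to NaN downstream. A minimal sketch of the clamp-before-round guard (448.0 is e4m3's largest finite value; names and the exact committed logic are assumptions):

```python
import numpy as np

FP8_E4M3_MAX = 448.0

def clamp_for_fp8(w):
    """Clamp before narrowing: values outside the target format's
    finite range would otherwise overflow to inf and then to NaN."""
    return np.clip(w, -FP8_E4M3_MAX, FP8_E4M3_MAX)

w = np.array([500.0, -10000.0, 3.0], dtype=np.float32)
safe = clamp_for_fp8(w)
```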
comfyanonymous | 67158994a4 | Use the lowvram cast_to function for everything. | 2024-10-17 17:25:56 -04:00
comfyanonymous | 0bedfb26af | Revert "Fix Transformers FutureWarning (#5140)". This reverts commit 95b7cf9bbe. | 2024-10-16 12:36:19 -04:00
comfyanonymous | f584758271 | Cleanup some useless lines. | 2024-10-14 21:02:39 -04:00
svdc | 95b7cf9bbe | Fix Transformers FutureWarning (#5140): update sd1_clip.py to fix the FutureWarning, and fix a comment. | 2024-10-14 20:12:20 -04:00
comfyanonymous | 3c60ecd7a8 | Fix fp8 ops staying enabled. | 2024-10-12 14:10:13 -04:00
comfyanonymous | 7ae6626723 | Remove useless argument. | 2024-10-12 07:16:21 -04:00
comfyanonymous | 6632365e16 | model_options consistency between functions (weight_dtype -> dtype). | 2024-10-11 20:51:19 -04:00
Kadir Nar | ad07796777 | 🐛 Add device to variable c (#5210) | 2024-10-11 20:37:50 -04:00
comfyanonymous | 1b80895285 | Make clip loader nodes support loading sd3 t5xxl in lower precision. Add attention mask support in the SD3 text encoder code. | 2024-10-10 15:06:15 -04:00
Dr.Lt.Data | 5f9d5a244b | Hotfix for the div zero occurrence when memory_used_encode is 0 (#5121). See https://github.com/comfyanonymous/ComfyUI/issues/5069#issuecomment-2382656368 | 2024-10-09 23:34:34 -04:00
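A hotfix for this kind of division by zero is typically a one-line guard; a sketch (function and argument names here are illustrative, not the actual patch in 5f9d5a244b):

```python
def batches_that_fit(free_memory, memory_used_encode):
    """How many encode batches fit in free memory. Guards against
    memory_used_encode being 0 (seen with some size estimates), which
    would otherwise divide by zero."""
    return free_memory / max(memory_used_encode, 1)

ratio_when_zero = batches_that_fit(1024, 0)
```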
Jonathan Avila | 4b2f0d9413 | Increase maximum macOS version to 15.0.1 when forcing upcast attention (#5191) | 2024-10-09 22:21:41 -04:00
comfyanonymous | e38c94228b | Add a weight_dtype fp8_e4m3fn_fast to the Diffusion Model Loader node. This is used to load weights in fp8 and use fp8 matrix multiplication. | 2024-10-09 19:43:17 -04:00
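The fp8_e4m3fn_fast path pairs fp8 storage with fp8 matrix multiplication: both operands are scaled into the fp8 range, multiplied in the narrow format, and the result rescaled (on supported hardware this maps to a native scaled matmul such as torch._scaled_mm). A numpy simulation of the scale-multiply-rescale idea, with float16 standing in for fp8 (not ComfyUI's code):

```python
import numpy as np

FP8_E4M3_MAX = 448.0

def scaled_mm_sim(a, b):
    """Simulated fp8 matmul: per-tensor scales bring operands into the
    fp8 range, the product is taken in the narrow format (float16
    stands in for fp8 here), then the scales are multiplied back."""
    sa = np.abs(a).max() / FP8_E4M3_MAX
    sb = np.abs(b).max() / FP8_E4M3_MAX
    qa = (a / sa).astype(np.float16)
    qb = (b / sb).astype(np.float16)
    return qa.astype(np.float32) @ qb.astype(np.float32) * (sa * sb)

rng = np.random.default_rng(0)
a = rng.standard_normal((4, 8)).astype(np.float32)
b = rng.standard_normal((8, 4)).astype(np.float32)
max_err = np.abs(scaled_mm_sim(a, b) - a @ b).max()
```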
comfyanonymous | 203942c8b2 | Fix flux doras with diffusers keys. | 2024-10-08 19:03:40 -04:00
comfyanonymous | 8dfa0cc552 | Make SD3 fast previews a little better. | 2024-10-07 09:19:59 -04:00
comfyanonymous | e5ecdfdd2d | Make fast previews for SDXL a little better by adding a bias. | 2024-10-06 19:27:04 -04:00
comfyanonymous | 7d29fbf74b | Slightly improve the fast previews for flux by adding a bias. | 2024-10-06 17:55:46 -04:00
comfyanonymous | 7d2467e830 | Some minor cleanups. | 2024-10-05 13:22:39 -04:00
comfyanonymous | 6f021d8aa0 | Let --verbose have an argument for the log level. | 2024-10-04 10:05:34 -04:00
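An optional-argument flag like this is the `nargs="?"` pattern in argparse: a bare --verbose picks a chattier default, while an explicit level is also accepted. A sketch (the specific defaults and choices are illustrative, not ComfyUI's exact CLI):

```python
import argparse

# "--verbose [LEVEL]": bare --verbose means DEBUG via `const`;
# an explicit level like "--verbose WARNING" overrides it.
parser = argparse.ArgumentParser()
parser.add_argument("--verbose", nargs="?", const="DEBUG", default="INFO",
                    choices=["DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"])
```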
comfyanonymous | d854ed0bcf | Allow using SD3 type te output on flux model. | 2024-10-03 09:44:54 -04:00
comfyanonymous | abcd006b8c | Allow more permutations of clip/t5 in dual clip loader. | 2024-10-03 09:26:11 -04:00
comfyanonymous | d985d1d7dc | CLIP Loader node now supports clip_l and clip_g only for SD3. | 2024-10-02 04:25:17 -04:00
comfyanonymous | d1cdf51e1b | Refactor some of the TE detection code. | 2024-10-01 07:08:41 -04:00
comfyanonymous | b4626ab93e | Add simpletuner lycoris format for SD unet. | 2024-09-30 06:03:27 -04:00
comfyanonymous | a9e459c2a4 | Use torch.nn.functional.linear in RGB preview code. Add an optional bias to the latent RGB preview code. | 2024-09-29 11:27:49 -04:00
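The fast-preview path this commit touches is a per-pixel linear projection from latent channels to RGB, now with an optional bias. A numpy sketch of the same shape of computation (the factor and bias values below are placeholders, not ComfyUI's tuned ones):

```python
import numpy as np

# (latent_channels, 3) projection matrix plus an optional RGB bias,
# mirroring what F.linear(latent, factors.T, bias) computes per pixel.
latent_rgb_factors = np.array([
    [ 0.3,  0.2,  0.2],
    [ 0.2,  0.3, -0.1],
    [-0.1,  0.2,  0.3],
    [ 0.1, -0.2,  0.2],
], dtype=np.float32)
latent_rgb_bias = np.array([0.05, 0.0, -0.05], dtype=np.float32)

def latent_to_rgb(latent):
    """latent: (C, H, W) -> preview: (H, W, 3)."""
    return np.einsum("chw,cr->hwr", latent, latent_rgb_factors) + latent_rgb_bias

rgb = latent_to_rgb(np.zeros((4, 8, 8), dtype=np.float32))
```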
comfyanonymous | 3bb4dec720 | Fix issue with loras, lowvram and --fast fp8. | 2024-09-28 14:42:32 -04:00
City | 8733191563 | Flux torch.compile fix (#5082) | 2024-09-27 22:07:51 -04:00
comfyanonymous | bdd4a22a2e | Fix flux TE not loading t5 embeddings. | 2024-09-24 22:57:22 -04:00
chaObserv | 479a427a48 | Add dpmpp_2m_cfg_pp (#4992) | 2024-09-24 02:42:56 -04:00
comfyanonymous | 3a0eeee320 | Make --listen listen on both ipv4 and ipv6 at the same time by default. | 2024-09-23 04:38:19 -04:00
comfyanonymous | 9c41bc8d10 | Remove useless line. | 2024-09-23 02:32:29 -04:00
comfyanonymous | 7a415f47a9 | Add an optional VAE input to the ControlNetApplyAdvanced node. Deprecate the other controlnet nodes. | 2024-09-22 01:24:52 -04:00
comfyanonymous | dc96a1ae19 | Load controlnet in fp8 if weights are in fp8. | 2024-09-21 04:50:12 -04:00
comfyanonymous | 2d810b081e | Add load_controlnet_state_dict function. | 2024-09-21 01:51:51 -04:00
comfyanonymous | 9f7e9f0547 | Add an error message when a controlnet needs a VAE but none is given. | 2024-09-21 01:33:18 -04:00
comfyanonymous | 70a708d726 | Fix model merging issue. | 2024-09-20 02:31:44 -04:00
yoinked | e7d4782736 | Add laplace scheduler [arXiv:2407.03297] (#4990) | 2024-09-19 23:23:09 -04:00
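The laplace scheduler's idea is to take evenly spaced quantiles of a Laplace distribution over log-sigma, so steps cluster where that distribution puts its mass. A sketch using the standard Laplace inverse CDF (the parameter defaults and clamping are illustrative, not necessarily the committed settings):

```python
import math

def laplace_sigmas(steps, sigma_min=0.03, sigma_max=14.6, mu=0.0, b=0.5):
    """Descending sigma schedule from Laplace(mu, b) quantiles over
    log-sigma (arXiv:2407.03297), clamped into [sigma_min, sigma_max]."""
    eps = 1e-5
    sigmas = []
    for i in range(steps):
        u = i / (steps - 1)  # evenly spaced in (0, 1)
        # Laplace inverse CDF: mu - b * sgn(u - 0.5) * ln(1 - 2|u - 0.5|),
        # with the sign flipped so sigmas run high -> low.
        log_sigma = mu - b * math.copysign(1.0, 0.5 - u) * math.log(
            1 - 2 * abs(0.5 - u) + eps)
        sigmas.append(min(max(math.exp(log_sigma), sigma_min), sigma_max))
    return sigmas

sigmas = laplace_sigmas(10)
```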
comfyanonymous | ad66f7c7d8 | Add model_options to load_controlnet function. | 2024-09-19 08:23:35 -04:00
Simon Lui | de8e8e3b0d | Fix xpu PyTorch nightly build from calling optimize, which doesn't exist. (#4978) | 2024-09-19 05:11:42 -04:00
pharmapsychotic | 0b7dfa986d | Improve tiling calculations to reduce the number of tiles that need to be processed. (#4944) | 2024-09-17 03:51:10 -04:00
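With overlapping tiles, each tile after the first only advances by the stride tile - overlap, which is what determines the tile count. A sketch of that arithmetic (this is the general formula, not necessarily PR #4944's exact calculation):

```python
import math

def tiles_needed(length, tile, overlap):
    """Tiles required to cover `length` pixels when consecutive tiles
    share `overlap` pixels: the first tile covers `tile`, every later
    one adds `tile - overlap`."""
    if length <= tile:
        return 1
    return 1 + math.ceil((length - tile) / (tile - overlap))

# 1024px dimension, 512px tiles, 64px overlap -> stride 448px.
count = tiles_needed(1024, 512, 64)
```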
comfyanonymous | d514bb38ee | Add some options to model_options for the text encoder: load_device, offload_device and the initial_device can now be set. | 2024-09-17 03:49:54 -04:00
comfyanonymous | 0849c80e2a | get_key_patches now works without unloading the model. | 2024-09-17 01:57:59 -04:00
comfyanonymous | e813abbb2c | Long CLIP L support for SDXL, SD3 and Flux. Use the *CLIPLoader nodes. | 2024-09-15 07:59:38 -04:00