comfyanonymous
b4526d3fc3
Skip layer guidance now works on hydit model.
2024-11-24 05:54:30 -05:00
comfyanonymous
ab885b33ba
Skip layer guidance node now works on LTX-Video.
2024-11-23 10:33:05 -05:00
comfyanonymous
e5c3f4b87f
LTXV lowvram fixes.
2024-11-22 17:17:11 -05:00
comfyanonymous
5e16f1d24b
Support Lightricks LTX-Video model.
2024-11-22 08:46:39 -05:00
comfyanonymous
2fd9c1308a
Fix mask issue in some attention functions.
2024-11-22 02:10:09 -05:00
comfyanonymous
8f0009aad0
Support new flux model variants.
2024-11-21 08:38:23 -05:00
comfyanonymous
07f6eeaa13
Fix mask issue with attention_xformers.
2024-11-20 17:07:46 -05:00
comfyanonymous
22535d0589
Skip layer guidance now works on stable audio model.
2024-11-20 07:33:06 -05:00
comfyanonymous
d9f90965c8
Support block replace patches in auraflow.
2024-11-17 08:19:59 -05:00
comfyanonymous
41886af138
Add transformer options blocks replace patch to mochi.
2024-11-16 20:48:14 -05:00
comfyanonymous
8ebf2d8831
Add block replace transformer_options to flux.
2024-11-12 08:00:39 -05:00
contentis
69694f40b3
fix dynamic shape export (#5490)
2024-11-04 14:59:28 -05:00
comfyanonymous
fabf449feb
Mochi VAE encoder.
2024-11-01 17:33:09 -04:00
comfyanonymous
30c0c81351
Add a way to patch blocks in SD3.
2024-10-29 00:48:32 -04:00
comfyanonymous
13b0ff8a6f
Update SD3 code.
2024-10-28 21:58:52 -04:00
comfyanonymous
5cbb01bc2f
Basic Genmo Mochi video model support.
...
To use:
"Load CLIP" node with t5xxl + type mochi
"Load Diffusion Model" node with the mochi dit file.
"Load VAE" with the mochi vae file.
EmptyMochiLatentVideo node for the latent.
euler + linear_quadratic in the KSampler node.
2024-10-26 06:54:00 -04:00
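The Mochi usage notes above describe a node graph; as a minimal sketch, the same setup can be written in ComfyUI's API-format workflow (a dict of node id -> class_type/inputs). The node class names for the loaders, text encode, and decode (CLIPLoader, UNETLoader, VAELoader, CLIPTextEncode, VAEDecode), the exact input field names, the file names, resolution, step count, and prompt text are assumptions for illustration, not values from the commit.

```python
# Hypothetical sketch of the Mochi text-to-video setup described in the commit
# above, as a ComfyUI API-format workflow (node id -> {class_type, inputs}).
# File names, prompts, and numeric settings are placeholders; list values like
# ["1", 0] reference another node's output.
mochi_workflow = {
    "1": {"class_type": "CLIPLoader",              # "Load CLIP" node
          "inputs": {"clip_name": "t5xxl_fp16.safetensors",  # placeholder file
                     "type": "mochi"}},
    "2": {"class_type": "UNETLoader",              # "Load Diffusion Model" node
          "inputs": {"unet_name": "mochi_dit.safetensors",   # placeholder file
                     "weight_dtype": "default"}},
    "3": {"class_type": "VAELoader",               # "Load VAE" node
          "inputs": {"vae_name": "mochi_vae.safetensors"}},  # placeholder file
    "4": {"class_type": "CLIPTextEncode",          # positive prompt
          "inputs": {"clip": ["1", 0], "text": "a scenic timelapse"}},
    "5": {"class_type": "CLIPTextEncode",          # negative prompt
          "inputs": {"clip": ["1", 0], "text": ""}},
    "6": {"class_type": "EmptyMochiLatentVideo",   # empty video latent
          "inputs": {"width": 848, "height": 480, "length": 25, "batch_size": 1}},
    "7": {"class_type": "KSampler",                # euler + linear_quadratic per the commit
          "inputs": {"model": ["2", 0], "positive": ["4", 0], "negative": ["5", 0],
                     "latent_image": ["6", 0], "seed": 0, "steps": 30, "cfg": 4.5,
                     "sampler_name": "euler", "scheduler": "linear_quadratic",
                     "denoise": 1.0}},
    "8": {"class_type": "VAEDecode",               # decode latents with the mochi VAE
          "inputs": {"samples": ["7", 0], "vae": ["3", 0]}},
}
```

A dict in this shape is what a running ComfyUI instance accepts when queued through its prompt API; the specific field values here are only meant to mirror the steps listed in the commit message.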
contentis
5a8a48931a
remove attention abstraction (#5324)
2024-10-22 14:02:38 -04:00
comfyanonymous
d854ed0bcf
Allow using SD3 type te output on flux model.
2024-10-03 09:44:54 -04:00
City
8733191563
Flux torch.compile fix (#5082)
2024-09-27 22:07:51 -04:00
comfyanonymous
cf80d28689
Support loading controlnets with different input.
2024-09-13 09:54:37 -04:00
comfyanonymous
c27ebeb1c2
Fix onnx export not working on flux.
2024-09-06 03:21:52 -04:00
comfyanonymous
5cbaa9e07c
Mistoline flux controlnet support.
2024-09-05 00:05:17 -04:00
Jedrzej Kosinski
f04229b84d
Add emb_patch support to UNetModel forward (#4779)
2024-09-04 14:35:15 -04:00
comfyanonymous
63fafaef45
Fix potential issue with hydit controlnets.
2024-08-30 04:58:41 -04:00
comfyanonymous
10a79e9898
Implement model part of flux union controlnet.
2024-08-29 18:41:22 -04:00
comfyanonymous
ea3f39bd69
InstantX depth flux controlnet.
2024-08-29 02:14:19 -04:00
comfyanonymous
b33cd61070
InstantX canny controlnet.
2024-08-28 19:02:50 -04:00
comfyanonymous
d31e226650
Unify RMSNorm code.
2024-08-28 16:56:38 -04:00
comfyanonymous
ab130001a8
Do RMSNorm in native type.
2024-08-27 02:41:56 -04:00
comfyanonymous
015f73dc49
Try a different type of flux fp16 fix.
2024-08-21 16:17:15 -04:00
comfyanonymous
d31df04c8a
Indentation.
2024-08-17 23:00:44 -04:00
Xrvk
e68763f40c
Add Flux model support for InstantX style controlnet residuals (#4444)
...
* Add Flux model support for InstantX style controlnet residuals
* Refactor Flux controlnet residual step to a separate method
* Rollback minor change
* New format for applying controlnet residuals: input->double_blocks, output->single_blocks
* Adjust XLabs Flux controlnet to fit new syntax of applying Flux controlnet residuals
* Remove unnecessary import and minor style change
2024-08-17 22:58:23 -04:00
comfyanonymous
33fb282d5c
Fix issue.
2024-08-14 02:51:47 -04:00
comfyanonymous
a5af64d3ce
Revert "Not sure if this actually changes anything but it can't hurt."
...
This reverts commit 34608de2e9.
2024-08-14 01:05:17 -04:00
comfyanonymous
34608de2e9
Not sure if this actually changes anything but it can't hurt.
2024-08-13 13:29:16 -04:00
comfyanonymous
5942c17d55
Order of operations matters.
2024-08-12 21:56:18 -04:00
comfyanonymous
c032b11e07
xlabs Flux controlnet implementation. (#4260)
...
* xlabs Flux controlnet.
* Fix not working on old python.
* Remove comment.
2024-08-12 21:22:22 -04:00
comfyanonymous
ae197f651b
Speed up hunyuan dit inference a bit.
2024-08-10 07:36:27 -04:00
comfyanonymous
a475ec2300
Cleanup HunyuanDit controlnets.
...
Use the ControlNetApply SD3 and HunyuanDiT node.
2024-08-09 02:59:34 -04:00
来新璐
06eb9fb426
feat: add support for HunYuanDit ControlNet (#4245)
...
* add support for HunYuanDit ControlNet
* fix hunyuandit controlnet
* fix typo in hunyuandit controlnet
* fix typo in hunyuandit controlnet
* fix code format style
* add control_weight support for HunyuanDit Controlnet
* use control_weights in HunyuanDit Controlnet
* fix typo
2024-08-09 02:59:24 -04:00
comfyanonymous
413322645e
Raw torch is faster than einops?
2024-08-08 22:09:29 -04:00
comfyanonymous
11200de970
Cleaner code.
2024-08-08 20:07:09 -04:00
comfyanonymous
8115d8cce9
Add Flux fp16 support hack.
2024-08-07 15:08:39 -04:00
a-One-Fan
a178e25912
Fix Flux FP64 math on XPU (#4210)
2024-08-05 01:26:20 -04:00
comfyanonymous
3b71f84b50
ONNX tracing fixes.
2024-08-04 15:45:43 -04:00
comfyanonymous
e638f2858a
Hack to make all resolutions work on Flux models.
2024-08-01 21:39:18 -04:00
comfyanonymous
48eb1399c0
Try to fix mac issue.
2024-08-01 13:41:27 -04:00
comfyanonymous
f2b80f95d2
Better Mac support on flux model.
2024-08-01 13:10:50 -04:00
comfyanonymous
8d34211a7a
Fix old python versions no longer working.
2024-08-01 09:57:20 -04:00
comfyanonymous
1589b58d3e
Basic Flux Schnell and Flux Dev model implementation.
2024-08-01 09:49:29 -04:00