comfyanonymous | 4215206281 | Add a node to set CLIP skip. Use a simpler way to detect whether the model uses v-prediction. | 2023-03-03 13:04:36 -05:00
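For reference, a CLIP-skip node written against ComfyUI's node conventions would look roughly like the sketch below. The class name, the input range, and the clip.clone() / clip.clip_layer() helpers are assumptions for illustration, not necessarily what commit 4215206281 ships.

```python
# Hypothetical sketch of a CLIP-skip node in ComfyUI's node style.
# The class name, input ranges, and the clone()/clip_layer() helpers
# are assumptions for illustration only.
class CLIPSetLastLayer:
    @classmethod
    def INPUT_TYPES(cls):
        return {"required": {
            "clip": ("CLIP",),
            # Negative indexing: -1 keeps every layer, -2 skips the last one, etc.
            "stop_at_clip_layer": ("INT", {"default": -1, "min": -24, "max": -1, "step": 1}),
        }}

    RETURN_TYPES = ("CLIP",)
    FUNCTION = "set_last_layer"
    CATEGORY = "conditioning"

    def set_last_layer(self, clip, stop_at_clip_layer):
        clip = clip.clone()                  # assumed helper: avoid mutating the shared CLIP object
        clip.clip_layer(stop_at_clip_layer)  # assumed helper: truncate the text encoder at this layer
        return (clip,)
```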
comfyanonymous | 94bb0375b0 | New CheckpointLoaderSimple to load checkpoints without a config. | 2023-03-03 03:37:35 -05:00
comfyanonymous | c1f5855ac1 | Make some cross attention functions work on the CPU. | 2023-03-03 03:27:33 -05:00
comfyanonymous | 1a612e1c74 | Add some pytorch scaled_dot_product_attention code for testing. Pass --use-pytorch-cross-attention to use it. | 2023-03-02 17:01:20 -05:00
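torch.nn.functional.scaled_dot_product_attention is PyTorch's built-in fused attention entry point. A minimal sketch of how a cross-attention function could route through it is below; the function name, the (batch, tokens, heads * dim_head) layout, and the head-splitting logic are assumptions for illustration rather than the repo's exact code.

```python
import torch
import torch.nn.functional as F

def pytorch_attention(q, k, v, heads):
    """Cross-attention via torch.nn.functional.scaled_dot_product_attention.

    Illustrative sketch: the (batch, tokens, heads * dim_head) layout and the
    function name are assumptions, not ComfyUI's exact implementation.
    """
    b, n, _ = q.shape
    dim_head = q.shape[-1] // heads
    # Reshape (b, tokens, heads * dim_head) -> (b, heads, tokens, dim_head) as SDPA expects.
    q, k, v = (t.reshape(b, -1, heads, dim_head).transpose(1, 2) for t in (q, k, v))
    out = F.scaled_dot_product_attention(q, k, v)  # fused attention kernel when available
    # Merge the heads back into the channel dimension.
    return out.transpose(1, 2).reshape(b, n, heads * dim_head)
```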
comfyanonymous | 9502ee45c3 | Hopefully fix a strange issue with xformers + lowvram. | 2023-02-28 13:48:52 -05:00
comfyanonymous | fcb25d37db | Prepare for t2i adapter. | 2023-02-24 23:36:17 -05:00
comfyanonymous | f04dc2c2f4 | Implement DDIM sampler. | 2023-02-22 21:10:19 -05:00
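A DDIM sampler's core is a deterministic update: predict x0 from the current sample and the model's noise estimate, then re-noise that prediction to the previous timestep. A minimal sketch of a single eta = 0 step, with illustrative names (ddim_step, alphas_cumprod) that are not taken from the repo, is:

```python
import torch

@torch.no_grad()
def ddim_step(x_t, eps, alphas_cumprod, t, t_prev):
    """One deterministic (eta = 0) DDIM update.

    Sketch only: `eps` is the model's noise prediction at timestep `t`, and
    `alphas_cumprod` is the cumulative alpha schedule; names are illustrative.
    """
    a_t = alphas_cumprod[t]
    a_prev = alphas_cumprod[t_prev] if t_prev >= 0 else torch.ones_like(a_t)
    # Predict x_0 from the current sample and the noise estimate.
    pred_x0 = (x_t - (1.0 - a_t).sqrt() * eps) / a_t.sqrt()
    # Re-noise the x_0 prediction to the previous timestep.
    return a_prev.sqrt() * pred_x0 + (1.0 - a_prev).sqrt() * eps
```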
comfyanonymous | c9daec4c89 | Remove prints that are useless when xformers is enabled. | 2023-02-21 22:16:13 -05:00
comfyanonymous | 09f1d76ed8 | Fix an OOM issue. | 2023-02-17 16:21:01 -05:00
comfyanonymous | 4efa67fa12 | Add ControlNet support. | 2023-02-16 10:38:08 -05:00
comfyanonymous | 1a4edd19cd | Fix overflow issue with inplace softmax. | 2023-02-10 11:47:41 -05:00
comfyanonymous | 509c7dfc6d | Use real softmax in split op to fix issue with some images. | 2023-02-10 03:13:49 -05:00
comfyanonymous | 1f6a467e92 | Update ldm dir with latest upstream stable diffusion changes. | 2023-02-09 13:47:36 -05:00
comfyanonymous | 773cdabfce | Apply the same OutOfMemoryError compatibility fix in the other places where it's used. | 2023-02-09 12:43:29 -05:00
comfyanonymous | df40d4f3bf | torch.cuda.OutOfMemoryError is not present on older pytorch versions. | 2023-02-09 12:33:27 -05:00
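Since torch.cuda.OutOfMemoryError only exists in newer PyTorch releases, code that wants to catch it needs a guard on older versions, where CUDA OOM surfaced as a plain RuntimeError. One common pattern is sketched below; the constant and the run_with_fallback helper are illustrative, not necessarily the guard used in this commit.

```python
import torch

# torch.cuda.OutOfMemoryError only exists in newer PyTorch releases; older
# versions raised a plain RuntimeError on CUDA OOM, so fall back to that.
OOM_EXCEPTION = getattr(torch.cuda, "OutOfMemoryError", RuntimeError)

def run_with_fallback(fn, fallback):
    """Run fn(); if it runs out of GPU memory, free the cache and run fallback()."""
    try:
        return fn()
    except OOM_EXCEPTION:
        torch.cuda.empty_cache()  # release cached blocks before the cheaper retry
        return fallback()
```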
comfyanonymous | e8c499ddd4 | Split optimization for VAE attention block. | 2023-02-08 22:04:20 -05:00
comfyanonymous | 5b4e312749 | Use inplace operations for less OOM issues. | 2023-02-08 22:04:13 -05:00
comfyanonymous | 047775615b | Lower the chances of an OOM. | 2023-02-08 14:24:27 -05:00
comfyanonymous | 1daccf3678 | Run softmax in place if it OOMs. | 2023-01-30 19:55:01 -05:00
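The in-place fallback can be sketched as a try/except around the normal softmax: if allocating the output tensor runs out of memory, redo the softmax on the scores tensor itself (subtract the row max, exponentiate, normalize) so no second scores-sized buffer is needed. Names below are illustrative.

```python
import torch

def softmax_with_inplace_fallback(scores):
    """Softmax the attention scores, retrying in place if the first attempt OOMs.

    Sketch only: the in-place sequence trades an extra pass over the tensor
    for not allocating a second scores-sized output.
    """
    try:
        return torch.softmax(scores, dim=-1)
    except getattr(torch.cuda, "OutOfMemoryError", RuntimeError):
        # In-place, numerically stable softmax: subtract the row max,
        # exponentiate, then normalize, without a new full-size allocation.
        scores -= scores.amax(dim=-1, keepdim=True)
        torch.exp(scores, out=scores)
        scores /= scores.sum(dim=-1, keepdim=True)
        return scores
```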
comfyanonymous | 50db297cf6 | Try to fix OOM issues with cards that have less VRAM than mine. | 2023-01-29 00:50:46 -05:00
comfyanonymous | 051f472e8f | Fix sub-quadratic attention for SD2 and make it the default optimization. | 2023-01-25 01:22:43 -05:00
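Sub-quadratic attention avoids materializing the full query-by-key score matrix at once. A full implementation typically also chunks the key/value dimension and rescales partial softmaxes; the simplified sketch below only chunks the query dimension, which already caps the live score block at chunk x key_len, and uses illustrative names.

```python
import torch

def chunked_attention(q, k, v, chunk=1024):
    """Attention computed over query chunks so only a (chunk x key_len) score
    block is live at once. Simplified sketch; a real sub-quadratic kernel
    also chunks keys/values and rescales the partial softmaxes.
    """
    scale = q.shape[-1] ** -0.5
    out = torch.empty_like(q)
    for start in range(0, q.shape[-2], chunk):
        q_block = q[..., start:start + chunk, :]
        # (..., chunk, key_len) score block instead of the full quadratic matrix.
        scores = torch.softmax(q_block @ k.transpose(-2, -1) * scale, dim=-1)
        out[..., start:start + chunk, :] = scores @ v
    return out
```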
comfyanonymous | 220afe3310 | Initial commit. | 2023-01-16 22:37:14 -05:00