comfyanonymous
|
056e5545ff
|
Don't try to get vram from xpu or cuda when directml is enabled.
|
2023-04-29 00:28:48 -04:00 |
comfyanonymous
|
2ca934f7d4
|
You can now select the device index with: --directml id
For example: --directml 1
|
2023-04-28 16:51:35 -04:00 |
comfyanonymous
|
3baded9892
|
Basic torch_directml support. Use --directml to use it.
|
2023-04-28 14:28:57 -04:00 |
comfyanonymous
|
5a971cecdb
|
Add callback to sampler function.
Callback format is: callback(step, x0, x)
|
2023-04-27 04:38:44 -04:00 |
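A minimal sketch of a callback in the callback(step, x0, x) format named above; the meanings of x0 and x follow common sampler conventions and are assumptions, as is how the function gets passed to the sampler:

```python
# Minimal sketch of a progress callback in the callback(step, x0, x)
# format from this commit. How it is registered with the sampler is
# not shown in this log.
def progress_callback(step, x0, x):
    # step: index of the sampling step that just finished
    # x0:   current estimate of the fully denoised latent (assumption)
    # x:    the still-noisy latent at this step (assumption)
    print(f"step {step}: latent shape {tuple(x.shape)}")
```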
comfyanonymous
|
aa57136dae
|
Some fixes to the batch masks PR.
|
2023-04-25 01:12:40 -04:00 |
comfyanonymous
|
c50208a703
|
Refactor more code to sample.py
|
2023-04-24 23:25:51 -04:00 |
comfyanonymous
|
7983b3a975
|
It's cleaner this way.
|
2023-04-24 22:45:35 -04:00 |
BlenderNeko
|
0b07b2cc0f
|
GLIGEN tuple
|
2023-04-24 21:47:57 +02:00 |
BlenderNeko
|
d9b1595f85
|
made sample functions more explicit
|
2023-04-24 12:53:10 +02:00 |
BlenderNeko
|
5818539743
|
add docstrings
|
2023-04-23 20:09:09 +02:00 |
BlenderNeko
|
8d2de420d3
|
Merge branch 'master' of https://github.com/BlenderNeko/ComfyUI
|
2023-04-23 20:02:18 +02:00 |
BlenderNeko
|
2a09e2aa27
|
refactor/split various bits of code for sampling
|
2023-04-23 20:02:08 +02:00 |
comfyanonymous
|
5282f56434
|
Implement Linear hypernetworks.
Add a HypernetworkLoader node to use hypernetworks.
|
2023-04-23 12:35:25 -04:00 |
comfyanonymous
|
6908f9c949
|
This makes pytorch 2.0 attention perform a bit faster.
|
2023-04-22 14:30:39 -04:00 |
comfyanonymous
|
907010e082
|
Remove some useless code.
|
2023-04-20 23:58:25 -04:00 |
comfyanonymous
|
96b57a9ad6
|
Don't pass adm to model when it doesn't support it.
|
2023-04-19 21:11:38 -04:00 |
comfyanonymous
|
3696d1699a
|
Add support for GLIGEN textbox model.
|
2023-04-19 11:06:32 -04:00 |
comfyanonymous
|
884ea653c8
|
Add a way for nodes to set a custom CFG function.
|
2023-04-17 11:05:15 -04:00 |
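For context, a custom CFG function replaces the standard classifier-free guidance combination of the conditional and unconditional model outputs. A hedged sketch of that combination (the argument names and the exact hook signature exposed to nodes are assumptions):

```python
# What a CFG function computes; the hook signature ComfyUI actually
# passes to a node-supplied function is not shown in this log.
def default_cfg(cond, uncond, cfg_scale):
    # Standard classifier-free guidance: push the unconditional
    # prediction toward the conditional one, scaled by cfg_scale.
    return uncond + (cond - uncond) * cfg_scale
```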
comfyanonymous
|
73c3e11e83
|
Fix model_management import so it doesn't get executed twice.
|
2023-04-15 19:04:33 -04:00 |
comfyanonymous
|
81d1f00df3
|
Some refactoring: from_tokens -> encode_from_tokens
|
2023-04-15 18:46:58 -04:00 |
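Together with the tokenizer/encoder split further down this log, this suggests a two-step text-encoding pattern. A hedged sketch (only encode_from_tokens is named in the commit; the tokenize call is an assumption consistent with that split):

```python
def encode_prompt(clip, text):
    # clip.tokenize is assumed here, consistent with the tokenizer
    # being split from the encoder; encode_from_tokens is the name
    # given in this commit.
    tokens = clip.tokenize(text)
    return clip.encode_from_tokens(tokens)
```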
comfyanonymous
|
719c26c3c9
|
Merge branch 'master' of https://github.com/BlenderNeko/ComfyUI
|
2023-04-15 14:16:50 -04:00 |
BlenderNeko
|
d0b1b6c6bf
|
fixed improper padding
|
2023-04-15 19:38:21 +02:00 |
comfyanonymous
|
deb2b93e79
|
Move the code that empties the gpu cache to model_management.py
|
2023-04-15 11:19:07 -04:00 |
comfyanonymous
|
04d9bc13af
|
Safely load pickled embeds that don't load with weights_only=True.
|
2023-04-14 15:33:43 -04:00 |
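A sketch of the general try-then-fall-back pattern this commit points at; the actual implementation may differ (for example, a restricted unpickler), and the function name is hypothetical:

```python
import torch

def load_embedding_weights(path):
    try:
        # Safe path: weights_only=True refuses to unpickle arbitrary
        # Python objects.
        return torch.load(path, map_location="cpu", weights_only=True)
    except Exception:
        # Fallback for older pickled embeddings that fail the safe
        # load; full unpickling can execute code, so only use it on
        # files you trust.
        return torch.load(path, map_location="cpu")
```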
BlenderNeko
|
da115bd78d
|
ensure backwards compat with optional args
|
2023-04-14 21:16:55 +02:00 |
BlenderNeko
|
752f7a162b
|
align behavior with old tokenize function
|
2023-04-14 21:02:45 +02:00 |
comfyanonymous
|
334aab05e5
|
Don't stop workflow if loading embedding fails.
|
2023-04-14 13:54:00 -04:00 |
BlenderNeko
|
73175cf58c
|
split tokenizer from encoder
|
2023-04-13 22:06:50 +02:00 |
BlenderNeko
|
8489cba140
|
add unique ID per word/embedding for tokenizer
|
2023-04-13 22:01:01 +02:00 |
comfyanonymous
|
92eca60ec9
|
Fix for new transformers version.
|
2023-04-09 15:55:21 -04:00 |
comfyanonymous
|
1e1875f674
|
Print xformers version and warning about 0.0.18
|
2023-04-09 01:31:47 -04:00 |
comfyanonymous
|
7e254d2f69
|
Clarify what --windows-standalone-build does.
|
2023-04-07 15:52:56 -04:00 |
comfyanonymous
|
44fea05064
|
Cleanup.
|
2023-04-07 02:31:46 -04:00 |
comfyanonymous
|
58ed0f2da4
|
Fix loading SD1.5 diffusers checkpoint.
|
2023-04-07 01:30:33 -04:00 |
comfyanonymous
|
8b9ac8fedb
|
Merge branch 'master' of https://github.com/sALTaccount/ComfyUI
|
2023-04-07 01:03:43 -04:00 |
comfyanonymous
|
64557d6781
|
Add a --force-fp32 argument to force fp32 for debugging.
|
2023-04-07 00:27:54 -04:00 |
comfyanonymous
|
bceccca0e5
|
Small refactor.
|
2023-04-06 23:53:54 -04:00 |
comfyanonymous
|
28a7205739
|
Merge branch 'ipex' of https://github.com/kwaa/ComfyUI-IPEX
|
2023-04-06 23:45:29 -04:00 |
藍+85CD
|
05eeaa2de5
|
Merge branch 'master' into ipex
|
2023-04-07 09:11:30 +08:00 |
EllangoK
|
28fff5d1db
|
fixes lack of support for multiple configs
also adds some metavars to argparse
|
2023-04-06 19:06:39 -04:00 |
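For reference, metavar only changes how an option's value placeholder is rendered in --help output. A small stdlib-only sketch (the flag shown is illustrative, not taken from ComfyUI's parser):

```python
import argparse

# metavar controls the placeholder shown in --help, e.g.
# "--output-dir PATH" instead of "--output-dir OUTPUT_DIR".
parser = argparse.ArgumentParser()
parser.add_argument("--output-dir", type=str, metavar="PATH",
                    help="Directory to write outputs to.")
print(parser.format_help())
```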
comfyanonymous
|
f84f2508cc
|
Rename the cors parameter to something more verbose.
|
2023-04-06 15:24:55 -04:00 |
EllangoK
|
48efae1608
|
makes CORS a CLI parameter
|
2023-04-06 15:06:22 -04:00 |
EllangoK
|
01c1fc669f
|
set listen flag to listen on all interfaces if specified
|
2023-04-06 13:19:00 -04:00 |
藍+85CD
|
3e2608e12b
|
Fix auto lowvram detection on CUDA
|
2023-04-06 15:44:05 +08:00 |
sALTaccount
|
60127a8304
|
diffusers loader
|
2023-04-05 23:57:31 -07:00 |
藍+85CD
|
7cb924f684
|
Use separate variables instead of `vram_state`
|
2023-04-06 14:24:47 +08:00 |
藍+85CD
|
84b9c0ac2f
|
Import intel_extension_for_pytorch as ipex
|
2023-04-06 12:27:22 +08:00 |
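A sketch of the optional-import pattern for the dependency named in this commit; the try/except guard and the flag name are assumptions:

```python
# intel_extension_for_pytorch is only present on Intel XPU setups,
# so guard the import and record whether it succeeded.
try:
    import intel_extension_for_pytorch as ipex  # noqa: F401
    ipex_available = True
except ImportError:
    ipex_available = False
```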
EllangoK
|
e5e587b1c0
|
separates out arg parser and imports args
|
2023-04-05 23:41:23 -04:00 |
藍+85CD
|
37713e3b0a
|
Add basic XPU device support
closed #387
|
2023-04-05 21:22:14 +08:00 |
comfyanonymous
|
e46b1c3034
|
Disable xformers in VAE when xformers == 0.0.18
|
2023-04-04 22:22:02 -04:00 |
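An illustrative version gate for the behavior these two xformers commits describe (the 0.0.18 warning above and disabling it in the VAE); the exact check ComfyUI uses is not shown in this log:

```python
# Detect xformers 0.0.18 so attention in the VAE can be skipped for
# that version; everything here besides the version string is assumed.
try:
    import xformers
    XFORMERS_VERSION = xformers.__version__
except ImportError:
    XFORMERS_VERSION = ""

disable_xformers_in_vae = XFORMERS_VERSION.startswith("0.0.18")
```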