comfyanonymous | 608fcc2591 | Fix bug with weights when prompt is long. | 2023-07-06 02:43:40 -04:00
comfyanonymous | ce35d8c659 | Lower latency by batching some text encoder inputs. | 2023-07-01 15:07:39 -04:00
comfyanonymous | b6a60fa696 | Try to keep text encoders loaded and patched to increase speed. load_model_gpu() is now used with the text encoder models instead of just the unet. | 2023-07-01 13:28:07 -04:00
comfyanonymous | 97ee230682 | Make highvram and normalvram shift the text encoders to vram and back. This is faster on big text encoder models than running it on the CPU. | 2023-07-01 12:37:23 -04:00
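Commit 97ee230682 above describes shifting the text encoder into VRAM for the encode call and back out afterwards. A minimal sketch of that general pattern in plain PyTorch (hypothetical helper, not ComfyUI's actual model-management code):

```python
import torch

def encode_on_gpu(text_encoder: torch.nn.Module, tokens: torch.Tensor) -> torch.Tensor:
    """Move the encoder to VRAM for the forward pass, then park it back on the CPU."""
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
    text_encoder.to(device)
    try:
        with torch.no_grad():
            # Assumes the encoder returns a tensor of conditioning vectors.
            return text_encoder(tokens.to(device)).cpu()
    finally:
        # Free the VRAM again so the unet/VAE can use it (the normalvram behaviour).
        text_encoder.to("cpu")
```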
comfyanonymous | 9920367d3c | Fix embeddings not working with --gpu-only | 2023-06-29 20:43:06 -04:00
comfyanonymous | 20f579d91d | Add DualClipLoader to load clip models for SDXL. Update LoadClip to load clip models for SDXL refiner. | 2023-06-25 01:40:38 -04:00
comfyanonymous | f87ec10a97 | Support base SDXL and SDXL refiner models. Large refactor of the model detection and loading code. | 2023-06-22 13:03:50 -04:00
comfyanonymous | f7edcfd927 | Add a --gpu-only argument to keep and run everything on the GPU. Make the CLIP model work on the GPU. | 2023-06-15 15:38:52 -04:00
comfyanonymous | bb1f45d6e8 | Properly disable weight initialization in clip models. | 2023-06-14 20:13:08 -04:00
comfyanonymous | 0c7cad404c | Don't initialize clip weights to default values. | 2023-06-14 12:47:36 -04:00
comfyanonymous | 23cf8ca7c5 | Fix bug when embedding gets ignored because of mismatched size. | 2023-06-08 23:48:14 -04:00
comfyanonymous | af9cc1fb6a | Search recursively in subfolders for embeddings. | 2023-05-05 01:28:48 -04:00
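Commit af9cc1fb6a above adds recursive lookup inside the embeddings folder. A minimal sketch of that kind of search (hypothetical helper and extension list, not ComfyUI's path-handling code):

```python
from pathlib import Path

# Assumed set of embedding file extensions for illustration.
EMBEDDING_EXTENSIONS = (".pt", ".bin", ".safetensors")

def find_embedding_file(embeddings_dir: str, name: str) -> Path | None:
    """Return the first file in embeddings_dir or any subfolder whose stem matches name."""
    for path in sorted(Path(embeddings_dir).rglob("*")):
        if path.is_file() and path.suffix in EMBEDDING_EXTENSIONS and path.stem == name:
            return path
    return None
```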
comfyanonymous | 81d1f00df3 | Some refactoring: from_tokens -> encode_from_tokens | 2023-04-15 18:46:58 -04:00
comfyanonymous | 719c26c3c9 | Merge branch 'master' of https://github.com/BlenderNeko/ComfyUI | 2023-04-15 14:16:50 -04:00
BlenderNeko | d0b1b6c6bf | fixed improper padding | 2023-04-15 19:38:21 +02:00
comfyanonymous | 04d9bc13af | Safely load pickled embeds that don't load with weights_only=True. | 2023-04-14 15:33:43 -04:00
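Commit 04d9bc13af above concerns embeddings saved as legacy pickles. A sketch of the general fallback technique, under assumptions and not ComfyUI's actual loader: try the safe weights_only load first, and only fall back to a restricted unpickler for files that refuse it.

```python
import pickle
import torch

class RestrictedUnpickler(pickle.Unpickler):
    """Refuse to unpickle anything outside a small allow-list of modules."""
    ALLOWED_MODULES = ("collections", "torch")

    def find_class(self, module, name):
        if module.split(".")[0] in self.ALLOWED_MODULES:
            return super().find_class(module, name)
        raise pickle.UnpicklingError(f"{module}.{name} is blocked")

class RestrictedPickle:
    # Module-like stand-in: torch.load reads the Unpickler/load attributes of pickle_module.
    Unpickler = RestrictedUnpickler
    load = staticmethod(lambda f, **kw: RestrictedUnpickler(f, **kw).load())

def load_embed(path: str):
    try:
        return torch.load(path, map_location="cpu", weights_only=True)
    except Exception:
        # Legacy pickled embeds: fall back, but keep the unpickler restricted.
        return torch.load(path, map_location="cpu",
                          pickle_module=RestrictedPickle, weights_only=False)
```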
BlenderNeko | da115bd78d | ensure backwards compat with optional args | 2023-04-14 21:16:55 +02:00
BlenderNeko | 752f7a162b | align behavior with old tokenize function | 2023-04-14 21:02:45 +02:00
comfyanonymous | 334aab05e5 | Don't stop workflow if loading embedding fails. | 2023-04-14 13:54:00 -04:00
BlenderNeko | 8489cba140 | add unique ID per word/embedding for tokenizer | 2023-04-13 22:01:01 +02:00
comfyanonymous | 1718730e80 | Ignore embeddings when sizes don't match and print a WARNING. | 2023-04-04 11:49:29 -04:00
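Commit 1718730e80 above skips embeddings whose vector width doesn't match the model. A minimal sketch of that check (hypothetical names, not the actual ComfyUI code):

```python
import torch

def validate_embedding(embed: torch.Tensor, expected_dim: int, name: str) -> torch.Tensor | None:
    """Return the embedding if its last dimension matches the CLIP hidden size, else None."""
    if embed.shape[-1] != expected_dim:
        print(f"WARNING: shape of embedding {name} is {tuple(embed.shape)}, "
              f"expected last dim {expected_dim}; ignoring it.")
        return None
    return embed
```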
comfyanonymous | 50099bcd96 | Support multiple paths for embeddings. | 2023-03-18 03:08:43 -04:00
comfyanonymous | 00a9189e30 | Support old pytorch. | 2023-02-19 16:59:03 -05:00
comfyanonymous | 137ae2606c | Support people putting commas after the embedding name in the prompt. | 2023-02-19 02:50:48 -05:00
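Commit 137ae2606c above tolerates a comma stuck to the end of an embedding name in the prompt. A minimal sketch of that parsing rule (hypothetical tokenizer helper; the "embedding:" prefix is assumed here as the prompt convention):

```python
def parse_embedding_token(token: str, prefix: str = "embedding:") -> str | None:
    """Return the embedding name if the token references one, else None."""
    if not token.startswith(prefix):
        return None
    # People often write "embedding:foo," in prompts; drop the trailing comma.
    return token[len(prefix):].rstrip(",")
```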
comfyanonymous | 324273fff2 | Fix embedding not working when on new line. | 2023-02-09 14:12:02 -05:00
comfyanonymous | f73e57d881 | Add support for textual inversion embedding for SD1.x CLIP. | 2023-01-29 18:46:44 -05:00
comfyanonymous | 220afe3310 | Initial commit. | 2023-01-16 22:37:14 -05:00