comfyanonymous
4760c29380
Merge branch 'fix-AttributeError-module-'torch'-has-no-attribute-'mps'' of https://github.com/KarryCharon/ComfyUI
2023-07-20 00:34:54 -04:00
comfyanonymous
ccb6b70de1
Move image encoding outside of sampling loop for better preview perf.
2023-07-19 18:06:58 -04:00
comfyanonymous
39c58b227f
Disable cuda malloc on GTX 750 Ti.
2023-07-19 15:14:10 -04:00
comfyanonymous
d5c0765f4e
Update how to get the prompt in api format in the example.
2023-07-19 15:07:12 -04:00
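The commit above updates the example for getting a prompt in API format; a minimal sketch of queuing such a prompt over HTTP is shown below. The default local server address/port and the workflow_api.json filename are assumptions for illustration, not taken from the commit.

```python
# Minimal sketch: submit a workflow saved in API format to a locally running
# ComfyUI server. Address, port and filename are assumptions.
import json
import urllib.request

def queue_prompt(prompt: dict, server: str = "http://127.0.0.1:8188") -> dict:
    """POST an API-format workflow to the /prompt endpoint and return the response."""
    data = json.dumps({"prompt": prompt}).encode("utf-8")
    req = urllib.request.Request(f"{server}/prompt", data=data,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

if __name__ == "__main__":
    # Load a workflow exported in API format from the frontend and queue it.
    with open("workflow_api.json", "r") as f:
        prompt = json.load(f)
    print(queue_prompt(prompt))
```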
comfyanonymous
799c08a4ce
Auto disable cuda malloc on some GPUs on windows.
2023-07-19 14:43:55 -04:00
comfyanonymous
0b284f650b
Fix typo.
2023-07-19 10:20:32 -04:00
comfyanonymous
e032ca6138
Fix ddim issue with older torch versions.
2023-07-19 10:16:00 -04:00
comfyanonymous
18885f803a
Add MX450 and MX550 to list of cards with broken fp16.
2023-07-19 03:08:30 -04:00
comfyanonymous
9ba440995a
It's actually possible to torch.compile the unet now.
2023-07-18 21:36:35 -04:00
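For the torch.compile commit above, a minimal sketch of what compiling a UNet-style module looks like with torch >= 2.0 follows. DummyUNet and its shapes are placeholders, not ComfyUI's actual model classes.

```python
# Minimal sketch of torch.compile on a UNet-like module (torch >= 2.0 assumed).
import torch
import torch.nn as nn

class DummyUNet(nn.Module):
    """Placeholder stand-in for a diffusion UNet."""
    def __init__(self):
        super().__init__()
        self.net = nn.Conv2d(4, 4, kernel_size=3, padding=1)

    def forward(self, x, timestep):
        return self.net(x)

unet = DummyUNet()
compiled_unet = torch.compile(unet)  # first call triggers graph capture/compilation
latents = torch.randn(1, 4, 64, 64)
out = compiled_unet(latents, torch.tensor([10]))
```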
comfyanonymous
51d5477579
Add key to indicate checkpoint is v_prediction when saving.
2023-07-18 00:25:53 -04:00
comfyanonymous
ff6b047a74
Fix device print on old torch version.
2023-07-17 15:18:58 -04:00
comfyanonymous
9871a15cf9
Enable --cuda-malloc by default on torch 2.0 and up.
...
Add --disable-cuda-malloc to disable it.
2023-07-17 15:12:10 -04:00
comfyanonymous
55d0fca9fa
--windows-standalone-build now enables --cuda-malloc
2023-07-17 14:10:36 -04:00
comfyanonymous
1679abd86d
Add a command line argument to enable backend:cudaMallocAsync
2023-07-17 11:00:14 -04:00
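The three --cuda-malloc commits above wrap PyTorch's cudaMallocAsync allocator backend behind command line flags. A minimal sketch of the underlying, documented PyTorch mechanism (the PYTORCH_CUDA_ALLOC_CONF environment variable) is below; it is not ComfyUI's actual startup code.

```python
# Sketch: select the cudaMallocAsync allocator backend via PYTORCH_CUDA_ALLOC_CONF.
# The variable must be set before torch initializes CUDA, hence before the import.
import os

os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "backend:cudaMallocAsync"

import torch  # imported after setting the env var so the allocator setting takes effect

if torch.cuda.is_available():
    x = torch.zeros(1024, 1024, device="cuda")  # allocated through cudaMallocAsync
    print(torch.cuda.memory_allocated())
```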
comfyanonymous
3a150bad15
Only calculate randn in some samplers when it's actually being used.
2023-07-17 10:11:08 -04:00
comfyanonymous
ee8f8ee07f
Fix regression with ddim and uni_pc when batch size > 1.
2023-07-17 09:35:19 -04:00
comfyanonymous
3ded1a3a04
Refactor of sampler code to deal more easily with different model types.
2023-07-17 01:22:12 -04:00
comfyanonymous
ac9c038ac2
Merge branch 'master' of https://github.com/ComfyUI-Community/ComfyUI
2023-07-16 03:04:45 -04:00
comfyanonymous
5f57362613
Lower lora ram usage when in normal vram mode.
2023-07-16 02:59:04 -04:00
ComfyUI-Community
a8f3bbc35d
Patch del self.loaded_lora to prevent error with persistent lora_name swapping
2023-07-15 17:11:12 -07:00
comfyanonymous
490771b7f4
Speed up lora loading a bit.
2023-07-15 13:25:22 -04:00
comfyanonymous
50b1180dde
Fix CLIPSetLastLayer not reverting when removed.
2023-07-15 01:41:21 -04:00
comfyanonymous
6fb084f39d
Reduce floating point rounding errors in loras.
2023-07-15 00:53:00 -04:00
comfyanonymous
91ed2815d5
Add a node to merge CLIP models.
2023-07-14 02:41:18 -04:00
comfyanonymous
907c9fbf0d
Refactor to make it easier to set the api path.
2023-07-14 00:50:49 -04:00
comfyanonymous
30ea187160
Merge branch 'use-relative-paths' of https://github.com/mcmonkey4eva/ComfyUI
2023-07-13 23:56:29 -04:00
comfyanonymous
eed3042830
Move conditioning concat node to conditioning section.
2023-07-13 21:44:56 -04:00
comfyanonymous
8a577966c5
Enables a way to save workflows in API format from the frontend.
...
Enable the dev mode in the settings to see it.
2023-07-13 21:08:54 -04:00
comfyanonymous
bdba394290
Add a canny preprocessor node.
2023-07-13 13:26:48 -04:00
comfyanonymous
6f914fb77d
Print prestartup times for custom nodes.
2023-07-13 13:01:45 -04:00
comfyanonymous
3bc8be33e4
Don't let custom nodes overwrite base nodes.
2023-07-13 12:56:38 -04:00
comfyanonymous
876dadca84
Highlight nodes with errors in red even when workflow works fine.
2023-07-13 10:07:50 -04:00
comfyanonymous
b2f03164c7
Prevent the clip_g position_ids key from being saved in the checkpoint.
...
This is to make it match the official checkpoint.
2023-07-12 20:15:02 -04:00
comfyanonymous
46dc050c9f
Fix potential tensors being on different devices issues.
2023-07-12 19:29:27 -04:00
comfyanonymous
90aa597099
Add back roundRect to fix issue on firefox ESR.
2023-07-12 02:07:48 -04:00
KarryCharon
3e2309f149
Fix missing mps import.
2023-07-12 10:06:34 +08:00
comfyanonymous
f4b9390623
Add a random string to the temp prefix for PreviewImage.
2023-07-11 17:35:55 -04:00
comfyanonymous
2b2a1474f7
Move to litegraph.
2023-07-11 03:12:00 -04:00
comfyanonymous
cef30cc6b6
Merge branch 'hidpi-canvas' of https://github.com/EHfive/ComfyUI
2023-07-11 03:04:10 -04:00
comfyanonymous
880c9b928b
Update litegraph to latest.
2023-07-11 03:00:52 -04:00
Huang-Huang Bao
05e6eac7b3
Scale graph canvas based on DPI factor
...
Similar to fixes in litegraph.js editor demo:
3ef215cf11/editor/js/code.js (L19-L28)
Also works around the viewport problem of litegraph.js in DPI scaling scenarios.
Fixes #161
2023-07-11 14:47:58 +08:00
Dr.Lt.Data
99abcbef41
feat/startup-script: Feature to avoid package installation errors when installing custom nodes. (#856)
...
* Support a startup script so custom nodes can be installed on Windows without file locking issues.
* Modified: instead of executing scripts from the startup-scripts directory, execute the prestartup_script.py of each custom node.
2023-07-11 02:33:21 -04:00
comfyanonymous
606a537090
Support SDXL embedding format with 2 CLIP.
2023-07-10 10:34:59 -04:00
Alex "mcmonkey" Goodwin
5797ff89b0
use relative paths for all web connections
...
This enables local reverse proxies to host ComfyUI on a path, e.g. "http://example.com/ComfyUI/", in such a way that at least everything I tested works. Without this patch, proxying ComfyUI in this way will yield errors.
2023-07-10 02:09:03 -07:00
comfyanonymous
6ad0a6d7e2
Don't patch weights when multiplier is zero.
2023-07-09 17:46:56 -04:00
comfyanonymous
af15add967
Fix annoyance with textbox unselecting in chromium.
2023-07-09 15:41:19 -04:00
comfyanonymous
d5323d16e0
latent2rgb matrix for SDXL.
2023-07-09 13:59:09 -04:00
comfyanonymous
0ae81c03bb
Empty cache after model unloading for normal vram and lower.
2023-07-09 09:56:03 -04:00
comfyanonymous
d3f5998218
Support loading clip_g from diffusers in CLIP Loader nodes.
2023-07-09 09:33:53 -04:00
comfyanonymous
a9a4ba7574
Fix merging not working when model2 of model merge node was a merge.
2023-07-08 22:31:10 -04:00