Commit Graph

234 Commits

Author SHA1 Message Date
comfyanonymous 94680732d3 Empty cache on mps. 2023-06-01 03:52:51 -04:00
comfyanonymous 03da8a3426 This is useless for inference. 2023-05-31 13:03:24 -04:00
comfyanonymous eb448dd8e1 Auto load model in lowvram if not enough memory. 2023-05-30 12:36:41 -04:00
comfyanonymous b9818eb910 Add route to get safetensors metadata:
/view_metadata/loras?filename=lora.safetensors
2023-05-29 02:48:50 -04:00
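This route returns the user metadata stored in a .safetensors header without loading any tensors. A minimal sketch of reading that header directly (the file path is a placeholder; ComfyUI's actual route handler may differ):

```python
import json
import struct

def read_safetensors_metadata(path):
    """Read the JSON header of a .safetensors file without loading tensors.

    The format starts with an 8-byte little-endian length, followed by a JSON
    header; user-supplied metadata lives under the "__metadata__" key.
    """
    with open(path, "rb") as f:
        header_len = struct.unpack("<Q", f.read(8))[0]
        header = json.loads(f.read(header_len))
    return header.get("__metadata__", {})

# e.g. read_safetensors_metadata("models/loras/lora.safetensors")
```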
comfyanonymous a532888846 Support VAEs in diffusers format. 2023-05-28 02:02:09 -04:00
comfyanonymous 0fc483dcfd Refactor diffusers model convert code to be able to reuse it. 2023-05-28 01:55:40 -04:00
comfyanonymous eb4bd7711a Remove einops. 2023-05-25 18:42:56 -04:00
comfyanonymous 87ab25fac7 Do operations in same order as the one it replaces. 2023-05-25 18:31:27 -04:00
comfyanonymous 2b1fac9708 Merge branch 'master' of https://github.com/BlenderNeko/ComfyUI 2023-05-25 14:44:16 -04:00
comfyanonymous e1278fa925 Support old pytorch versions that don't have weights_only. 2023-05-25 13:30:59 -04:00
BlenderNeko 8b4b0c3188 vectorized bislerp 2023-05-25 19:23:47 +02:00
comfyanonymous b8ccbec6d8 Various improvements to bislerp. 2023-05-23 11:40:24 -04:00
comfyanonymous 34887b8885 Add experimental bislerp algorithm for latent upscaling.
It's like bilinear but with slerp.
2023-05-23 03:12:56 -04:00
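A rough sketch of the idea, not the repository's implementation: interpolate the direction of neighbouring latent vectors with slerp and their magnitudes linearly, then apply that along rows and columns the way bilinear does (the helper name and the parallel-vector cutoff are illustrative):

```python
import torch

def slerp(a, b, t, eps=1e-7):
    """Spherical interpolation between two batches of vectors (last dim)."""
    a_norm = a.norm(dim=-1, keepdim=True)
    b_norm = b.norm(dim=-1, keepdim=True)
    a_n = a / (a_norm + eps)
    b_n = b / (b_norm + eps)
    dot = (a_n * b_n).sum(dim=-1, keepdim=True).clamp(-1.0, 1.0)
    omega = torch.acos(dot)
    so = torch.sin(omega) + eps
    out = (torch.sin((1.0 - t) * omega) / so) * a_n + (torch.sin(t * omega) / so) * b_n
    # fall back to plain lerp when the vectors are (nearly) parallel
    lin = (1.0 - t) * a_n + t * b_n
    out = torch.where(dot > 0.9995, lin, out)
    # interpolate the magnitudes linearly
    return out * ((1.0 - t) * a_norm + t * b_norm)
```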
comfyanonymous 6cc450579b Auto transpose images from exif data. 2023-05-22 00:22:24 -04:00
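Assuming the image loader uses Pillow, the EXIF orientation tag can be applied in one call (the file name is a placeholder):

```python
from PIL import Image, ImageOps

img = Image.open("photo.jpg")           # placeholder path
img = ImageOps.exif_transpose(img)      # rotate/flip according to EXIF orientation
```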
comfyanonymous dc198650c0 sample_dpmpp_2m_sde no longer crashes when step == 1. 2023-05-21 11:34:29 -04:00
comfyanonymous 069657fbf3 Add DPM-Solver++(2M) SDE and exponential scheduler.
exponential scheduler is the one recommended with this sampler.
2023-05-21 01:46:03 -04:00
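The exponential schedule spaces noise levels evenly in log-space between sigma_max and sigma_min, as in k-diffusion; a sketch (the sigma values in the usage comment are illustrative, not ComfyUI's defaults):

```python
import math
import torch

def sigmas_exponential(n, sigma_min, sigma_max):
    """Noise levels spaced evenly in log-space, plus a final 0.0 to end sampling."""
    sigmas = torch.linspace(math.log(sigma_max), math.log(sigma_min), n).exp()
    return torch.cat([sigmas, sigmas.new_zeros(1)])

# e.g. sigmas_exponential(20, 0.0292, 14.6146)
```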
comfyanonymous b8636a44aa Make scaled_dot_product switch to sliced attention on OOM. 2023-05-20 16:01:02 -04:00
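The pattern is a try/except around the fused kernel that retries with a chunked implementation when CUDA memory runs out; a sketch where `sliced_attention` is a hypothetical callable, not a ComfyUI function:

```python
import torch
import torch.nn.functional as F

def attention_with_fallback(q, k, v, sliced_attention):
    """Try the fused kernel first; fall back to a sliced implementation on OOM.

    `sliced_attention` is a hypothetical callable that computes attention in
    chunks to bound peak memory.
    """
    try:
        return F.scaled_dot_product_attention(q, k, v)
    except torch.cuda.OutOfMemoryError:
        torch.cuda.empty_cache()
        return sliced_attention(q, k, v)
```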
comfyanonymous 797c4e8d3b Simplify and improve some vae attention code. 2023-05-20 15:07:21 -04:00
comfyanonymous ef815ba1e2 Switch default scheduler to normal. 2023-05-15 00:29:56 -04:00
comfyanonymous 68d12b530e Merge branch 'tiled_sampler' of https://github.com/BlenderNeko/ComfyUI 2023-05-14 15:39:39 -04:00
comfyanonymous 3a1f47764d Print the torch device that is used on startup. 2023-05-13 17:11:27 -04:00
BlenderNeko 1201d2eae5 Make nodes map over input lists (#579)
* allow nodes to map over lists
* make work with IS_CHANGED and VALIDATE_INPUTS
* give list outputs distinct socket shape
* add rebatch node
* add batch index logic
* add repeat latent batch
* deal with noise mask edge cases in latentfrombatch
2023-05-13 11:15:45 -04:00
BlenderNeko 19c014f429 comment out annoying print statement 2023-05-12 23:57:40 +02:00
BlenderNeko d9e088ddfd minor changes for tiled sampler 2023-05-12 23:49:09 +02:00
comfyanonymous f7c0f75d1f Auto batching improvements.
Try batching when cond sizes don't match with smart padding.
2023-05-10 13:59:24 -04:00
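The idea is to pad conditioning tensors along the token dimension to a common length so they can be run as one batch; a simplified sketch (zero-padding here stands in for ComfyUI's smarter padding):

```python
import torch

def pad_and_batch_conds(conds):
    """Pad [B, tokens, dim] conditioning tensors to the longest token length,
    then concatenate them into a single batch."""
    max_len = max(c.shape[1] for c in conds)
    padded = []
    for c in conds:
        if c.shape[1] < max_len:
            pad = torch.zeros(c.shape[0], max_len - c.shape[1], c.shape[2],
                              dtype=c.dtype, device=c.device)
            c = torch.cat([c, pad], dim=1)
        padded.append(c)
    return torch.cat(padded, dim=0)
```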
comfyanonymous 314e526c5c Not needed anymore because sampling works with any latent size. 2023-05-09 12:18:18 -04:00
comfyanonymous c6e34963e4 Make t2i adapter work with any latent resolution. 2023-05-08 18:15:19 -04:00
comfyanonymous a1f12e370d Merge branch 'autostart' of https://github.com/EllangoK/ComfyUI 2023-05-07 17:19:03 -04:00
comfyanonymous 6fc4917634 Make maximum_batch_area take into account pytorch 2.0 attention function.
More conservative xformers maximum_batch_area.
2023-05-06 19:58:54 -04:00
comfyanonymous 678f933d38 maximum_batch_area for xformers.
Remove useless code.
2023-05-06 19:28:46 -04:00
EllangoK 8e03c789a2 auto-launch cli arg 2023-05-06 16:59:40 -04:00
comfyanonymous cb1551b819 Lowvram mode for gligen and fix some lowvram issues. 2023-05-05 18:11:41 -04:00
comfyanonymous af9cc1fb6a Search recursively in subfolders for embeddings. 2023-05-05 01:28:48 -04:00
comfyanonymous 6ee11d7bc0 Fix import. 2023-05-05 00:19:35 -04:00
comfyanonymous bae4fb4a9d Fix imports. 2023-05-04 18:10:29 -04:00
comfyanonymous fcf513e0b6 Refactor. 2023-05-03 17:48:35 -04:00
comfyanonymous a74e176a24 Merge branch 'tiled-progress' of https://github.com/pythongosssss/ComfyUI 2023-05-03 16:24:56 -04:00
pythongosssss 5eeecf3fd5 remove unused import 2023-05-03 18:21:23 +01:00
pythongosssss 8912623ea9 use comfy progress bar 2023-05-03 18:19:22 +01:00
comfyanonymous 908dc1d5a8 Add a total_steps value to sampler callback. 2023-05-03 12:58:10 -04:00
pythongosssss fdf57325f4 Merge remote-tracking branch 'origin/master' into tiled-progress 2023-05-03 17:33:42 +01:00
pythongosssss 27df74101e reduce duplication 2023-05-03 17:33:19 +01:00
comfyanonymous 93c64afaa9 Use sampler callback instead of tqdm hook for progress bar. 2023-05-02 23:00:49 -04:00
pythongosssss 06ad35b493 added progress to encode + upscale 2023-05-02 19:18:07 +01:00
comfyanonymous ba8a4c3667 Change latent resolution step to 8. 2023-05-02 14:17:51 -04:00
comfyanonymous 66c8aa5c3e Make unet work with any input shape. 2023-05-02 13:31:43 -04:00
comfyanonymous 9c335a553f LoKR support. 2023-05-01 18:18:23 -04:00
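LoKR factors the weight update as a Kronecker product of two small matrices; a rough sketch of that core idea only, not ComfyUI's loader (the alpha/rank scaling mirrors LoRA conventions):

```python
import torch

def lokr_delta(w1, w2, alpha, rank):
    """Weight delta built from the Kronecker product of two small factors.

    `w1` and `w2` are the trained LoKR factors; kron(w1, w2) has shape
    (w1.rows * w2.rows, w1.cols * w2.cols), which should match the target weight.
    """
    return torch.kron(w1, w2) * (alpha / rank)
```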
comfyanonymous d3293c8339 Properly disable all progress bars when disable_pbar=True 2023-05-01 15:52:17 -04:00
BlenderNeko a2e18b1504 allow disabling of progress bar when sampling 2023-04-30 18:59:58 +02:00
comfyanonymous 071011aebe Mask strength should be separate from area strength. 2023-04-29 20:06:53 -04:00