I confirmed CPU-only inference works fine on a PyTorch build without CUDA.

comfyanonymous 2023-03-09 11:40:24 -05:00
parent 4ed39cf038
commit 9ce300ab59
1 changed file with 2 additions and 1 deletion


@@ -12,7 +12,8 @@ This ui will let you design and execute advanced stable diffusion pipelines usin
 - Fully supports SD1.x and SD2.x
 - Asynchronous Queue system
 - Many optimizations: Only re-executes the parts of the workflow that changes between executions.
-- Command line option: ```--lowvram``` to make it work on GPUs with less than 3GB vram.
+- Command line option: ```--lowvram``` to make it work on GPUs with less than 3GB vram (enabled automatically on GPUs with low vram)
+- Works even if you don't have a GPU with: ```--cpu``` (slow)
 - Can load both ckpt and safetensors models/checkpoints. Standalone VAEs and CLIP models.
 - Embeddings/Textual inversion
 - Loras
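
As a minimal illustration of the CPU-only behavior this commit documents (a sketch only, not ComfyUI's actual device-selection code; the flag handling and variable names here are assumptions for the example), a `--cpu`-style option or a PyTorch build without CUDA can both lead to inference on the CPU device:

```python
import argparse

import torch

# Hypothetical sketch: pick the inference device the way a --cpu flag might.
parser = argparse.ArgumentParser()
parser.add_argument("--cpu", action="store_true",
                    help="Force CPU-only inference (slow).")
args = parser.parse_args()

if args.cpu or not torch.cuda.is_available():
    # Works on a PyTorch build compiled without CUDA support.
    device = torch.device("cpu")
else:
    device = torch.device("cuda")

print(f"Running inference on: {device}")
```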