space-nuko
73e85fb3f4
Improve error output for failed nodes
2023-05-27 21:06:07 -05:00
comfyanonymous
48fcc5b777
Fix parsing error crash.
2023-05-22 20:51:30 -04:00
comfyanonymous
ffc56c53c9
Add a node_errors to the /prompt error json response.
...
"node_errors" contains a dict keyed by node ids. The contents are a message
and a list of dependent outputs.
2023-05-22 13:22:38 -04:00
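The response shape described above can be sketched in Python. The entry field names ("message", "dependent_outputs") and the helper function are illustrative assumptions, not the actual API:

```python
# Hypothetical /prompt error response matching the shape described above:
# "node_errors" is a dict keyed by node id; the field names inside each
# entry are assumptions for illustration.
def summarize_node_errors(response):
    """Flatten node_errors into human-readable lines."""
    return [
        f"node {node_id}: {err['message']} (affects {err['dependent_outputs']})"
        for node_id, err in response.get("node_errors", {}).items()
    ]

response = {
    "error": "Prompt validation failed",
    "node_errors": {
        "7": {
            "message": "Required input is missing",
            "dependent_outputs": ["9", "12"],
        },
    },
}

for line in summarize_node_errors(response):
    print(line)
```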
comfyanonymous
516119ad83
Print min and max values in validation error message.
2023-05-21 00:24:28 -04:00
comfyanonymous
1dd846a7ba
Fix outputs gone from history.
2023-05-15 00:27:28 -04:00
comfyanonymous
9bf67c4c5a
Print prompt execution time.
2023-05-14 01:34:25 -04:00
comfyanonymous
44f9f9baf1
Add the prompt id to some websocket messages.
2023-05-13 11:17:16 -04:00
BlenderNeko
1201d2eae5
Make nodes map over input lists (#579)
...
* allow nodes to map over lists
* make work with IS_CHANGED and VALIDATE_INPUTS
* give list outputs distinct socket shape
* add rebatch node
* add batch index logic
* add repeat latent batch
* deal with noise mask edge cases in latentfrombatch
2023-05-13 11:15:45 -04:00
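A rough sketch of the map-over-lists semantics described in this commit: list-valued inputs are zipped and the node function is applied once per index. The broadcasting rule for shorter lists (repeat the last element) is an assumption of this sketch, and the real implementation also covers IS_CHANGED and VALIDATE_INPUTS:

```python
def map_over_input_lists(func, input_lists):
    """Apply `func` element-wise over list-valued inputs.
    Shorter lists repeat their last element (an assumption of this
    sketch; the real broadcasting rules may differ)."""
    max_len = max(len(v) for v in input_lists.values())
    results = []
    for i in range(max_len):
        kwargs = {k: v[min(i, len(v) - 1)] for k, v in input_lists.items()}
        results.append(func(**kwargs))
    return results

# e.g. a node that adds two inputs, mapped over lists:
print(map_over_input_lists(lambda a, b: a + b, {"a": [1, 2, 3], "b": [10]}))
# -> [11, 12, 13]
```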
comfyanonymous
dfc74c19d9
Add the prompt_id to some websocket messages.
2023-05-11 01:22:40 -04:00
comfyanonymous
3a7c3acc72
Send websocket message with list of cached nodes right before execution.
2023-05-10 15:59:24 -04:00
comfyanonymous
602095f614
Send execution_error message on websocket on execution exception.
2023-05-10 15:49:49 -04:00
comfyanonymous
d6dee8af1d
Only validate each input once.
2023-05-10 00:29:31 -04:00
comfyanonymous
02ca1c67f8
Don't print traceback when processing interrupted.
2023-05-09 23:51:52 -04:00
comfyanonymous
3a1f9dba20
If IS_CHANGED raises an exception, delete the output instead of crashing.
2023-04-26 02:13:56 -04:00
comfyanonymous
951c0c2bbe
Don't keep cached outputs for removed nodes.
2023-04-26 02:05:57 -04:00
comfyanonymous
0ac319fd81
Don't delete all outputs when execution gets interrupted.
2023-04-23 22:44:38 -04:00
comfyanonymous
ccad603b2e
Add a way for nodes to validate their own inputs.
2023-04-23 16:03:26 -04:00
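A minimal sketch of a node validating its own inputs. The class and input names are hypothetical; the convention assumed here is a VALIDATE_INPUTS classmethod that returns True on success or an error string on failure:

```python
class ClampedIntNode:
    """Hypothetical node demonstrating self-validation of inputs."""

    @classmethod
    def INPUT_TYPES(cls):
        return {"required": {"value": ("INT", {"min": 0, "max": 100})}}

    @classmethod
    def VALIDATE_INPUTS(cls, value):
        # Return True if valid, or a descriptive error string if not.
        if not (0 <= value <= 100):
            return f"value {value} outside allowed range 0..100"
        return True
```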
ltdrdata
f7a8218814
Add clipspace feature. (#541)
...
* Add clipspace feature.
* feat: copy content to clipspace
* feat: paste content from clipspace
Extend validation to allow validating annotated_path in addition to other parameters.
Add support for annotated_filepath in the folder_paths function.
Generalize the '/upload/image' API to allow uploading images to the 'input', 'temp', or 'output' directories.
* rename contentClipboard -> clipspace
* Do deep copy for imgs on copy to clipspace.
* add original_imgs into clipspace
* Preserve the original image when 'imgs' are modified
* robust patch and refactoring of folder_paths for annotated_filepath
* Only show the Paste menu if the ComfyApp.clipspace is not empty
* instant refresh on paste
force triggering 'changed' on paste action
* subfolder fix on paste logic
attach subfolder if subfolder isn't empty
---------
Co-authored-by: Lt.Dr.Data <lt.dr.data@gmail.com>
2023-04-23 15:58:55 -04:00
comfyanonymous
deb2b93e79
Move the code that empties the GPU cache to model_management.py
2023-04-15 11:19:07 -04:00
藍+85CD
d63705d919
Support releasing all unoccupied cached memory from XPU
2023-04-15 15:50:51 +08:00
pythongosssss
6f72c4c6ff
Allow nodes to return UI data and output data
...
Fire the executed event on the node when the message is received
2023-03-29 18:53:24 +01:00
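The combined return described above can be sketched as follows. The node name, input, and "images" payload are hypothetical; the assumed convention is a dict with a "ui" key (sent to the frontend with the executed event) and a "result" tuple (passed to downstream nodes):

```python
class PreviewAndPassThrough:
    """Hypothetical node returning both UI data and output data."""

    RETURN_TYPES = ("IMAGE",)
    FUNCTION = "run"

    def run(self, image):
        # "ui" goes to the frontend; "result" feeds downstream nodes.
        return {"ui": {"images": [image]}, "result": (image,)}
```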
Davemane42
1e0f2b232b
Add unique_id to nodes' hidden inputs
...
@classmethod
def INPUT_TYPES(cls):
    return {
        "hidden": {"unique_id": "UNIQUE_ID"},
    }
2023-03-28 02:52:12 -04:00
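A sketch of a node consuming that hidden input. The class name and function are hypothetical; the assumption is that the executor fills unique_id with the node's id in the graph and passes it as a keyword argument:

```python
class ReportsOwnId:
    """Hypothetical node using the hidden unique_id input."""

    @classmethod
    def INPUT_TYPES(cls):
        return {"hidden": {"unique_id": "UNIQUE_ID"}}

    RETURN_TYPES = ("STRING",)
    FUNCTION = "run"

    def run(self, unique_id):
        # unique_id is supplied by the executor, not wired by the user.
        return (f"running as node {unique_id}",)
```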
comfyanonymous
bb1223d83f
Fix errors appearing more than once.
2023-03-27 02:16:58 -04:00
comfyanonymous
3444ffff3b
Fix IS_CHANGED not working on nodes with an input from another node.
2023-03-27 01:56:22 -04:00
comfyanonymous
f67c00622f
Use inference_mode instead of no_grad.
2023-03-22 03:48:26 -04:00
pythongosssss
5c55c93367
Update to reuse the session id if available
2023-03-07 13:24:15 +00:00
comfyanonymous
c8ce599a8f
Add a button to interrupt processing to the ui.
2023-03-02 15:24:51 -05:00
comfyanonymous
69cc75fbf8
Add a way to interrupt current processing in the backend.
2023-03-02 14:42:03 -05:00
comfyanonymous
5f0f97634f
Only clear cuda cache on CUDA since it causes slowdowns on ROCm.
2023-02-28 13:39:30 -05:00
comfyanonymous
cd85f876f2
Try to clear more memory at the end of each prompt execution.
2023-02-28 11:56:33 -05:00
comfyanonymous
49d2e5bb5a
Move some stuff from main.py to execution.py
2023-02-27 19:44:58 -05:00