Commit Graph

50 Commits

Author SHA1 Message Date
comfyanonymous 20d3852aa1 Pull some small changes from the other repo. 2023-10-11 20:38:48 -04:00
pythongosssss 62799c8585 fix crash on node with VALIDATE_INPUTS and actual inputs 2023-09-07 18:42:21 +01:00
comfyanonymous 89a0767abf Smarter memory management.
Try to keep models on the vram when possible.

Better lowvram mode for controlnets.
2023-08-17 01:06:34 -04:00
Michael Poutre 90b0163524 fix(execution): Fix support for input-less nodes 2023-08-01 12:29:01 -07:00
Michael Poutre 7785d073f0 chore: Fix typo 2023-08-01 12:27:50 -07:00
comfyanonymous 09386a3697 Fix issue with lora in some cases when combined with model merging. 2023-07-21 21:27:27 -04:00
comfyanonymous 6e9f28401f Persist node instances between executions instead of deleting them.
If the same node id with the same class exists between two executions the
same instance will be used.

This means you can now cache things in nodes for more efficiency.
2023-06-29 23:38:56 -04:00
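The caching this commit enables can be sketched as a custom node that memoizes expensive work on `self`. The `INPUT_TYPES`/`RETURN_TYPES`/`FUNCTION` conventions are ComfyUI's; the node itself and its slow-I/O stand-in are made up for illustration.

```python
class CachedTextLoader:
    """Illustrative custom node: because the executor now reuses the same
    instance across executions (same node id, same class), state stored on
    `self` survives between runs."""

    @classmethod
    def INPUT_TYPES(cls):
        return {"required": {"path": ("STRING", {"default": ""})}}

    RETURN_TYPES = ("STRING",)
    FUNCTION = "load"

    def __init__(self):
        self._cache = {}  # persists for as long as this node id/class pair does

    def load(self, path):
        if path not in self._cache:
            # Stand-in for slow work (disk read, model load, ...)
            self._cache[path] = f"contents of {path}"
        return (self._cache[path],)
```

On the second execution of the same workflow, `load` returns the cached value instead of redoing the work.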
comfyanonymous d52ed407a7 Send websocket message only when prompt is actually done executing. 2023-06-13 13:38:43 -04:00
comfyanonymous af91df85c2 Add a /history/{prompt_id} endpoint. 2023-06-12 14:34:30 -04:00
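A client can query the endpoint added here with the standard library alone; a minimal sketch, assuming ComfyUI's usual default address of `127.0.0.1:8188` (the response shape is whatever the server returns, parsed as JSON):

```python
import json
import urllib.request

def history_url(prompt_id, server="http://127.0.0.1:8188"):
    """Build the URL for the /history/{prompt_id} endpoint."""
    return f"{server}/history/{prompt_id}"

def fetch_history(prompt_id, server="http://127.0.0.1:8188"):
    """Fetch and decode the execution history for one prompt."""
    with urllib.request.urlopen(history_url(prompt_id, server)) as resp:
        return json.loads(resp.read())
```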
comfyanonymous ad81fd682a Fix issue with cancelling prompt. 2023-05-28 00:32:26 -04:00
space-nuko 03f2d0a764 Rename exception message field 2023-05-27 21:06:07 -05:00
space-nuko 52c9590b7b Exception message 2023-05-27 21:06:07 -05:00
space-nuko 62bdd9d26a Catch typecast errors 2023-05-27 21:06:07 -05:00
space-nuko a9e7e23724 Fix 2023-05-27 21:06:07 -05:00
space-nuko e2d080b694 Return null for value format 2023-05-27 21:06:07 -05:00
space-nuko 6b2a8a3845 Show message in the frontend if prompt execution raises an exception 2023-05-27 21:06:07 -05:00
space-nuko ffec815257 Send back more information about exceptions that happen during execution 2023-05-27 21:06:07 -05:00
space-nuko 0d834e3a2b Add missing input name/config 2023-05-27 21:06:07 -05:00
space-nuko c33b7c5549 Improve invalid prompt error message 2023-05-27 21:06:07 -05:00
space-nuko 73e85fb3f4 Improve error output for failed nodes 2023-05-27 21:06:07 -05:00
comfyanonymous 48fcc5b777 Parsing error crash. 2023-05-22 20:51:30 -04:00
comfyanonymous ffc56c53c9 Add a node_errors to the /prompt error json response.
"node_errors" contains a dict keyed by node ids. The contents are a message
and a list of dependent outputs.
2023-05-22 13:22:38 -04:00
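Per the commit message, `node_errors` is a dict keyed by node id whose values carry a message and a list of dependent outputs. A sketch of walking such a response; the inner field names (`message`, `dependent_outputs`) are assumptions for illustration, only the `node_errors` key is stated in the commit:

```python
def summarize_node_errors(error_response):
    """Flatten a /prompt error response's node_errors dict into
    human-readable lines, one per failing node."""
    lines = []
    for node_id, info in error_response.get("node_errors", {}).items():
        deps = ", ".join(str(o) for o in info.get("dependent_outputs", []))
        lines.append(f"node {node_id}: {info.get('message', '?')} (affects: {deps})")
    return lines
```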
comfyanonymous 516119ad83 Print min and max values in validation error message. 2023-05-21 00:24:28 -04:00
comfyanonymous 1dd846a7ba Fix outputs gone from history. 2023-05-15 00:27:28 -04:00
comfyanonymous 9bf67c4c5a Print prompt execution time. 2023-05-14 01:34:25 -04:00
comfyanonymous 44f9f9baf1 Add the prompt id to some websocket messages. 2023-05-13 11:17:16 -04:00
BlenderNeko 1201d2eae5 Make nodes map over input lists (#579)
* allow nodes to map over lists

* make work with IS_CHANGED and VALIDATE_INPUTS

* give list outputs distinct socket shape

* add rebatch node

* add batch index logic

* add repeat latent batch

* deal with noise mask edge cases in latentfrombatch
2023-05-13 11:15:45 -04:00
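The map-over-lists behavior from #579 can be modeled as calling the node function once per list element, broadcasting scalar inputs. This is a simplified standalone sketch of the idea, not the actual executor code:

```python
def map_node_over_list(func, inputs):
    """If any input is a list, call func per element and collect the
    results into a list output; scalar inputs are reused on every call."""
    list_lens = [len(v) for v in inputs.values() if isinstance(v, list)]
    if not list_lens:
        return [func(**inputs)]  # no list inputs: single call
    n = max(list_lens)
    results = []
    for i in range(n):
        # Shorter lists wrap around, scalars broadcast
        call = {k: (v[i % len(v)] if isinstance(v, list) else v)
                for k, v in inputs.items()}
        results.append(func(**call))
    return results
```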
comfyanonymous dfc74c19d9 Add the prompt_id to some websocket messages. 2023-05-11 01:22:40 -04:00
comfyanonymous 3a7c3acc72 Send websocket message with list of cached nodes right before execution. 2023-05-10 15:59:24 -04:00
comfyanonymous 602095f614 Send execution_error message on websocket on execution exception. 2023-05-10 15:49:49 -04:00
comfyanonymous d6dee8af1d Only validate each input once. 2023-05-10 00:29:31 -04:00
comfyanonymous 02ca1c67f8 Don't print traceback when processing interrupted. 2023-05-09 23:51:52 -04:00
comfyanonymous 3a1f9dba20 If IS_CHANGED returns exception delete the output instead of crashing. 2023-04-26 02:13:56 -04:00
comfyanonymous 951c0c2bbe Don't keep cached outputs for removed nodes. 2023-04-26 02:05:57 -04:00
comfyanonymous 0ac319fd81 Don't delete all outputs when execution gets interrupted. 2023-04-23 22:44:38 -04:00
comfyanonymous ccad603b2e Add a way for nodes to validate their own inputs. 2023-04-23 16:03:26 -04:00
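The self-validation hook added here is a `VALIDATE_INPUTS` classmethod that returns `True` for valid inputs or an error string otherwise. A sketch with a made-up node (the hook name matches the commit; the range check is illustrative):

```python
class ClampedIntNode:
    """Illustrative node that validates its own input via VALIDATE_INPUTS."""

    @classmethod
    def INPUT_TYPES(cls):
        return {"required": {"value": ("INT", {"default": 0})}}

    RETURN_TYPES = ("INT",)
    FUNCTION = "passthrough"

    @classmethod
    def VALIDATE_INPUTS(cls, value):
        # Return True when valid, or a string describing the problem
        if not (0 <= value <= 100):
            return f"value {value} out of range [0, 100]"
        return True

    def passthrough(self, value):
        return (value,)
```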
ltdrdata f7a8218814 Add clipspace feature. (#541)
* Add clipspace feature.
* feat: copy content to clipspace
* feat: paste content from clipspace

Extend validation to allow for validating annotated_path in addition to other parameters.

Add support for annotated_filepath in folder_paths function.

Generalize the '/upload/image' API to allow for uploading images to the 'input', 'temp', or 'output' directories.

* rename contentClipboard -> clipspace

* Do deep copy for imgs on copy to clipspace.

* add original_imgs into clipspace
* Preserve the original image when 'imgs' are modified

* robust patch & refactoring folder_paths about annotated_filepath

* Only show the Paste menu if the ComfyApp.clipspace is not empty

* instant refresh on paste

force triggering 'changed' on paste action

* subfolder fix on paste logic

attach subfolder if subfolder isn't empty

---------

Co-authored-by: Lt.Dr.Data <lt.dr.data@gmail.com>
2023-04-23 15:58:55 -04:00
comfyanonymous deb2b93e79 Move code to empty gpu cache to model_management.py 2023-04-15 11:19:07 -04:00
藍+85CD d63705d919 Support releases all unoccupied cached memory from XPU 2023-04-15 15:50:51 +08:00
pythongosssss 6f72c4c6ff Allows nodes to return ui data and output data
Fire executed event on node when message received
2023-03-29 18:53:24 +01:00
Davemane42 1e0f2b232b add unique_id to nodes hidden inputs
    @classmethod
    def INPUT_TYPES(cls):
        return {
            "hidden": {"unique_id": "UNIQUE_ID"},
        }
2023-03-28 02:52:12 -04:00
comfyanonymous bb1223d83f Fix errors appearing more than once. 2023-03-27 02:16:58 -04:00
comfyanonymous 3444ffff3b Fix IS_CHANGED not working on nodes with an input from another node. 2023-03-27 01:56:22 -04:00
comfyanonymous f67c00622f Use inference_mode instead of no_grad. 2023-03-22 03:48:26 -04:00
pythongosssss 5c55c93367 Updated to reuse session id if available 2023-03-07 13:24:15 +00:00
comfyanonymous c8ce599a8f Add a button to interrupt processing to the ui. 2023-03-02 15:24:51 -05:00
comfyanonymous 69cc75fbf8 Add a way to interrupt current processing in the backend. 2023-03-02 14:42:03 -05:00
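The backend interrupt added here can be modeled as a flag set from the server thread and polled by the executor between nodes. This is a standalone sketch with illustrative names, not ComfyUI's actual implementation:

```python
import threading

class InterruptFlag:
    """Thread-safe interrupt signal: the server sets it, the executor
    polls it between nodes and aborts the rest of the prompt."""
    def __init__(self):
        self._event = threading.Event()

    def interrupt(self):
        self._event.set()

    def check(self):
        if self._event.is_set():
            self._event.clear()  # consume the interrupt
            raise InterruptedError("processing interrupted")

def run_prompt(nodes, flag):
    """Execute callables in order, stopping early if interrupted
    but keeping the outputs already produced."""
    done = []
    for node in nodes:
        try:
            flag.check()
        except InterruptedError:
            break  # skip remaining nodes, keep finished outputs
        done.append(node())
    return done
```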
comfyanonymous 5f0f97634f Only clear cuda cache on CUDA since it causes slowdowns on ROCm. 2023-02-28 13:39:30 -05:00
comfyanonymous cd85f876f2 Try to clear more memory at the end of each prompt execution. 2023-02-28 11:56:33 -05:00
comfyanonymous 49d2e5bb5a Move some stuff from main.py to execution.py 2023-02-27 19:44:58 -05:00