Commit Graph

63 Commits

Author SHA1 Message Date
comfyanonymous deb2b93e79 Move code to empty gpu cache to model_management.py 2023-04-15 11:19:07 -04:00
藍+85CD d63705d919 Support releasing all unoccupied cached memory from XPU 2023-04-15 15:50:51 +08:00
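The two commits above centralize GPU cache clearing in model_management.py and extend it to Intel XPU devices. A minimal sketch of what such a helper could look like; the function name soft_empty_cache and the exact calls are assumptions for illustration, not the repository's actual code.

    import torch

    def soft_empty_cache():
        # Hypothetical helper: release unoccupied cached device memory.
        if torch.cuda.is_available():
            # Frees cached blocks held by the CUDA caching allocator.
            torch.cuda.empty_cache()
            torch.cuda.ipc_collect()
        elif hasattr(torch, "xpu") and torch.xpu.is_available():
            # Intel XPU (via intel_extension_for_pytorch) exposes a matching call.
            torch.xpu.empty_cache()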
pythongosssss 6f72c4c6ff Allows nodes to return ui data and output data
Fire executed event on node when message received
2023-03-29 18:53:24 +01:00
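This commit lets a node hand data back to the frontend alongside its normal outputs. A hedged sketch of the resulting return convention, using an illustrative node class (the class and field names are made up for the example):

    class SaveTextExample:
        @classmethod
        def INPUT_TYPES(cls):
            return {"required": {"text": ("STRING", {"default": ""})}}

        RETURN_TYPES = ("STRING",)
        FUNCTION = "run"
        OUTPUT_NODE = True

        def run(self, text):
            # "ui" is forwarded to the browser with the executed event;
            # "result" is passed to downstream nodes as before.
            return {"ui": {"text": [text]}, "result": (text,)}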
Davemane42 1e0f2b232b add unique_id to nodes' hidden inputs
    @classmethod
    def INPUT_TYPES(cls):
        return {
            "hidden": {"unique_id": "UNIQUE_ID"},
        }
2023-03-28 02:52:12 -04:00
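Declaring unique_id under "hidden" (as in the snippet above) makes the executor pass the node's own graph id into the node function. An illustrative continuation, with a hypothetical class and method name:

    class ShowMyId:
        @classmethod
        def INPUT_TYPES(cls):
            return {
                "required": {},
                "hidden": {"unique_id": "UNIQUE_ID"},
            }

        RETURN_TYPES = ("STRING",)
        FUNCTION = "run"

        def run(self, unique_id):
            # unique_id is the id of this node instance in the submitted prompt.
            return (str(unique_id),)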
comfyanonymous bb1223d83f Fix errors appearing more than once. 2023-03-27 02:16:58 -04:00
comfyanonymous 3444ffff3b Fix IS_CHANGED not working on nodes with an input from another node. 2023-03-27 01:56:22 -04:00
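IS_CHANGED lets a node signal that it must re-execute even when its literal inputs are unchanged; this commit fixes that check for nodes whose inputs come from other nodes. A hedged sketch of a typical IS_CHANGED implementation (the file-loading node is invented for the example):

    import hashlib

    class LoadTextFileExample:
        @classmethod
        def INPUT_TYPES(cls):
            return {"required": {"path": ("STRING", {"default": "input.txt"})}}

        RETURN_TYPES = ("STRING",)
        FUNCTION = "load"

        @classmethod
        def IS_CHANGED(cls, path):
            # Returning a different value forces re-execution, so hash the file
            # contents rather than relying on the path input alone.
            with open(path, "rb") as f:
                return hashlib.sha256(f.read()).hexdigest()

        def load(self, path):
            with open(path, "r") as f:
                return (f.read(),)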
comfyanonymous f67c00622f Use inference_mode instead of no_grad. 2023-03-22 03:48:26 -04:00
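Both context managers disable gradient tracking during sampling, but inference_mode also skips autograd's version-counter and view bookkeeping, which is slightly faster. A small self-contained comparison (toy model, not from the repository):

    import torch

    model = torch.nn.Linear(4, 4)
    x = torch.randn(1, 4)

    with torch.no_grad():
        y1 = model(x)   # gradients disabled; tensors remain usable in autograd later

    with torch.inference_mode():
        y2 = model(x)   # faster, but y2 is an inference tensor and cannot
                        # participate in autograd afterwards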
pythongosssss 5c55c93367 Updated to reuse session id if available 2023-03-07 13:24:15 +00:00
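Reusing the session id lets a reconnecting browser keep receiving progress messages for a prompt it already queued. A hedged aiohttp sketch of the idea; the handler name, the "clientId" query parameter, and the app["sockets"] registry are assumptions for illustration:

    import uuid
    from aiohttp import web

    async def websocket_handler(request):
        ws = web.WebSocketResponse()
        await ws.prepare(request)
        # Reuse the id the client passes back on reconnect, otherwise mint a new one.
        sid = request.rel_url.query.get("clientId") or uuid.uuid4().hex
        request.app["sockets"][sid] = ws
        await ws.send_json({"type": "status", "data": {"sid": sid}})
        async for _msg in ws:
            pass
        return ws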
comfyanonymous c8ce599a8f Add a button to interrupt processing to the ui. 2023-03-02 15:24:51 -05:00
comfyanonymous 69cc75fbf8 Add a way to interrupt current processing in the backend. 2023-03-02 14:42:03 -05:00
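The usual pattern for this kind of interruption is a flag that the UI endpoint sets and the executor checks between units of work, raising an exception to unwind the current prompt. A minimal sketch under that assumption (names are illustrative, not necessarily those used in the codebase):

    import threading

    class InterruptProcessingException(Exception):
        pass

    _interrupt_lock = threading.Lock()
    _interrupt_requested = False

    def interrupt_current_processing(value=True):
        # Called from the endpoint wired to the new UI interrupt button.
        global _interrupt_requested
        with _interrupt_lock:
            _interrupt_requested = value

    def throw_if_interrupted():
        # Called by the executor between nodes / sampling steps.
        global _interrupt_requested
        with _interrupt_lock:
            if _interrupt_requested:
                _interrupt_requested = False
                raise InterruptProcessingException()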
comfyanonymous 5f0f97634f Only clear cuda cache on CUDA since it causes slowdowns on ROCm. 2023-02-28 13:39:30 -05:00
comfyanonymous cd85f876f2 Try to clear more memory at the end of each prompt execution. 2023-02-28 11:56:33 -05:00
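Taken together, the two commits above amount to: collect garbage after every prompt, but only empty the CUDA cache on genuine CUDA builds, since doing so on ROCm was causing slowdowns. A hedged sketch of that logic:

    import gc
    import torch

    def cleanup_after_prompt():
        gc.collect()
        if torch.cuda.is_available():
            # torch.version.hip is set on ROCm builds and None on CUDA builds.
            if torch.version.hip is None:
                torch.cuda.empty_cache()
                torch.cuda.ipc_collect()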
comfyanonymous 49d2e5bb5a Move some stuff from main.py to execution.py 2023-02-27 19:44:58 -05:00