SaveLatent¶
Documentation¶
- Class name:
SaveLatent
- Category:
_for_testing
- Output node:
True
The SaveLatent node saves latent representations to the output directory. It abstracts the process of persisting latent tensors to disk so they can be reloaded for future computations or analyses.
Input types¶
Required¶
samples
- The 'samples' parameter holds the latent representations to be saved: a dictionary whose 'samples' key maps to the latent tensor that will be persisted for future use.
- Comfy dtype:
LATENT
- Python dtype:
Dict[str, torch.Tensor]
filename_prefix
- The 'filename_prefix' parameter specifies a prefix (optionally including a subfolder, e.g. 'latents/ComfyUI') for the filename under which the latent representations will be saved. This helps organize and retrieve saved latents.
- Comfy dtype:
STRING
- Python dtype:
str
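A minimal sketch of what these inputs look like in Python. A LATENT value is a dict whose 'samples' key holds a 4D tensor (this matches the `samples["samples"]` access in the source code below); the 512x512 image shape and the zero tensor here are illustrative assumptions:

```python
import torch

# A LATENT value is a dict whose "samples" key holds a 4D tensor of
# shape [batch, channels, height // 8, width // 8]. A 512x512 SD image
# gives a 1x4x64x64 latent (illustrative assumption).
samples = {"samples": torch.zeros(1, 4, 64, 64)}

# The prefix may include a subfolder relative to the output directory.
filename_prefix = "latents/ComfyUI"
```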
Output types¶
The node has no output types.
Usage tips¶
- Infra type:
CPU
- Common nodes: unknown
Source code¶
class SaveLatent:
    def __init__(self):
        self.output_dir = folder_paths.get_output_directory()

    @classmethod
    def INPUT_TYPES(s):
        return {"required": {"samples": ("LATENT", ),
                             "filename_prefix": ("STRING", {"default": "latents/ComfyUI"})},
                "hidden": {"prompt": "PROMPT", "extra_pnginfo": "EXTRA_PNGINFO"},
                }

    RETURN_TYPES = ()
    FUNCTION = "save"
    OUTPUT_NODE = True

    CATEGORY = "_for_testing"

    def save(self, samples, filename_prefix="ComfyUI", prompt=None, extra_pnginfo=None):
        full_output_folder, filename, counter, subfolder, filename_prefix = folder_paths.get_save_image_path(filename_prefix, self.output_dir)

        # support save metadata for latent sharing
        prompt_info = ""
        if prompt is not None:
            prompt_info = json.dumps(prompt)

        metadata = None
        if not args.disable_metadata:
            metadata = {"prompt": prompt_info}
            if extra_pnginfo is not None:
                for x in extra_pnginfo:
                    metadata[x] = json.dumps(extra_pnginfo[x])

        file = f"{filename}_{counter:05}_.latent"

        results = list()
        results.append({
            "filename": file,
            "subfolder": subfolder,
            "type": "output"
        })

        file = os.path.join(full_output_folder, file)

        output = {}
        output["latent_tensor"] = samples["samples"]
        output["latent_format_version_0"] = torch.tensor([])

        comfy.utils.save_torch_file(output, file, metadata=metadata)
        return {"ui": {"latents": results}}
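The filename and metadata bookkeeping inside save() can be sketched with stand-in values. In the node, full_output_folder, filename, counter, and subfolder come from folder_paths.get_save_image_path; the concrete values below are assumptions for illustration only:

```python
import json
import os

# Stand-in values for what folder_paths.get_save_image_path would return
# (assumptions for illustration).
full_output_folder = "output/latents"
filename = "ComfyUI"
counter = 1
subfolder = "latents"

# The prompt graph is serialized to a JSON string and stored under "prompt",
# mirroring the metadata handling in save().
prompt = {"1": {"class_type": "EmptyLatentImage"}}
metadata = {"prompt": json.dumps(prompt)}

# The counter is zero-padded to five digits, matching f"{counter:05}".
file = f"{filename}_{counter:05}_.latent"
path = os.path.join(full_output_folder, file)
print(file)  # ComfyUI_00001_.latent
```

The five-digit counter keeps successive saves from overwriting each other while keeping filenames sortable.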