Length one processing
Internally, the Comfy server represents data flowing from one node to the next as a Python list, normally of length 1, of the relevant datatype.
In normal operation, when a node returns an output, each element in the output tuple
is separately wrapped in a list (length 1); then when the
next node is called, the data is unwrapped and passed to the main function.
You generally don’t need to worry about this, since Comfy does the wrapping and unwrapping.
This isn’t about batches: a batch (of latents or images, for instance) is a single entry in the list (see tensor datatypes).
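The wrapping and unwrapping can be pictured with a toy sketch (the function names here are illustrative, not the server’s actual internals):

```python
# Toy illustration of Comfy's length-1 list handling. These helpers are
# hypothetical stand-ins for what the server does, not its real code.

def wrap_outputs(output_tuple):
    """Wrap each element of a node's output tuple in a length-1 list."""
    return tuple([value] for value in output_tuple)

def unwrap_inputs(wrapped):
    """Unwrap length-1 lists before calling the next node's main function."""
    return tuple(values[0] for values in wrapped)

node_output = ("a latent", "an image")   # what a node's function returns
wrapped = wrap_outputs(node_output)      # (["a latent"], ["an image"])
unwrapped = unwrap_inputs(wrapped)       # ("a latent", "an image")
```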
List processing
In some circumstances, multiple data instances are processed in a single workflow, in which case the internal data will be a list containing those instances. An example might be processing a series of images one at a time to avoid running out of VRAM, or handling images of different sizes. By default, Comfy processes the values in the list sequentially:

- if the inputs are lists of different lengths, the shorter ones are padded by repeating their last value
- the main method is called once for each value in the input lists
- the outputs are lists, each of which is the same length as the longest input

This sequential processing is handled by map_node_over_list in execution.py.
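The padding-and-mapping rule above can be sketched as follows (a simplified stand-in for map_node_over_list, not the real implementation):

```python
# Simplified sketch of the sequential-processing rule: pad shorter input
# lists by repeating their last value, then call the function once per
# position. Not the actual code of map_node_over_list in execution.py.

def map_over_lists(func, *input_lists):
    longest = max(len(lst) for lst in input_lists)
    # Shorter inputs are padded by repeating their last value.
    padded = [lst + [lst[-1]] * (longest - len(lst)) for lst in input_lists]
    # The main method is called once per position; the output list is
    # the same length as the longest input.
    return [func(*args) for args in zip(*padded)]

results = map_over_lists(lambda a, b: a + b, [1, 2, 3], [10])
# the single-element second input is repeated: [11, 12, 13]
```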
However, because Comfy wraps node outputs into a list of length one, if the tuple returned by a custom node contains a list, that list will be wrapped and treated as a single piece of data.
To tell Comfy that the list being returned should not be wrapped, but treated as a series of data items for sequential processing, the node should provide a class attribute OUTPUT_IS_LIST: a tuple[bool] of the same length as RETURN_TYPES, specifying which outputs should be treated this way.
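For example, a hypothetical node that splits an image batch into a list of single images for sequential downstream processing might look like this (the class and attribute layout follow Comfy conventions, but the node itself is illustrative):

```python
# Hypothetical custom node using OUTPUT_IS_LIST. "images" is assumed to
# be a batched [B, H, W, C] tensor, as for Comfy's IMAGE datatype.

class SplitImages:
    @classmethod
    def INPUT_TYPES(cls):
        return {"required": {"images": ("IMAGE",)}}

    RETURN_TYPES = ("IMAGE",)
    # One bool per entry in RETURN_TYPES: the returned list is a series
    # of data items for sequential processing, not one wrapped value.
    OUTPUT_IS_LIST = (True,)
    FUNCTION = "split"
    CATEGORY = "image/batch"

    def split(self, images):
        # Slice the batch into single-image batches; Comfy will pass
        # them to downstream nodes one at a time.
        return ([images[i:i + 1] for i in range(images.shape[0])],)
```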
A node can also override the default input behaviour and receive the whole list in a single call, by setting the class attribute INPUT_IS_LIST to True.
Here’s a (lightly annotated) example from the built-in nodes: ImageRebatch takes one or more batches of images (received as a list, because INPUT_IS_LIST = True) and rebatches them into batches of the requested size.
INPUT_IS_LIST is node level: all inputs get the same treatment, so the value of the batch_size widget is given by batch_size[0].
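A minimal sketch of such a node (simplified from the built-in ImageRebatch; the real node concatenates each group into one tensor with torch.cat, which is omitted here so the sketch stays dependency-free):

```python
# Sketch of an ImageRebatch-style node, simplified for illustration.
# With INPUT_IS_LIST = True, every input arrives as a list in a single
# call -- including widget values such as batch_size.

class ImageRebatchSketch:
    @classmethod
    def INPUT_TYPES(cls):
        return {"required": {
            "images": ("IMAGE",),
            "batch_size": ("INT", {"default": 1, "min": 1, "max": 4096}),
        }}

    RETURN_TYPES = ("IMAGE",)
    OUTPUT_IS_LIST = (True,)   # the output is a list of batches
    INPUT_IS_LIST = True       # receive whole input lists in one call
    FUNCTION = "rebatch"
    CATEGORY = "image/batch"

    def rebatch(self, images, batch_size):
        batch_size = batch_size[0]  # INPUT_IS_LIST applies to widgets too
        # Flatten all incoming batches into single images...
        singles = [img[i:i + 1]
                   for img in images for i in range(img.shape[0])]
        # ...then regroup into batches of the requested size. (The real
        # node joins each group with torch.cat; here we return the
        # groups themselves to keep the sketch self-contained.)
        return ([singles[i:i + batch_size]
                 for i in range(0, len(singles), batch_size)],)
```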