Using ControlNet requires preprocessing of the input image. Since ComfyUI's default nodes do not include ControlNet preprocessors or ControlNet models, please first install the ControlNet preprocessors and download the corresponding ControlNet models.
Inputs
| Parameter | Data Type | Function |
|---|---|---|
| positive | CONDITIONING | Positive conditioning data, typically from a CLIP Text Encode node or another conditioning input |
| negative | CONDITIONING | Negative conditioning data, typically from a CLIP Text Encode node or another conditioning input |
| control_net | CONTROL_NET | The ControlNet model to apply, typically from a ControlNet Loader node |
| image | IMAGE | The image used for ControlNet guidance; it should first be processed by a matching preprocessor |
| vae | VAE | VAE model input |
| strength | FLOAT | Controls how strongly the ControlNet influences generation; 0.0 applies no effect |
| start_percent | FLOAT | Value 0.000~1.000; determines when ControlNet guidance starts, as a fraction of the diffusion process. E.g., 0.2 means guidance begins influencing image generation at 20% of the sampling steps |
| end_percent | FLOAT | Value 0.000~1.000; determines when ControlNet guidance stops, as a fraction of the diffusion process. E.g., 0.8 means guidance stops influencing image generation at 80% of the sampling steps |
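The `start_percent`/`end_percent` window can be illustrated by mapping those fractions onto concrete sampler steps. The sketch below is only illustrative: the function name and the exact step-to-position mapping are assumptions, not ComfyUI's actual scheduler logic.

```python
def controlnet_active_steps(total_steps, start_percent, end_percent):
    """Return the (0-indexed) sampler steps on which the ControlNet is active.

    Assumption: step i of an N-step schedule sits at position i / (N - 1),
    and the ControlNet applies while start_percent <= position <= end_percent.
    """
    last = max(total_steps - 1, 1)
    return [i for i in range(total_steps)
            if start_percent <= i / last <= end_percent]

# With 10 steps and a 0.2~0.8 window, guidance covers only the middle steps:
print(controlnet_active_steps(10, 0.2, 0.8))  # -> [2, 3, 4, 5, 6, 7]
```

Narrowing the window this way is a common trick: early steps fix composition, so applying the ControlNet only in the middle of sampling constrains structure while leaving fine details to the model.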
Outputs
| Parameter | Data Type | Function |
|---|---|---|
| positive | CONDITIONING | Positive conditioning data processed by the ControlNet; can be passed to another Apply ControlNet node or to a KSampler node |
| negative | CONDITIONING | Negative conditioning data processed by the ControlNet; can be passed to another Apply ControlNet node or to a KSampler node |
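The chaining described above (outputs feeding into another Apply ControlNet node) can be sketched as a data-flow model. This is a hypothetical illustration, not ComfyUI's real internals: conditioning is modeled here as `(embedding, metadata)` pairs, and the function name and metadata keys are assumptions.

```python
# Hypothetical sketch of how chained Apply ControlNet nodes could accumulate
# guidance on the conditioning data. All names and the data layout are
# assumptions for illustration only.

def apply_controlnet(positive, negative, control_net, image, strength,
                     start_percent=0.0, end_percent=1.0):
    """Return new positive/negative conditioning with a ControlNet attached."""
    def attach(conditioning):
        out = []
        for embedding, meta in conditioning:
            new_meta = dict(meta)
            # Append rather than overwrite, so several ControlNets can stack.
            new_meta["controls"] = meta.get("controls", []) + [{
                "net": control_net, "hint": image, "strength": strength,
                "window": (start_percent, end_percent),
            }]
            out.append((embedding, new_meta))
        return out
    return attach(positive), attach(negative)

# Chain two hypothetical ControlNets (e.g. Canny edges, then a depth map):
positive = [("pos_embedding", {})]
negative = [("neg_embedding", {})]
positive, negative = apply_controlnet(positive, negative,
                                      "canny_net", "canny_map", 1.0)
positive, negative = apply_controlnet(positive, negative,
                                      "depth_net", "depth_map", 0.6, 0.0, 0.8)
```

After both applications, each conditioning entry carries both ControlNets, which is why the positive and negative outputs can be wired either into another Apply ControlNet node or directly into a KSampler.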