If you’ve ever dragged a spaghetti of nodes in ComfyUI and thought “this could be even smarter,” ControlNet and custom node extensions are how you level up. ComfyUI is a node‑based interface for diffusion models where you wire components together like LEGO—load models, encode prompts, sample latents, upscale, and save images. Nodes make the invisible visible, and ControlNet adds surgical guidance: pose, depth, edges, segmentation—structured signals that steer generation without killing creativity.
In this guide, we’ll get your environment ready, install ControlNet and key node packs, build a few practical workflows, and cover the quirks that trip people up. By the end, you’ll be confident wiring ControlNet into your graphs, stacking multiple conditions, and mixing in other extensions for production‑grade results.
What you’ll learn
- What ControlNet is and how it plugs into ComfyUI’s node graph
- How custom nodes and preprocessors expand ComfyUI beyond the core
- Clean install flow: Manager → Aux Preprocessors → ControlNet models → basic graph
- Practical workflows: pose, depth, edges/lineart, segmentation, multi‑ControlNet
- Best practices: strength, start/end influence, version matching (SD 1.5 vs SDXL)
- Troubleshooting: models not detected, path issues, preprocessors missing
Technical background: ControlNet and the ComfyUI extension model
ComfyUI in one paragraph
ComfyUI is a visual, node‑based editor for diffusion model pipelines. Each node performs a function—load a checkpoint, encode text with CLIP, sample latents, apply VAE—while edges connect inputs/outputs. Node graphs make complex pipelines understandable and reusable, and they invite experimentation because you can see and edit every step.
What is ControlNet?
ControlNet extends diffusion with extra conditioning signals. Instead of relying only on text prompts, ControlNet ingests structure extracted from an image—skeletons (OpenPose), depth, edges (Canny), lineart, segmentation, normal maps, and more—and nudges the denoising process to follow those constraints. The result is faithful pose/layout/geometry while leaving style and details to the model and prompts.
- Typical controls: OpenPose (skeleton), Depth, Canny/Edges, Lineart, SoftEdge, Segmentation, Normal maps
- Why it matters: precise composition, pose transfer, style replacement, consistent character/body planning
- Where to start: the ComfyUI Wiki’s “Apply ControlNet” documentation explains node parameters and signal flow.
References:
- ComfyUI Wiki — Apply ControlNet: https://comfyui-wiki.com/en/comfyui-nodes/conditioning/controlnet-apply
What are custom node extensions?
Custom nodes are community‑built modules that add preprocessors (to create control maps), model loaders, samplers, utilities, and quality‑of‑life tools. They plug into ComfyUI’s graph system and show up as new nodes. The “Custom Nodes Manager” (in ComfyUI Manager) makes installation point‑and‑click.
Examples:
- ControlNet Auxiliary Preprocessors (aka “ControlNet Aux Preprocessors”): a bundle of preprocessors like OpenPose, Canny, Lineart, etc.
- Utility packs for batching, metadata, prompt routing, image IO, or performance helpers
Reference:
- Medium (overview articles) often contrast base ComfyUI vs extended nodes and show how Aux Preprocessors slot into workflows.
Installation & setup: ControlNet and node extensions
We’ll go from zero to a working ControlNet pipeline using ComfyUI Manager and Aux Preprocessors, then add the ControlNet models and wire a minimal graph.
1) Install or update ComfyUI
- Use your existing ComfyUI installation (Portable or Desktop). If you’re new to ComfyUI, see our guide: ComfyUI Portable vs Desktop.
- Update to the latest release to avoid API mismatches with newer nodes.
2) Install ComfyUI Manager (if you don’t have it yet)
- Launch ComfyUI and open the Manager UI (or install it from its repository per instructions). The Manager adds a “Custom Nodes Manager” and update tools.
Reference (example walkthroughs):
- lavivienpost.net — step‑by‑step screenshots for using the Custom Nodes Manager to install ControlNet Aux Preprocessors
3) Install ControlNet Aux Preprocessors
- In ComfyUI Manager, click “Custom Nodes Manager”
- Search for “controlnet” or “aux” and locate “ControlNet Aux Preprocessors”
- Click Install
- Restart ComfyUI so the new nodes appear in the graph editor
References:
- ComfyUI tutorials and posts show this exact flow (Manager → Custom Nodes → search → install → restart)
4) Download ControlNet models
ControlNet needs the actual model weights for each control type.
- Common SD 1.5 model names include:
  - control_v11p_sd15_openpose.pth
  - control_v11f1p_sd15_depth.pth
  - control_v11p_sd15_canny.pth
  - control_v11p_sd15_lineart.pth
- SDXL and FLUX use different model families—use matching ControlNet weights when available.
- Place the .pth / .safetensors files in: ComfyUI/models/controlnet/
ComfyUI will scan this directory and populate dropdowns in “Load ControlNet Model” nodes.
References:
- ComfyUI Wiki — Install ControlNet Models: https://comfyui-wiki.com/en/install/install-models/install-controlnet
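If you prefer to script the downloads, here is a hedged sketch using the huggingface_hub package. It assumes the SD 1.5 v1.1 control weights are pulled from the lllyasviel/ControlNet-v1-1 repository on Hugging Face and that you run it from the folder containing your ComfyUI install; adjust the repo, filenames, and paths to your setup.

```python
from huggingface_hub import hf_hub_download

# Illustrative filenames; download only the control types you plan to use.
for name in [
    "control_v11p_sd15_openpose.pth",
    "control_v11f1p_sd15_depth.pth",
    "control_v11p_sd15_canny.pth",
]:
    hf_hub_download(
        repo_id="lllyasviel/ControlNet-v1-1",   # assumed source repo
        filename=name,
        local_dir="ComfyUI/models/controlnet",  # folder ComfyUI scans for controls
    )
```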
Tip: If your models live elsewhere, you can declare paths in extra_model_paths.yaml (advanced). This helps when you keep models on a separate drive.
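For reference, a minimal extra_model_paths.yaml sketch might look like the following. The top-level key name and folder layout are placeholders for your own setup; compare against the extra_model_paths.yaml.example file shipped with ComfyUI before relying on it.

```yaml
# Hypothetical layout: models kept on a separate drive.
# Indentation is significant; category paths are relative to base_path.
my_external_models:
    base_path: D:/ai/models/
    checkpoints: checkpoints/
    controlnet: controlnet/
    loras: loras/
```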
5) Build a minimal ControlNet workflow
We’ll create a basic graph you can iterate on.
- Load Checkpoint → pick your base model (e.g., SD 1.5, SDXL, or FLUX)
- CLIP Text Encode → positive prompt
- CLIP Text Encode → negative prompt
- Preprocessor (from Aux Preprocessors) → e.g., OpenPose (input: your reference image)
- Load ControlNet Model → choose the matching control (e.g., OpenPose)
- Apply ControlNet → wire preprocessor output + ControlNet model
- Sampler (e.g., DPM++ 2M Karras) → connect conditioning from text + ControlNet
- VAE Decode / Save Image
Reference:
- ComfyUI Wiki — Apply ControlNet: parameters for strength, start_percent, end_percent
- ComfyUI Wiki — Tutorial on installing and using ControlNet: https://comfyui-wiki.com/en/tutorial/advanced/how-to-install-and-use-controlnet-models-in-comfyui
Once this minimal graph works, you can branch it for multiple ControlNets, add LoRAs, or insert upscalers and post‑processors.
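If you drive ComfyUI programmatically, you can export this graph via "Save (API Format)" and inspect or edit it as JSON. As a rough illustration, the ControlNet-specific portion looks something like the Python sketch below; the node IDs, and the IDs of the text-encode and preprocessor nodes, are placeholders, and input names can vary between ComfyUI versions.

```python
# Sketch of the ControlNet-relevant nodes in an API-format workflow (as a Python dict).
# Nodes "6", "7" (CLIP text encodes) and "12" (preprocessor output) are assumed
# to exist elsewhere in the exported workflow.
controlnet_nodes = {
    "10": {  # load ControlNet weights matching your base model family
        "class_type": "ControlNetLoader",
        "inputs": {"control_net_name": "control_v11p_sd15_openpose.pth"},
    },
    "11": {  # apply the control map to both positive and negative conditioning
        "class_type": "ControlNetApplyAdvanced",
        "inputs": {
            "positive": ["6", 0],   # positive CLIP Text Encode
            "negative": ["7", 0],   # negative CLIP Text Encode
            "control_net": ["10", 0],
            "image": ["12", 0],     # control map from the preprocessor node
            "strength": 1.0,
            "start_percent": 0.0,
            "end_percent": 0.8,
        },
    },
}
# The KSampler then takes its positive/negative conditioning from node "11"
# (outputs 0 and 1) instead of directly from the text encoders.
```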
Troubleshooting install & setup
- ControlNet model not detected
  - Verify the file is in ComfyUI/models/controlnet/
  - Confirm the extension is .pth or .safetensors
  - If using custom paths, update extra_model_paths.yaml and restart
- Preprocessor nodes missing
  - Ensure "ControlNet Aux Preprocessors" is installed via Manager
  - Restart ComfyUI; some nodes load only on a fresh start
- Extra model paths not respected
  - Confirm YAML indentation; a single space error can break parsing
  - Use absolute paths; avoid special characters in folder names
- Wrong model family (SD 1.5 vs SDXL vs FLUX)
  - Use matching ControlNet weights; mixing families degrades results or errors out
- Graph errors after an update
  - Update all custom nodes in Manager
  - Clear caches, restart, and re-select models in loader nodes
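If you are not sure whether ComfyUI can even see your weights, a quick standalone check like this lists what is actually in the controlnet folder (the path is an assumption; point it at your install):

```python
from pathlib import Path

# Adjust to your actual install location.
controlnet_dir = Path("ComfyUI/models/controlnet")

found = sorted(
    p.name for p in controlnet_dir.glob("*")
    if p.suffix in {".pth", ".safetensors"}
)
if not found:
    print(f"No ControlNet weights found in {controlnet_dir.resolve()}")
else:
    print("Weights ComfyUI should detect:")
    for name in found:
        print(" -", name)
```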
Reference:
- Reddit — “Troubleshooting ComfyUI + Controlnet installation”: https://www.reddit.com/r/comfyui/comments/1734xc1/troubleshooting_comfyui_controlnet_installation/
Creative use‑cases & workflow examples
ControlNet is a toolbox—each control type enables a different kind of “structure lock.” Here are practical patterns worth mastering.
1) Pose transfer with OpenPose
- Goal: keep the body pose from a reference image but change identity and style
- Graph: Image → OpenPose Preprocessor → Load ControlNet (OpenPose) → Apply ControlNet
- Prompts: describe style/outfit/lighting; avoid over‑specifying pose (ControlNet handles it)
- Tips: start with strength around 0.8–1.2 and adjust; use start_percent ~0.0 and end_percent ~0.8 for strong early guidance that relaxes later
2) Depth‑guided scene consistency
- Goal: preserve scene geometry (walls, furniture, perspective) while restyling
- Graph: Image → Depth Preprocessor → Load ControlNet (Depth) → Apply ControlNet
- Tips: great for interior design mockups, environment restyles, or photoreal→stylized conversions
- Pair with a mild edge control (Canny) for sharper structural fidelity
3) Edge/Lineart for clean stylization
- Goal: maintain contours while radically changing textures and colors
- Graph: Image → Canny/Lineart Preprocessor → Load ControlNet (Canny/Lineart) → Apply ControlNet
- Tips: perfect for anime/manga transitions, posterization, graphic styles
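To build intuition for what the Canny preprocessor hands to ControlNet, here is a standalone sketch using OpenCV outside of ComfyUI. It requires opencv-python; the thresholds and filenames are illustrative, and in a real graph the Aux Preprocessor node does this step for you.

```python
import cv2

# Extract an edge map from the reference image, roughly what the
# Canny preprocessor node produces as a control image.
img = cv2.imread("reference.png", cv2.IMREAD_GRAYSCALE)
edges = cv2.Canny(img, threshold1=100, threshold2=200)
cv2.imwrite("reference_canny.png", edges)
```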
4) Segmentation for layout protection
- Goal: keep object regions where they belong while changing style within regions
- Graph: Image → Segmentation Preprocessor → Load ControlNet (Segmentation) → Apply ControlNet
- Tips: combine with prompts that mention regions (“red jacket”, “blue wall”) and guide CFG moderately
5) Multi‑ControlNet stacking
- Goal: combine complementary constraints (pose + depth + edges)
- Graph: Parallel preprocessors → separate ControlNet models → multiple Apply ControlNet nodes feeding the sampler
- Tips: order doesn't usually matter; the sampler aggregates conditionings. Start with moderate strengths (0.5–0.9 each) and adjust from there. Some community builds mention nodes like "SetUnionControlNetType" to manage multi-control blending; these appear in release notes and Reddit threads when new primitives land.
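To make the wiring concrete, here is a rough API-format fragment showing a second ControlNetApplyAdvanced node chaining off the first one's conditioning outputs. As with the earlier sketch, node IDs and the depth preprocessor's ID are placeholders.

```python
# Chain a second control (e.g., Depth) after the OpenPose apply node "11".
second_control = {
    "20": {
        "class_type": "ControlNetLoader",
        "inputs": {"control_net_name": "control_v11f1p_sd15_depth.pth"},
    },
    "21": {
        "class_type": "ControlNetApplyAdvanced",
        "inputs": {
            "positive": ["11", 0],   # conditioning already shaped by OpenPose
            "negative": ["11", 1],
            "control_net": ["20", 0],
            "image": ["22", 0],      # output of the depth preprocessor node
            "strength": 0.6,         # moderate strength per control when stacking
            "start_percent": 0.0,
            "end_percent": 0.8,
        },
    },
}
# The sampler then reads its conditioning from node "21" instead of "11".
```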
Reference:
- Reddit — “ComfyUI just updated with a new node implementing proper support…”: https://www.reddit.com/r/StableDiffusion/comments/1e55qol/comfyui_just_updated_with_a_new_node_implementing/
6) Batch flows, LoRAs, and animation ideas
- Batch: use utility nodes to loop over a folder of inputs; apply a consistent control and prompt template (a batching sketch via the API follows this list)
- LoRAs: load a style/character LoRA and keep its weight moderate (e.g., 1.0 or lower) so ControlNet still drives structure
- Animation/video: some preprocessors output per-frame controls; for stable motion, keep the control map consistent frame-to-frame (advanced)
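One way to batch outside the graph is ComfyUI's HTTP API: export your workflow with "Save (API Format)" and queue one prompt per input image. The sketch below assumes a local server on the default port (8188), a LoadImage node with ID "5" in your exported workflow, and input images already placed in ComfyUI's input folder; all of those are placeholders to adapt.

```python
import copy
import json
import urllib.request
from pathlib import Path

# Workflow exported from ComfyUI via "Save (API Format)".
base_workflow = json.loads(Path("controlnet_workflow_api.json").read_text())

# Assumption: node "5" is the LoadImage node feeding the preprocessor.
for image_name in ["pose_01.png", "pose_02.png", "pose_03.png"]:
    wf = copy.deepcopy(base_workflow)
    wf["5"]["inputs"]["image"] = image_name

    payload = json.dumps({"prompt": wf}).encode("utf-8")
    req = urllib.request.Request(
        "http://127.0.0.1:8188/prompt",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(image_name, "->", resp.read().decode("utf-8"))
```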
Best practices, pitfalls, and optimization
Version match matters
- Use ControlNet weights that match your base model family: SD 1.5 weights for SD 1.5, SDXL weights for SDXL, FLUX‑compatible controls for FLUX. Mismatches reduce quality or fail to load.
Tune strength and influence window
- strength controls how strongly ControlNet guides the diffusion. Many tutorials suggest ~0.5–1.5 as a practical band. Higher can over-constrain, lower can get ignored.
- start_percent and end_percent define the time window (in denoising percent) where ControlNet is active. Early-heavy control stabilizes composition; tapering toward the end can restore texture freedom.
Reference:
- ComfyUI Wiki — Apply ControlNet: parameter guidance and examples
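In practice, tuning usually means re-queuing the same graph with a few different values. A small sketch, reusing the assumptions from the earlier API-format examples (node "11" as the ControlNetApplyAdvanced node), could look like this:

```python
import copy
import json
from pathlib import Path

# API-format workflow exported from ComfyUI; node "11" is assumed to be
# the ControlNetApplyAdvanced node.
base_workflow = json.loads(Path("controlnet_workflow_api.json").read_text())

variants = []
for strength in (0.6, 0.9, 1.2):
    wf = copy.deepcopy(base_workflow)
    wf["11"]["inputs"].update(
        {"strength": strength, "start_percent": 0.0, "end_percent": 0.8}
    )
    variants.append(wf)  # queue each variant via the /prompt endpoint as shown above
```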
Hardware and VRAM
- Each ControlNet adds compute and memory. Depth and segmentation preprocessors can be heavy; batch large jobs only when you’ve tested a single pass.
- If you hit VRAM limits, try smaller resolution, fewer controls, lower batch size, or mixed precision.
Frequent pitfalls
- Models not showing up: wrong folder or filename, need restart
- Preprocessors not installed: verify via Manager; reinstall if node errors appear
- Path issues: set extra_model_paths.yaml with correct, absolute paths; recheck YAML indentation
- Compatibility: keep Manager and custom nodes updated after ComfyUI core updates; some APIs change
Where to go next
- Build a reusable “ControlNet starter” graph with slots for OpenPose/Depth/Edges. Save it as your template.
- Try multi‑ControlNet on a small resolution first; then scale up to 1024/4K once you’re happy.
- Mix with LoRAs for style and character control—ControlNet nails structure, LoRAs bring identity and aesthetics.
- Share your workflows and results on r/comfyui; the best tricks spread fast in the community.
References and further reading
- ComfyUI Wiki — Apply ControlNet: https://comfyui-wiki.com/en/comfyui-nodes/conditioning/controlnet-apply
- ComfyUI Wiki — Install ControlNet Models: https://comfyui-wiki.com/en/install/install-models/install-controlnet
- ComfyUI Wiki — How to Install and Use ControlNet: https://comfyui-wiki.com/en/tutorial/advanced/how-to-install-and-use-controlnet-models-in-comfyui
- Reddit — Troubleshooting ComfyUI + ControlNet installation: https://www.reddit.com/r/comfyui/comments/1734xc1/troubleshooting_comfyui_controlnet_installation/
- Reddit — New node implementing proper support discussion: https://www.reddit.com/r/StableDiffusion/comments/1e55qol/comfyui_just_updated_with_a_new_node_implementing/
- Medium — Overviews contrasting base ComfyUI vs ControlNet Aux Preprocessors (various articles)
- lavivienpost.net — Installing ControlNet Aux Preprocessors via Manager (tutorial)
Conclusion
ControlNet is the bridge between free‑form prompting and deliberate composition. In ComfyUI, it’s just a few nodes: a preprocessor to extract structure, a ControlNet model, and an Apply node to wire it all into your sampler. Once you’ve got the install path down—Manager → Aux Preprocessors → control weights—you can iterate on strength, timing, and stacks of controls to dial in exactly what you want.
From here, try a three‑control stack (OpenPose + Depth + Canny), then layer a style LoRA on top. Save your best graphs as templates, and keep your nodes updated. With a little practice, you’ll wonder how you built scenes without ControlNet.