
Normalization technique reuses fixed denoising models across noise levels


A normalization technique enables denoising models trained at a single noise level to handle unseen levels during iterative sampling: the input is normalized before denoising and the output denormalized afterward. A SwinIR network trained via Noise2Noise at a fixed σ=10 was inserted unchanged into the constrained iterative sampler of Kadkhodaie and Simoncelli. On Set12 images with 10% random inpainting, the method raised PSNR from 6.08 dB to 23.87 dB while using identical weights and training pairs.
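The normalize / denoise / denormalize idea can be sketched in a few lines. This is a hypothetical illustration, not the paper's implementation: it assumes the normalization is a simple rescaling that maps the input's current noise level `sigma` onto the training level `sigma_train`, and `denoiser` stands in for any fixed-level network such as the SwinIR model described above.

```python
import numpy as np

def denoise_any_sigma(denoiser, y, sigma, sigma_train=10.0):
    """Apply a fixed-level denoiser to an input with a different noise level.

    Hedged sketch of the normalize / denoise / denormalize wrapper.
    The exact normalization used in the paper is an assumption here;
    this version simply rescales so the noise std matches sigma_train.
    """
    scale = sigma_train / sigma   # normalize: bring noise std down/up to sigma_train
    x_hat = denoiser(y * scale)   # denoise at the level the network was trained on
    return x_hat / scale          # denormalize: undo the rescaling

# Sanity check with an identity "denoiser": the wrapper must be a no-op.
y = np.ones((4, 4))
out = denoise_any_sigma(lambda x: x, y, sigma=25.0)
```

Inside an iterative sampler, this wrapper would presumably be applied at each step using that step's current effective noise level, so the same fixed weights serve every level the sampler visits.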

Original post

1/ What happens when a denoiser trained at one noise level is reused inside an iterative sampler? We trained SwinIR with Noise2Noise at σ=10, then dropped it unchanged into the constrained sampler of @ZKadkhodaie & @EeroSimoncelli (NeurIPS ’21) for 10% random inpainting. Baseline SwinIR: 6.08 dB on Set12. SwinIR-WNE: 23.87 dB. Same backbone. Same N2N pairs. Same sampler. With François Fleuret @francoisfleuret. ICML 2026. 🧵

7:01 AM · May 15, 2026

TL;DR: To denoise at noise levels unseen during training, simply normalize / denoise / denormalize. The resulting equivariance shines for sampling!

@YoussefMMSaied (ICML 2026)

@sciences_UNIGE @UNIGEnews
