TY - JOUR
T1 - TRG-Net
T2 - An Interpretable and Controllable Rain Generator
AU - Pang, Zhiqiang
AU - Wang, Hong
AU - Xie, Qi
AU - Meng, Deyu
AU - Xu, Zongben
N1 - Publisher Copyright:
© 2012 IEEE.
PY - 2025
Y1 - 2025
N2 - Exploring and modeling the rain generation mechanism is critical for augmenting paired data to ease the training of rainy image processing models. Most of the conventional methods handle this task in an artificial physical rendering manner, through elaborately designing fundamental elements constituting rains. These kinds of methods, however, are over-dependent on human subjectivity, which limits their adaptability to real rains. In contrast, recent deep learning (DL) methods have achieved great success by training a neural network-based generator from pre-collected rainy image data. However, current methods usually design the generator in a “closed box” manner, increasing the learning difficulty and data requirements. To address these issues, this study proposes a novel DL-based rain generator, which fully takes the physical generation mechanism underlying rains into consideration and well encodes the learning of the fundamental rain factors (i.e., shape, orientation, length, width, and sparsity) explicitly into the deep network. Its significance lies in that the generator not only elaborately designs essential elements of the rain to simulate expected rains, like conventional artificial strategies, but also finely adapts to complicated and diverse practical rainy images, like DL methods. By rationally adopting the filter parameterization technique, the proposed rain generator is finely controllable with respect to rain factors and able to learn the distribution of these factors purely from data without the need for rain factor labels. Our unpaired generation experiments demonstrate that the rain generated by the proposed rain generator is not only of higher quality but also more effective for deraining and downstream tasks compared to current state-of-the-art rain generation methods. Besides, the paired data augmentation experiments, including both in-distribution and out-of-distribution (OOD), further validate the diversity of samples generated by our model for in-distribution deraining and OOD generalization tasks.
KW - Data augmentation
KW - interpretable network
KW - rain generation
KW - unpaired data generation
UR - https://www.scopus.com/pages/publications/105006841773
U2 - 10.1109/TNNLS.2025.3565726
DO - 10.1109/TNNLS.2025.3565726
M3 - Article
C2 - 40424114
AN - SCOPUS:105006841773
SN - 2162-237X
VL - 36
SP - 16745
EP - 16759
JO - IEEE Transactions on Neural Networks and Learning Systems
JF - IEEE Transactions on Neural Networks and Learning Systems
IS - 9
ER -