Abstract
Gating transformation demonstrates great potential in recent deep convolutional neural network design, enriching feature representations and alleviating noisy signals by modeling inter-channel dependencies with learnable parameters. However, the use of scaling approaches to reduce the redundancy of hand-crafted attention mechanisms has rarely been investigated. This paper proposes a novel scaled gated convolution that enables attention-enhanced CNNs to overcome the paradox between performance and redundancy. The scaled gated convolution is a simple and effective alternative to both vanilla convolution and attention-enhanced convolutions, and it can be applied to modern CNNs in a plug-and-play manner. Extensive experiments demonstrate that stacking scaled gated convolutions in baselines significantly improves performance across a broad range of visual recognition tasks, including image recognition, object detection, instance segmentation, keypoint detection, and panoptic segmentation, while achieving a better trade-off between performance and attentive redundancy.
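To make the gating idea in the abstract concrete, the following is a minimal, hypothetical PyTorch sketch of a channel-gating convolution block in the spirit of squeeze-and-excitation attention. It is not the authors' released implementation: the module name `GatedConv2d`, the bottleneck `reduction` ratio, and the learnable per-channel `scale` parameter (used here to blend between the vanilla and gated outputs, loosely illustrating the scaling idea) are all assumptions.

```python
# Hypothetical sketch, NOT the paper's official scaled gated convolution.
# A convolution whose output channels are re-weighted by a learnable gate,
# with an assumed per-channel scale that can fall back to vanilla convolution.
import torch
import torch.nn as nn


class GatedConv2d(nn.Module):
    def __init__(self, in_channels, out_channels, kernel_size=3, reduction=16):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, out_channels, kernel_size,
                              padding=kernel_size // 2, bias=False)
        self.bn = nn.BatchNorm2d(out_channels)
        # Gating branch: global average pooling followed by a small bottleneck
        # that predicts one sigmoid gate per channel (SE-style).
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(out_channels, out_channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_channels // reduction, out_channels, 1),
            nn.Sigmoid(),
        )
        # Hypothetical learnable per-channel scale: 1.0 keeps full gating,
        # values near 0 recover the plain convolution output, so training can
        # suppress redundant attention channels.
        self.scale = nn.Parameter(torch.ones(1, out_channels, 1, 1))

    def forward(self, x):
        y = self.bn(self.conv(x))
        g = self.gate(y)  # per-channel gates in (0, 1)
        # Blend gated and ungated features according to the learned scale.
        return y * (self.scale * g + (1.0 - self.scale))


if __name__ == "__main__":
    # Drop-in replacement for a 3x3 convolution stage in a CNN.
    block = GatedConv2d(64, 128)
    out = block(torch.randn(2, 64, 56, 56))
    print(out.shape)  # torch.Size([2, 128, 56, 56])
```

As a design note, initializing the scale at 1.0 starts the block as a fully gated convolution, while letting gradients shrink the scale toward 0 for channels where the attention signal is redundant, which mirrors the performance-versus-redundancy trade-off discussed in the abstract.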
| Original language | English |
|---|---|
| Pages (from-to) | 1583-1606 |
| Number of pages | 24 |
| Journal | World Wide Web |
| Volume | 25 |
| Issue number | 4 |
| DOI | |
| Publication status | Published - Jul 2022 |