
Scaled gated networks

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

Gating transformations have shown great potential in recent deep convolutional neural network design, enriching feature representations and suppressing noisy signals by modeling inter-channel dependencies with learnable parameters. However, scaling approaches that reduce the redundancy of hand-crafted attention mechanisms have rarely been investigated. This paper proposes a novel scaled gated convolution that enables attention-enhanced CNNs to resolve the tension between performance and redundancy. The scaled gated convolution is a simple and effective alternative to both vanilla convolution and attention-enhanced convolutions, and can be applied to modern CNNs in a plug-and-play manner. Extensive experiments demonstrate that stacking scaled gated convolutions in baseline networks significantly improves performance on a broad range of visual recognition tasks, including image recognition, object detection, instance segmentation, keypoint detection, and panoptic segmentation, while achieving a better trade-off between performance and attentive redundancy.
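The gating transformation described above (modeling inter-channel dependencies with learnable parameters to rescale feature channels) can be illustrated with a minimal NumPy sketch of a generic squeeze-and-excitation-style channel gate. This is not the paper's scaled gated convolution; the weight shapes, reduction ratio `r`, and function names here are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def channel_gate(x, w1, w2):
    """Generic channel gating (illustrative, not the paper's exact method).

    x  : feature map of shape (C, H, W)
    w1 : (C//r, C) reduction weights  -- hypothetical learnable parameters
    w2 : (C, C//r) expansion weights  -- hypothetical learnable parameters
    Returns x rescaled per channel by learned gates in (0, 1).
    """
    s = x.mean(axis=(1, 2))                     # squeeze: global average pool -> (C,)
    g = sigmoid(w2 @ np.maximum(w1 @ s, 0.0))   # excitation: FC -> ReLU -> FC -> sigmoid
    return x * g[:, None, None]                 # gate: per-channel rescaling

rng = np.random.default_rng(0)
C, H, W, r = 8, 4, 4, 2
x = rng.standard_normal((C, H, W))
w1 = rng.standard_normal((C // r, C)) * 0.1
w2 = rng.standard_normal((C, C // r)) * 0.1
y = channel_gate(x, w1, w2)
print(y.shape)
```

Because each gate lies in (0, 1), the transform can only attenuate channels, which is how such mechanisms suppress noisy signals while keeping informative ones.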

Original language: English
Pages (from-to): 1583-1606
Number of pages: 24
Journal: World Wide Web
Volume: 25
Issue number: 4
DOI
Publication status: Published - Jul 2022
