ECA-Net: Efficient channel attention for deep convolutional neural networks Q Wang, B Wu, P Zhu, P Li, W Zuo, Q Hu Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern …, 2020 | 7034 | 2020 |
What deep CNNs benefit from global covariance pooling: An optimization perspective Q Wang, L Zhang, B Wu, D Ren, P Li, W Zuo, Q Hu Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern …, 2020 | 29 | 2020 |
Multi-branch structure based local channel attention with uncertainty BG Wu, SL Zhang, H Shi, PF Zhu, QL Wang, QH Hu Acta Electronica Sinica 50 (2), 374-382, 2022 | 3 | 2022 |
Over-tokenized transformer: Vocabulary is generally worth scaling H Huang, D Zhu, B Wu, Y Zeng, Y Wang, Q Min, X Zhou arXiv preprint arXiv:2501.16975, 2025 | 1 | 2025 |
Multi-branch structure based local channel attention with uncertainty BG Wu, SL Zhang, H Shi, PF Zhu, QL Wang, QH Hu Acta Electronica Sinica 50 (2), 374-382, 2022 | 1 | 2022 |
Hyper-Connections D Zhu, H Huang, Z Huang, Y Zeng, Y Mao, B Wu, Q Min, X Zhou arXiv preprint arXiv:2409.19606, 2024 | | 2024 |
Supplementary Material for “What Deep CNNs Benefit from Global Covariance Pooling: An Optimization Perspective” Q Wang, L Zhang, B Wu, D Ren, P Li, W Zuo, Q Hu | | |