Banggu Wu
ByteDance
Verified email at bytedance.com
Title · Cited by · Year
ECA-Net: Efficient channel attention for deep convolutional neural networks
Q Wang, B Wu, P Zhu, P Li, W Zuo, Q Hu
Proceedings of the IEEE/CVF conference on computer vision and pattern …, 2020
7034 · 2020
What deep CNNs benefit from global covariance pooling: An optimization perspective
Q Wang, L Zhang, B Wu, D Ren, P Li, W Zuo, Q Hu
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern …, 2020
29 · 2020
Multi-branch structure based local channel attention with uncertainty
BG Wu, SL Zhang, H Shi, PF Zhu, QL Wang, QH Hu
Acta Electronica Sinica 50 (2), 374-382, 2022
3 · 2022
Over-tokenized transformer: Vocabulary is generally worth scaling
H Huang, D Zhu, B Wu, Y Zeng, Y Wang, Q Min, X Zhou
arXiv preprint arXiv:2501.16975, 2025
1 · 2025
Multi-branch structure based local channel attention with uncertainty (in Chinese)
BG Wu, SL Zhang, H Shi, PF Zhu, QL Wang, QH Hu
Acta Electronica Sinica 50 (2), 374-382, 2022
1 · 2022
Hyper-Connections
D Zhu, H Huang, Z Huang, Y Zeng, Y Mao, B Wu, Q Min, X Zhou
arXiv preprint arXiv:2409.19606, 2024
2024
Supplementary Material for “What Deep CNNs Benefit from Global Covariance Pooling: An Optimization Perspective”
Q Wang, L Zhang, B Wu, D Ren, P Li, W Zuo, Q Hu
Articles 1–7