Don't decay the learning rate, increase the batch size SL Smith, PJ Kindermans, C Ying, QV Le arXiv preprint arXiv:1711.00489, 2017 | 1319 | 2017 |
NAS-Bench-101: Towards reproducible neural architecture search C Ying, A Klein, E Christiansen, E Real, K Murphy, F Hutter International Conference on Machine Learning, 7105-7114, 2019 | 869 | 2019 |
Image classification at supercomputer scale C Ying, S Kumar, D Chen, T Wang, Y Cheng arXiv preprint arXiv:1811.06992, 2018 | 163 | 2018 |
Large-batch training for LSTM and beyond Y You, J Hseu, C Ying, J Demmel, K Keutzer, CJ Hsieh Proceedings of the International Conference for High Performance Computing …, 2019 | 107 | 2019 |
A Bayesian perspective on generalization and stochastic gradient descent SL Smith, PJ Kindermans, C Ying, QV Le, 2018 | 31 | 2018 |
Depth-adaptive computational policies for efficient visual tracking C Ying, K Fragkiadaki Energy Minimization Methods in Computer Vision and Pattern Recognition: 11th …, 2018 | 11 | 2018 |
Enumerating unique computational graphs via an iterative graph invariant C Ying arXiv preprint arXiv:1902.06192, 2019 | 8 | 2019 |