Post training 4-bit quantization of convolutional networks for rapid-deployment. R. Banner, Y. Nahshan, D. Soudry. Advances in Neural Information Processing Systems 32, 2019. Cited by 801.
Accurate post training quantization with small calibration sets. I. Hubara, Y. Nahshan, Y. Hanani, R. Banner, D. Soudry. International Conference on Machine Learning, pp. 4466-4475, 2021. Cited by 323.
Loss aware post-training quantization. Y. Nahshan, B. Chmiel, C. Baskin, E. Zheltonozhskii, R. Banner, et al. Machine Learning 110 (11), pp. 3245-3262, 2021. Cited by 185.
Robust quantization: One model to rule them all. B. Chmiel, R. Banner, G. Shomron, Y. Nahshan, A. Bronstein, U. Weiser. Advances in Neural Information Processing Systems 33, pp. 5308-5317, 2020. Cited by 92.
Improving post training neural quantization: Layer-wise calibration and integer programming. I. Hubara, Y. Nahshan, Y. Hanani, R. Banner, D. Soudry. arXiv preprint arXiv:2006.10518, 2020. Cited by 12.
Linear Log-Normal Attention with Unbiased Concentration. Y. Nahshan, J. Kampeas, E. Haleva. International Conference on Learning Representations (ICLR), 2024. Cited by 5.
Rotation Invariant Quantization for Model Compression. J. Kampeas, Y. Nahshan, H. Kremer, G. Lederman, S. Zaloshinski, Z. Li, et al. arXiv preprint arXiv:2303.03106, 2023.
ACIQ: Analytical Clipping for Integer Quantization. R. Banner, Y. Nahshan, E. Hoffer, D. Soudry. arXiv preprint arXiv:1810.05723, 2018.
Supplementary Material: Accurate Post Training Quantization With Small Calibration Sets. I. Hubara, Y. Nahshan, Y. Hanani, R. Banner, D. Soudry.