Deep convolutional neural networks for image classification: A comprehensive review

W Rawat, Z Wang - Neural Computation, 2017 - ieeexplore.ieee.org
Convolutional neural networks (CNNs) have been applied to visual tasks since the late
1980s. However, despite a few scattered applications, they were dormant until the mid …

Transfer learning for Bayesian optimization: A survey

T Bai, Y Li, Y Shen, X Zhang, W Zhang… - arXiv preprint arXiv …, 2023 - arxiv.org
A wide spectrum of design and decision problems, including parameter tuning, A/B testing
and drug design, intrinsically are instances of black-box optimization. Bayesian optimization …

A neural space-time representation for text-to-image personalization

Y Alaluf, E Richardson, G Metzer… - ACM Transactions on …, 2023 - dl.acm.org
A key aspect of text-to-image personalization methods is the manner in which the target
concept is represented within the generative process. This choice greatly affects the visual …

FjORD: Fair and accurate federated learning under heterogeneous targets with ordered dropout

S Horvath, S Laskaridis, M Almeida… - Advances in …, 2021 - proceedings.neurips.cc
Federated Learning (FL) has been gaining significant traction across different ML tasks,
ranging from vision to keyboard predictions. In large-scale deployments, client heterogeneity …
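The ordered dropout named in the title can be illustrated with a toy sketch (assumptions: a single dense layer represented as a list of per-neuron weight rows, kept in a fixed order; names and sizes are illustrative, not from the paper). A client with capacity fraction p keeps only the first neurons, so smaller submodels are nested prefixes of the full model:

```python
# Sketch of FjORD-style ordered dropout on one toy layer.
import math

def ordered_dropout(weights, p):
    """Return the prefix submodel for capacity fraction p in (0, 1]."""
    n_keep = max(1, math.ceil(p * len(weights)))
    return weights[:n_keep]

# A toy layer with 4 output neurons (each row is one neuron's weights).
layer = [[1.0, 0.0], [0.5, 0.5], [0.0, 1.0], [0.2, 0.8]]
half = ordered_dropout(layer, 0.5)  # keeps the first 2 neurons only
```

Because every submodel is a prefix of the same parameter vector, heterogeneous clients can train different-capacity slices of one shared model and still aggregate them directly.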

Matryoshka representation learning

A Kusupati, G Bhatt, A Rege… - Advances in …, 2022 - proceedings.neurips.cc
Learned representations are a central component in modern ML systems, serving a
multitude of downstream tasks. When training such representations, it is often the case that …
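The core idea behind the title can be sketched in a few lines (assumptions: a single trained embedding vector and a hypothetical set of nested dimensionalities; this is a toy illustration, not the paper's training procedure). Each coarse representation is simply a prefix of the full vector, so one learned embedding serves many accuracy/cost trade-offs:

```python
# Toy sketch of nested "Matryoshka" prefix embeddings.
import math

def matryoshka_views(embedding, dims=(2, 4, 8)):
    """Return L2-normalized prefix embeddings at each nested dimensionality."""
    views = {}
    for d in dims:
        prefix = embedding[:d]
        norm = math.sqrt(sum(x * x for x in prefix)) or 1.0
        views[d] = [x / norm for x in prefix]
    return views

full = [0.3, -0.1, 0.8, 0.4, 0.0, 0.2, -0.5, 0.1]
views = matryoshka_views(full)  # {2: [...], 4: [...], 8: [...]}
```

At inference time a deployment can pick the smallest prefix that meets its accuracy target, without retraining.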

Sparse low-rank adaptation of pre-trained language models

N Ding, X Lv, Q Wang, Y Chen, B Zhou, Z Liu… - arxiv preprint arxiv …, 2023 - arxiv.org
Fine-tuning pre-trained large language models in a parameter-efficient manner is widely
studied for its effectiveness and efficiency. The popular method of low-rank adaptation …
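The low-rank adaptation the abstract refers to can be sketched as follows (pure-Python toy; shapes and names are illustrative, and the sparsity mechanism this paper adds is not shown). The frozen weight W is adapted by the product B @ A of two small matrices of rank r, so only r * (d_in + d_out) parameters are trained:

```python
# Minimal sketch of the LoRA update W_eff = W + B @ A.

def matmul(A, B):
    """Plain list-of-lists matrix product."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def lora_effective_weight(W, B, A):
    """Return W + B @ A, the adapted weight used in the forward pass."""
    delta = matmul(B, A)
    return [[w + d for w, d in zip(wr, dr)] for wr, dr in zip(W, delta)]

W = [[1.0, 0.0], [0.0, 1.0]]            # frozen 2x2 weight
B = [[1.0], [0.0]]                      # 2x1 trainable factor, rank r = 1
A = [[0.0, 2.0]]                        # 1x2 trainable factor
W_eff = lora_effective_weight(W, B, A)  # [[1.0, 2.0], [0.0, 1.0]]
```

Here the full 2x2 weight stays frozen while only the 4 entries of B and A would be updated during fine-tuning.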

Transformer-based transform coding

Y Zhu, Y Yang, T Cohen - International conference on learning …, 2022 - openreview.net
Neural data compression based on nonlinear transform coding has made great progress
over the last few years, mainly due to improvements in prior models, quantization methods …

Ordered neurons: Integrating tree structures into recurrent neural networks

Y Shen, S Tan, A Sordoni, A Courville - arXiv preprint arXiv:1810.09536, 2018 - arxiv.org
Natural language is hierarchically structured: smaller units (e.g., phrases) are nested within
larger units (e.g., clauses). When a larger constituent ends, all of the smaller constituents that …

Enlarging smaller images before inputting into convolutional neural network: zero-padding vs. interpolation

M Hashemi - Journal of Big Data, 2019 - Springer
The input to a machine learning model is a one-dimensional feature vector. However, in
recent learning models, such as convolutional and recurrent neural networks, two- and three …
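The two enlargement strategies the title contrasts can be sketched on a toy grayscale image (pure-Python lists of lists; sizes and function names are illustrative, and nearest-neighbor stands in for the interpolation methods the paper evaluates):

```python
# Toy contrast of zero-padding vs. interpolation for enlarging a small image.

def zero_pad(img, out_h, out_w):
    """Center the image in a zero-filled canvas of the target size."""
    h, w = len(img), len(img[0])
    top, left = (out_h - h) // 2, (out_w - w) // 2
    out = [[0] * out_w for _ in range(out_h)]
    for i in range(h):
        for j in range(w):
            out[top + i][left + j] = img[i][j]
    return out

def nearest_interpolate(img, out_h, out_w):
    """Enlarge by nearest-neighbor resampling (the simplest interpolation)."""
    h, w = len(img), len(img[0])
    return [[img[i * h // out_h][j * w // out_w] for j in range(out_w)]
            for i in range(out_h)]

img = [[1, 2], [3, 4]]
padded = zero_pad(img, 4, 4)              # original pixels surrounded by zeros
resized = nearest_interpolate(img, 4, 4)  # each pixel repeated in a 2x2 block
```

Padding preserves the original pixel values but surrounds them with content-free zeros, while interpolation spreads the original content across the full target resolution; which serves a CNN better is the empirical question the paper studies.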