Machine learning for hydrologic sciences: An introductory overview

T Xu, F Liang - Wiley Interdisciplinary Reviews: Water, 2021 - Wiley Online Library
The hydrologic community has experienced a surge in interest in machine learning in recent
years. This interest is primarily driven by rapidly growing hydrologic data repositories, as …

Resmlp: Feedforward networks for image classification with data-efficient training

H Touvron, P Bojanowski, M Caron… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
We present ResMLP, an architecture built entirely upon multi-layer perceptrons for image
classification. It is a simple residual network that alternates (i) a linear layer in which image …
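
Since the snippet cuts off mid-description, the following minimal PyTorch sketch (our own illustration, not the authors' released code) shows the alternation the abstract refers to: a linear layer mixing information across image patches, followed by a two-layer MLP applied to each patch independently, each wrapped in a residual connection. The patch-embedding step, the per-block LayerScale parameters, and the dimensions num_patches and dim are assumptions here.

import torch
import torch.nn as nn

class Affine(nn.Module):
    # Per-channel scale and shift, used in ResMLP in place of LayerNorm.
    def __init__(self, dim):
        super().__init__()
        self.alpha = nn.Parameter(torch.ones(dim))
        self.beta = nn.Parameter(torch.zeros(dim))

    def forward(self, x):
        return self.alpha * x + self.beta

class ResMLPBlock(nn.Module):
    # Alternates (i) a linear layer acting across the patch axis and
    # (ii) a per-patch two-layer MLP, each with a residual connection.
    def __init__(self, num_patches, dim, expansion=4):
        super().__init__()
        self.norm1 = Affine(dim)
        self.cross_patch = nn.Linear(num_patches, num_patches)
        self.norm2 = Affine(dim)
        self.cross_channel = nn.Sequential(
            nn.Linear(dim, expansion * dim),
            nn.GELU(),
            nn.Linear(expansion * dim, dim),
        )

    def forward(self, x):  # x: (batch, num_patches, dim)
        y = self.norm1(x)
        # Mix information across patches by applying the linear layer on the patch axis.
        x = x + self.cross_patch(y.transpose(1, 2)).transpose(1, 2)
        # Mix information across channels within each patch.
        x = x + self.cross_channel(self.norm2(x))
        return x

A usage example: block = ResMLPBlock(num_patches=196, dim=384); out = block(torch.randn(2, 196, 384)) keeps the input shape, so blocks can be stacked directly.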

Fine-grained analysis of optimization and generalization for overparameterized two-layer neural networks

S Arora, S Du, W Hu, Z Li… - … Conference on Machine …, 2019 - proceedings.mlr.press
Recent works have shed some light on why deep nets can fit arbitrary data and still generalize despite being heavily overparameterized. This paper analyzes training and …

Recent advances in deep learning theory

F He, D Tao - arXiv preprint arXiv:2012.10931, 2020 - arxiv.org
Deep learning is often described as an experiment-driven field that faces continual criticism for lacking theoretical foundations. This gap has been partially addressed by a large volume of …

DCGAN-based data augmentation for tomato leaf disease identification

Q Wu, Y Chen, J Meng - IEEE Access, 2020 - ieeexplore.ieee.org
Tomato leaf diseases seriously reduce tomato yield, and identifying them is vital for the agricultural economy. Traditional data augmentation methods, such …
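
As a concrete illustration of the GAN-based augmentation such work builds on, here is a minimal sketch of a DCGAN-style generator in PyTorch. It is not the authors' network; the latent size (nz=100), feature widths, and 64x64 RGB output are assumptions following the original DCGAN recipe.

import torch
import torch.nn as nn

class DCGANGenerator(nn.Module):
    # Maps a latent vector z to a 64x64 RGB image via strided transposed convolutions.
    def __init__(self, nz=100, ngf=64, nc=3):
        super().__init__()
        self.net = nn.Sequential(
            # (nz, 1, 1) -> (ngf*8, 4, 4)
            nn.ConvTranspose2d(nz, ngf * 8, 4, 1, 0, bias=False),
            nn.BatchNorm2d(ngf * 8),
            nn.ReLU(True),
            # (ngf*8, 4, 4) -> (ngf*4, 8, 8)
            nn.ConvTranspose2d(ngf * 8, ngf * 4, 4, 2, 1, bias=False),
            nn.BatchNorm2d(ngf * 4),
            nn.ReLU(True),
            # (ngf*4, 8, 8) -> (ngf*2, 16, 16)
            nn.ConvTranspose2d(ngf * 4, ngf * 2, 4, 2, 1, bias=False),
            nn.BatchNorm2d(ngf * 2),
            nn.ReLU(True),
            # (ngf*2, 16, 16) -> (ngf, 32, 32)
            nn.ConvTranspose2d(ngf * 2, ngf, 4, 2, 1, bias=False),
            nn.BatchNorm2d(ngf),
            nn.ReLU(True),
            # (ngf, 32, 32) -> (nc, 64, 64), pixel values in [-1, 1]
            nn.ConvTranspose2d(ngf, nc, 4, 2, 1, bias=False),
            nn.Tanh(),
        )

    def forward(self, z):
        return self.net(z)

# Usage: sample synthetic leaf images to augment a small training set.
# generator = DCGANGenerator()
# fake_images = generator(torch.randn(16, 100, 1, 1))  # shape (16, 3, 64, 64)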

Using convolutional neural network for predicting cyanobacteria concentrations in river water

JC Pyo, LJ Park, Y Pachepsky, SS Baek, K Kim… - Water Research, 2020 - Elsevier
Machine learning modeling techniques have emerged as a potential means for
predicting algal blooms. In this study, synthetic spatio-temporal water quality data for a river …

Theoretical analysis of the inductive biases in deep convolutional networks

Z Wang, L Wu - Advances in Neural Information Processing …, 2023 - proceedings.neurips.cc
In this paper, we provide a theoretical analysis of the inductive biases in convolutional
neural networks (CNNs). We start by examining the universality of CNNs, i.e., the ability to …

A high-quality rice leaf disease image data augmentation method based on a dual GAN

Z Zhang, Q Gao, L Liu, Y He - IEEE Access, 2023 - ieeexplore.ieee.org
Deep learning models need sufficient training samples to support the training
process; otherwise, overfitting occurs and the model fails. However, in the field of …

The exact sample complexity gain from invariances for kernel regression

B Tahmasebi, S Jegelka - Advances in Neural Information …, 2023 - proceedings.neurips.cc
In practice, encoding invariances into models improves sample complexity. In this work, we
study this phenomenon from a theoretical perspective. In particular, we provide minimax …
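
For orientation, the flavor of minimax statement at stake can be sketched as follows. Only the classical nonparametric rate is reproduced here; the exact form of the gain from invariances, and its constants, are what the paper establishes and are not restated.

\[
  \inf_{\hat f} \; \sup_{f \in \mathcal{F}_s} \; \mathbb{E}\,\lVert \hat f - f \rVert_{L^2}^2 \;\asymp\; n^{-\frac{2s}{2s + d}},
\]

where $n$ is the sample size, $s$ the smoothness of the function class $\mathcal{F}_s$, and $d$ the input dimension. Encoding an invariance (e.g., under a group acting on the inputs) restricts the estimator to a smaller, symmetric function class, and the paper quantifies exactly how much this restriction improves the attainable rate.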

Generalization bounds for deep convolutional neural networks

PM Long, H Sedghi - arXiv preprint arXiv:1905.12600, 2019 - arxiv.org
We prove bounds on the generalization error of convolutional networks. The bounds are in
terms of the training loss, the number of parameters, the Lipschitz constant of the loss and …
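
As a hedged sketch of the general shape of such results (not the paper's actual bound), parameter-count-based generalization guarantees typically state that, with probability at least $1-\delta$ over an i.i.d. sample of size $n$,

\[
  L(f) \;\lesssim\; \widehat{L}_n(f) \;+\; \sqrt{\frac{C(W, \mathrm{Lip}) + \log(1/\delta)}{n}},
\]

where $\widehat{L}_n$ is the training loss, $W$ the number of parameters, and $C(W, \mathrm{Lip})$ a complexity term depending on the parameter count and the relevant Lipschitz constants. The contribution of work in this line is the specific, tighter form of $C$ for convolutional architectures.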