Complex-valued neural networks: A comprehensive survey
Complex-valued neural networks (CVNNs) have shown excellent efficiency compared
to their real counterparts in speech enhancement, image and signal processing …
Normalization techniques in training DNNs: Methodology, analysis and application
Normalization techniques are essential for accelerating the training and improving the
generalization of deep neural networks (DNNs), and have successfully been used in various …
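As a concrete instance of what these techniques compute, here is a minimal NumPy sketch of batch normalization's forward pass, the best-known member of the family (parameter names `gamma` and `beta` follow common convention, not this particular paper):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Normalize each feature over the batch, then rescale and shift.

    x: (batch, features) activations; gamma, beta: (features,) learned params.
    """
    mean = x.mean(axis=0)                    # per-feature batch mean
    var = x.var(axis=0)                      # per-feature batch variance
    x_hat = (x - mean) / np.sqrt(var + eps)  # zero mean, unit variance
    return gamma * x_hat + beta              # restore representational freedom

x = np.random.randn(32, 8)
y = batch_norm(x, gamma=np.ones(8), beta=np.zeros(8))
print(y.mean(axis=0).round(6), y.std(axis=0).round(3))  # ~0 and ~1
```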
Resurrecting recurrent neural networks for long sequences
Recurrent Neural Networks (RNNs) offer fast inference on long sequences but are
hard to optimize and slow to train. Deep state-space models (SSMs) have recently been …
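The SSM line of work this paper builds on replaces the nonlinear RNN update with a linear recurrence whose state transition is diagonal, which is what makes long sequences tractable. A toy NumPy sketch of such a recurrence (an illustration of the mechanism, not the paper's Linear Recurrent Unit):

```python
import numpy as np

def linear_diagonal_rnn(x, lam, B):
    """Run h_t = lam * h_{t-1} + B @ x_t with a diagonal state transition.

    x: (T, d_in) real inputs; lam: (d_h,) complex eigenvalues (|lam| < 1
    for stability); B: (d_h, d_in) input projection.
    """
    hs = np.zeros((len(x), len(lam)), dtype=complex)
    h = np.zeros(len(lam), dtype=complex)
    for t, x_t in enumerate(x):
        h = lam * h + B @ x_t          # elementwise recurrence, no W @ h
        hs[t] = h
    return hs

rng = np.random.default_rng(0)
lam = 0.95 * np.exp(1j * rng.uniform(0, np.pi, 8))   # stable complex poles
hs = linear_diagonal_rnn(rng.standard_normal((16, 4)), lam,
                         rng.standard_normal((8, 4)))
```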
Parameterized quantum circuits as machine learning models
Hybrid quantum–classical systems make it possible to utilize existing quantum computers to
their fullest extent. Within this framework, parameterized quantum circuits can be regarded …
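To make "circuit as model" concrete, here is a self-contained NumPy toy: one qubit rotated by a trainable angle, with the Z expectation value as the model output (my illustration, not an example from the paper):

```python
import numpy as np

def ry(theta):
    """Single-qubit rotation about the Y axis."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def pqc_output(theta):
    """Prepare |0>, apply RY(theta), return <Z> as the model's prediction."""
    state = ry(theta) @ np.array([1.0, 0.0])   # |psi> = RY(theta)|0>
    z = np.array([[1.0, 0.0], [0.0, -1.0]])    # Pauli-Z observable
    return state @ z @ state                   # <psi|Z|psi> = cos(theta)

print(pqc_output(0.0), pqc_output(np.pi))  # 1.0, -1.0
```

Training such a model means tuning `theta` (by gradient or gradient-free methods) so the expectation value fits the data, exactly as one would tune a weight in a classical network.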
An empirical evaluation of generic convolutional and recurrent networks for sequence modeling
For most deep learning practitioners, sequence modeling is synonymous with recurrent
networks. Yet recent results indicate that convolutional architectures can outperform …
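The convolutional architectures in question are built from causal, dilated 1-D convolutions, which see only past inputs and grow their receptive field exponentially with depth. A minimal NumPy sketch of one such layer (a generic illustration of the mechanism, not the paper's TCN code):

```python
import numpy as np

def causal_dilated_conv1d(x, w, dilation=1):
    """y[t] = sum_k w[k] * x[t - k*dilation], using only past inputs.

    x: (T,) signal; w: (K,) kernel. Left-padding keeps the output causal
    and the same length as the input.
    """
    K = len(w)
    pad = (K - 1) * dilation
    xp = np.concatenate([np.zeros(pad), x])
    return np.array([sum(w[k] * xp[pad + t - k * dilation] for k in range(K))
                     for t in range(len(x))])

x = np.arange(8.0)
print(causal_dilated_conv1d(x, np.array([1.0, -1.0]), dilation=2))
# [0. 1. 2. 2. 2. 2. 2. 2.]  -- a dilated difference filter
```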
Concept whitening for interpretable image recognition
What does a neural network encode about a concept as we traverse through the layers?
Interpretability in machine learning is undoubtedly important, but the calculations of neural …
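The whitening in the title is the classical ZCA transform, which decorrelates latent activations so that individual axes can be aligned with concepts. A generic NumPy sketch of that step (the paper's module additionally learns an orthogonal rotation, omitted here):

```python
import numpy as np

def zca_whiten(x, eps=1e-5):
    """Return activations with (approximately) identity covariance.

    x: (n, d) activations. Applies W = Sigma^{-1/2} via eigendecomposition.
    """
    xc = x - x.mean(axis=0)
    cov = xc.T @ xc / (len(x) - 1)
    vals, vecs = np.linalg.eigh(cov)
    w = vecs @ np.diag(1.0 / np.sqrt(vals + eps)) @ vecs.T  # Sigma^{-1/2}
    return xc @ w

x = np.random.randn(500, 4) @ np.random.randn(4, 4)  # correlated features
z = zca_whiten(x)
print(np.cov(z.T).round(2))  # ~identity matrix
```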
Independently recurrent neural network (IndRNN): Building a longer and deeper RNN
Recurrent neural networks (RNNs) have been widely used for processing sequential data.
However, RNNs are commonly difficult to train due to the well-known gradient vanishing and …
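The paper's central change is compact: the hidden-to-hidden connection becomes elementwise, h_t = act(W x_t + u * h_{t-1} + b), so each neuron recurs only with itself and the recurrent gradient can be controlled per neuron. A minimal NumPy sketch of one step under that formula:

```python
import numpy as np

def indrnn_step(x_t, h_prev, W, u, b):
    """One IndRNN step: elementwise (not matrix) recurrent connection.

    W: (d_h, d_in) input weights; u: (d_h,) per-neuron recurrent weights;
    ReLU is the activation used in the paper.
    """
    return np.maximum(0.0, W @ x_t + u * h_prev + b)

d_in, d_h = 4, 8
W = np.random.randn(d_h, d_in)
u, b = np.random.uniform(0, 1, d_h), np.zeros(d_h)
h = np.zeros(d_h)
for x_t in np.random.randn(16, d_in):   # unroll over a length-16 sequence
    h = indrnn_step(x_t, h, W, u, b)
```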
Regularizing and optimizing LSTM language models
Recurrent neural networks (RNNs), such as long short-term memory networks (LSTMs),
serve as a fundamental building block for many sequence learning tasks, including machine …
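Among the regularizers this line of work popularized is DropConnect applied to the LSTM's hidden-to-hidden weights rather than to activations. A minimal NumPy sketch of that masking step (illustrative only; the full recipe also pairs it with techniques such as averaged SGD):

```python
import numpy as np

def weight_drop(w_hh, p, rng):
    """Zero out recurrent weights with probability p, rescaling survivors
    so the expected pre-activation is unchanged (inverted dropout)."""
    mask = rng.random(w_hh.shape) >= p
    return w_hh * mask / (1.0 - p)

rng = np.random.default_rng(0)
w_hh = rng.standard_normal((256, 256))
w_used = weight_drop(w_hh, p=0.5, rng=rng)  # resampled once per forward pass
```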
Deep complex networks
At present, the vast majority of building blocks, techniques, and architectures for deep
learning are based on real-valued operations and representations. However, recent work on …
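Every complex building block in the paper rests on one identity: (A + iB)(x + iy) = (Ax - By) + i(Bx + Ay). A NumPy sketch of a complex-valued linear layer stored as two real matrices (a minimal rendering of the idea, not the paper's released code):

```python
import numpy as np

def complex_linear(x_re, x_im, A, B):
    """(A + iB)(x_re + i*x_im): one complex matmul as four real ones."""
    return A @ x_re - B @ x_im, B @ x_re + A @ x_im

A, B = np.random.randn(3, 5), np.random.randn(3, 5)
x_re, x_im = np.random.randn(5), np.random.randn(5)
y_re, y_im = complex_linear(x_re, x_im, A, B)

# Sanity check against native complex arithmetic:
y = (A + 1j * B) @ (x_re + 1j * x_im)
assert np.allclose(y_re, y.real) and np.allclose(y_im, y.imag)
```

The same decomposition extends to convolutions, which is how the paper builds complex convolutional layers out of standard real-valued ones.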
Phase-aware speech enhancement with deep complex U-Net
Most deep learning-based models for speech enhancement have mainly focused on
estimating the magnitude of the spectrogram while reusing the phase from noisy speech for …
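What "phase-aware" means in practice: instead of masking only the magnitude, the network predicts a complex mask that is multiplied with the complex STFT, correcting phase and magnitude together. A NumPy sketch of the masking step (the mask here is a hand-picked placeholder; in the paper it is the output of the complex U-Net):

```python
import numpy as np

def apply_complex_mask(noisy_stft, mask):
    """Elementwise complex product: scales magnitude AND rotates phase."""
    return mask * noisy_stft

noisy = 2.0 * np.exp(1j * 0.3)            # toy bin: magnitude 2, phase 0.3 rad
mask = 0.5 * np.exp(-1j * 0.3)            # halve magnitude, undo phase shift
enhanced = apply_complex_mask(noisy, mask)
print(abs(enhanced), np.angle(enhanced))  # 1.0, 0.0
```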