Conversational agents in therapeutic interventions for neurodevelopmental disorders: a survey
Neurodevelopmental Disorders (NDD) are a group of conditions with onset in the
developmental period characterized by deficits in the cognitive and social areas …
Neural machine translation: A review
F Stahlberg - Journal of Artificial Intelligence Research, 2020 - jair.org
The field of machine translation (MT), the automatic translation of written text from one
natural language into another, has experienced a major paradigm shift in recent years …
S2-MLP: Spatial-shift MLP architecture for vision
Recently, visual Transformer (ViT) and its following works abandon the convolution
and exploit the self-attention operation, attaining a comparable or even higher accuracy than …
Conv-TasNet: Surpassing ideal time–frequency magnitude masking for speech separation
Single-channel, speaker-independent speech separation methods have recently seen great
progress. However, the accuracy, latency, and computational cost of such methods remain …
Learning discriminative features by covering local geometric space for point cloud analysis
At present, effectively aggregating and transferring the local features of point clouds is still an
unresolved technological conundrum. In this study, we propose a new space-cover …
Universal transformers
Recurrent neural networks (RNNs) sequentially process data by updating their state with
each new data point, and have long been the de facto choice for sequence modeling tasks …
QANet: Combining local convolution with global self-attention for reading comprehension
Current end-to-end machine reading and question answering (Q&A) models are primarily
based on recurrent neural networks (RNNs) with attention. Despite their success, these …
Non-autoregressive neural machine translation
Existing approaches to neural machine translation condition each output word on previously
generated outputs. We introduce a model that avoids this autoregressive property and …
Lite transformer with long-short range attention
Transformer has become ubiquitous in natural language processing (e.g., machine
translation, question answering); however, it requires an enormous amount of computation to …
Tensor2Tensor for neural machine translation
arXiv:1803.07416v1 [cs.LG], 16 Mar 2018
Ashish Vaswani1, Samy Bengio1, Eugene Brevdo1, Francois Chollet1, Aidan N. Gomez1 …