Conversational agents in therapeutic interventions for neurodevelopmental disorders: a survey

F Catania, M Spitale, F Garzotto - ACM Computing Surveys, 2023 - dl.acm.org
Neurodevelopmental Disorders (NDD) are a group of conditions with onset in the
developmental period characterized by deficits in the cognitive and social areas …

Neural machine translation: A review

F Stahlberg - Journal of Artificial Intelligence Research, 2020 - jair.org
The field of machine translation (MT), the automatic translation of written text from one
natural language into another, has experienced a major paradigm shift in recent years …

S2-mlp: Spatial-shift mlp architecture for vision

T Yu, X Li, Y Cai, M Sun, P Li - Proceedings of the IEEE/CVF …, 2022 - openaccess.thecvf.com
Recently, visual Transformer (ViT) and its following works abandon the convolution
and exploit the self-attention operation, attaining a comparable or even higher accuracy than …

Conv-tasnet: Surpassing ideal time–frequency magnitude masking for speech separation

Y Luo, N Mesgarani - IEEE/ACM Transactions on Audio, Speech …, 2019 - ieeexplore.ieee.org
Single-channel, speaker-independent speech separation methods have recently seen great
progress. However, the accuracy, latency, and computational cost of such methods remain …

Learning discriminative features by covering local geometric space for point cloud analysis

C Wang, X Ning, L Sun, L Zhang… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
At present, effectively aggregating and transferring the local features of point cloud is still an
unresolved technological conundrum. In this study, we propose a new space-cover …

Universal transformers

M Dehghani, S Gouws, O Vinyals, J Uszkoreit… - arXiv preprint arXiv …, 2018 - arxiv.org
Recurrent neural networks (RNNs) sequentially process data by updating their state with
each new data point, and have long been the de facto choice for sequence modeling tasks …

Qanet: Combining local convolution with global self-attention for reading comprehension

AW Yu, D Dohan, MT Luong, R Zhao, K Chen… - arXiv preprint arXiv …, 2018 - arxiv.org
Current end-to-end machine reading and question answering (Q&A) models are primarily
based on recurrent neural networks (RNNs) with attention. Despite their success, these …

Non-autoregressive neural machine translation

J Gu, J Bradbury, C Xiong, VOK Li, R Socher - arXiv preprint arXiv …, 2017 - arxiv.org
Existing approaches to neural machine translation condition each output word on previously
generated outputs. We introduce a model that avoids this autoregressive property and …

Lite transformer with long-short range attention

Z Wu, Z Liu, J Lin, Y Lin, S Han - arXiv preprint arXiv:2004.11886, 2020 - arxiv.org
Transformer has become ubiquitous in natural language processing (e.g., machine
translation, question answering); however, it requires an enormous amount of computation to …

Tensor2tensor for neural machine translation

A Vaswani, S Bengio, E Brevdo, F Chollet… - arXiv preprint arXiv …, 2018 - arxiv.org
arXiv:1803.07416v1 [cs.LG] 16 Mar 2018 …