Liquid structural state-space models

R Hasani, M Lechner, TH Wang, M Chahine… - arXiv preprint arXiv …, 2022 - arxiv.org
A proper parametrization of state transition matrices of linear state-space models (SSMs)
followed by standard nonlinearities enables them to efficiently learn representations from …

Learning long-term dependencies in irregularly-sampled time series

M Lechner, R Hasani - arXiv preprint arXiv:2006.04418, 2020 - arxiv.org
Recurrent neural networks (RNNs) with continuous-time hidden states are a natural fit for
modeling irregularly-sampled time series. These models, however, face difficulties when the …

Scene graph semantic inference for image and text matching

J Pei, K Zhong, Z Yu, L Wang… - ACM Transactions on …, 2023 - dl.acm.org
With the rapid development of information technology, image and text data have increased
dramatically. Image and text matching techniques enable computers to understand …

An optimized deep learning approach for detecting fraudulent transactions

S El Kafhali, M Tayebi, H Sulimani - Information, 2024 - mdpi.com
The proliferation of new technologies and advancements in existing ones are altering our
perspective of the world. So, continuous improvements are needed. A connected world filled …

Inductive synthesis of finite-state controllers for POMDPs

R Andriushchenko, M Češka… - Uncertainty in …, 2022 - proceedings.mlr.press
We present a novel learning framework to obtain finite-state controllers (FSCs) for partially
observable Markov decision processes and illustrate its applicability for indefinite-horizon …

Deep learning for volatility forecasting in asset management

A Petrozziello, L Troiano, A Serra, I Jordanov, G Storti… - Soft Computing, 2022 - Springer
Predicting volatility is a critical activity for taking risk-adjusted decisions in asset trading and
allocation. In order to provide effective decision-making support, in this paper we investigate …

On interpretability of artificial neural networks

F Fan, J Xiong, G Wang - arXiv preprint arXiv:2001.02522, 2020 - researchgate.net
Deep learning has achieved great successes in many important areas dealing with text,
images, video, graphs, and so on. However, the black-box nature of deep artificial neural …

Weighted automata extraction from recurrent neural networks via regression on state spaces

T Okudono, M Waga, T Sekiyama… - Proceedings of the AAAI …, 2020 - ojs.aaai.org
We present a method to extract a weighted finite automaton (WFA) from a recurrent neural
network (RNN). Our method is based on the WFA learning algorithm by Balle and Mohri …

Transformer uncertainty estimation with hierarchical stochastic attention

J Pei, C Wang, G Szarvas - Proceedings of the AAAI Conference on …, 2022 - ojs.aaai.org
Transformers are state-of-the-art in a wide range of NLP tasks and have also been applied
to many real-world products. Understanding the reliability and certainty of transformer …

On the computational complexity and formal hierarchy of second order recurrent neural networks

A Mali, A Ororbia, D Kifer, L Giles - arXiv preprint arXiv:2309.14691, 2023 - arxiv.org
Artificial neural networks (ANNs) with recurrence and self-attention have been shown to be
Turing-complete (TC). However, existing work has shown that these ANNs require multiple …