Explainable artificial intelligence: a comprehensive review

D Minh, HX Wang, YF Li, TN Nguyen - Artificial Intelligence Review, 2022 - Springer
Thanks to the exponential growth in computing power and vast amounts of data, artificial
intelligence (AI) has witnessed remarkable developments in recent years, enabling it to be …

A survey of deep active learning

P Ren, Y Xiao, X Chang, PY Huang, Z Li… - ACM Computing …, 2021 - dl.acm.org
Active learning (AL) attempts to maximize a model's performance gain while annotating the
fewest samples possible. Deep learning (DL) is greedy for data and requires a large amount …
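The core idea in the snippet above — label the fewest samples for the largest gain — is often illustrated with uncertainty sampling: query the pool points whose predictions are least confident. A minimal sketch (generic baseline, not the specific methods surveyed in the paper):

```python
import numpy as np

def uncertainty_sample(probs, k):
    """Pick the k pool indices whose predicted class distribution
    has the highest entropy (least-confident predictions)."""
    probs = np.asarray(probs, dtype=float)
    entropy = -np.sum(probs * np.log(probs + 1e-12), axis=1)
    return np.argsort(-entropy)[:k]

# Toy pool of 4 unlabeled points with model-predicted class probabilities.
pool_probs = [
    [0.98, 0.02],  # confident
    [0.55, 0.45],  # uncertain
    [0.50, 0.50],  # most uncertain
    [0.90, 0.10],  # confident
]
picked = uncertainty_sample(pool_probs, k=2)  # → indices [2, 1]
```

In a real loop these two indices would be sent to an annotator, the labels added to the training set, and the model retrained before the next query round.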

Explainable Artificial Intelligence (XAI): What we know and what is left to attain Trustworthy Artificial Intelligence

S Ali, T Abuhmed, S El-Sappagh, K Muhammad… - Information fusion, 2023 - Elsevier
Artificial intelligence (AI) is currently being utilized in a wide range of sophisticated
applications, but the outcomes of many AI models are challenging to comprehend and trust …

GCR: Gradient coreset based replay buffer selection for continual learning

R Tiwari, K Killamsetty, R Iyer… - Proceedings of the …, 2022 - openaccess.thecvf.com
Continual learning (CL) aims to develop techniques by which a single model adapts to an
increasing number of tasks encountered sequentially, thereby potentially leveraging …

Selection via proxy: Efficient data selection for deep learning

C Coleman, C Yeh, S Mussmann… - arXiv preprint arXiv …, 2019 - arxiv.org
Data selection methods, such as active learning and core-set selection, are useful tools for
machine learning on large datasets. However, they can be prohibitively expensive to apply …

Coresets via bilevel optimization for continual learning and streaming

Z Borsos, M Mutny, A Krause - Advances in neural …, 2020 - proceedings.neurips.cc
Coresets are small data summaries that are sufficient for model training. They can be
maintained online, enabling efficient handling of large data streams under resource …
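"Small data summaries sufficient for model training" can be made concrete with a simple greedy k-center selection — a common coreset baseline, not the bilevel-optimization construction of the paper above:

```python
import numpy as np

def k_center_greedy(X, k):
    """Greedy k-center coreset: start from one point, then repeatedly
    add the point farthest from the current selection, so the coreset
    covers the dataset's geometry with few representatives."""
    X = np.asarray(X, dtype=float)
    selected = [0]  # seed with the first point
    dists = np.linalg.norm(X - X[0], axis=1)
    while len(selected) < k:
        nxt = int(np.argmax(dists))        # farthest remaining point
        selected.append(nxt)
        dists = np.minimum(dists, np.linalg.norm(X - X[nxt], axis=1))
    return selected

# Two well-separated clusters: a 2-point coreset picks one from each.
pts = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
core = k_center_greedy(pts, k=2)  # → [0, 3]
```

The same greedy structure is what streaming and continual-learning variants maintain online, updating the selection as new data arrives.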

Optimal experimental design: Formulations and computations

X Huan, J Jagalur, Y Marzouk - Acta Numerica, 2024 - cambridge.org
Questions of 'how best to acquire data' are essential to modelling and prediction in the
natural and social sciences, engineering applications, and beyond. Optimal experimental …

Random features for kernel approximation: A survey on algorithms, theory, and beyond

F Liu, X Huang, Y Chen… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
The class of random features is one of the most popular techniques to speed up kernel
methods in large-scale problems. Related works have been recognized by the NeurIPS Test …
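The speed-up the snippet refers to comes from replacing an n-by-n kernel matrix with an explicit low-dimensional feature map. A minimal random Fourier features sketch for the RBF kernel k(x, y) = exp(-gamma * ||x - y||^2), in the Rahimi–Recht style the survey covers (parameter choices here are illustrative):

```python
import numpy as np

def rff_features(X, D, gamma, rng):
    """Map X to D random Fourier features whose inner products
    approximate the RBF kernel exp(-gamma * ||x - y||^2)."""
    d = X.shape[1]
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, D))  # spectral samples
    b = rng.uniform(0, 2 * np.pi, size=D)                  # random phases
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
Z = rff_features(X, D=2000, gamma=0.5, rng=rng)

# Compare Z @ Z.T against the exact RBF kernel matrix.
K_exact = np.exp(-0.5 * np.sum((X[:, None] - X[None]) ** 2, axis=-1))
err = np.abs(K_exact - Z @ Z.T).max()  # small for large D
```

Training a linear model on Z then costs O(nD) per pass instead of the O(n^2) (or worse) of exact kernel methods, which is the large-scale setting the survey targets.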

Bayesian batch active learning as sparse subset approximation

R Pinsler, J Gordon, E Nalisnick… - Advances in neural …, 2019 - proceedings.neurips.cc
Leveraging the wealth of unlabeled data produced in recent years provides great potential
for improving supervised models. When the cost of acquiring labels is high, probabilistic …

Can Public Large Language Models Help Private Cross-device Federated Learning?

B Wang, YJ Zhang, Y Cao, B Li, HB McMahan… - arXiv preprint arXiv …, 2023 - arxiv.org
We study (differentially) private federated learning (FL) of language models. The language
models in cross-device FL are relatively small, which can be trained with meaningful formal …