Explainable artificial intelligence: a comprehensive review
Thanks to the exponential growth in computing power and vast amounts of data, artificial
intelligence (AI) has witnessed remarkable developments in recent years, enabling it to be …
A survey of deep active learning
Active learning (AL) attempts to maximize a model's performance gain while annotating the
fewest samples possible. Deep learning (DL) is greedy for data and requires a large amount …
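As a rough illustration of the pool-based active learning loop surveyed above, the sketch below performs uncertainty (least-confidence) sampling with a scikit-learn classifier; the synthetic pool, seed set, and query budget are assumptions, not the survey's own method.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Minimal pool-based active learning with uncertainty sampling (generic sketch).
rng = np.random.default_rng(0)
X_pool = rng.normal(size=(1000, 20))
y_pool = (X_pool[:, 0] + 0.5 * rng.normal(size=1000) > 0).astype(int)

labeled = list(rng.choice(len(X_pool), size=20, replace=False))   # small seed set
unlabeled = [i for i in range(len(X_pool)) if i not in labeled]

model = LogisticRegression(max_iter=1000)
for _ in range(10):                                 # 10 query rounds
    model.fit(X_pool[labeled], y_pool[labeled])
    probs = model.predict_proba(X_pool[unlabeled])
    # Query the pool point whose predicted probability is closest to 0.5
    # (least confident), then "annotate" it by revealing its label.
    query = unlabeled[int(np.argmin(np.abs(probs[:, 1] - 0.5)))]
    labeled.append(query)
    unlabeled.remove(query)
```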
Explainable Artificial Intelligence (XAI): What we know and what is left to attain Trustworthy Artificial Intelligence
Artificial intelligence (AI) is currently being utilized in a wide range of sophisticated
applications, but the outcomes of many AI models are challenging to comprehend and trust …
GCR: Gradient coreset based replay buffer selection for continual learning
Continual learning (CL) aims to develop techniques by which a single model adapts to an
increasing number of tasks encountered sequentially, thereby potentially leveraging …
Selection via proxy: Efficient data selection for deep learning
Data selection methods, such as active learning and core-set selection, are useful tools for
machine learning on large datasets. However, they can be prohibitively expensive to apply …
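A minimal sketch of the proxy idea, assuming a cheap linear model as the proxy and margin-based uncertainty as the ranking score; the pool, seed set, and selection budget below are illustrative, not the paper's exact procedure.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Rank a large pool with a cheap proxy model and keep only the hardest points
# for the expensive target model (generic illustration).
rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 50))
y = (X @ rng.normal(size=50) > 0).astype(int)

proxy = LogisticRegression(max_iter=500).fit(X[:500], y[:500])  # cheap proxy on a seed set
margin = np.abs(proxy.predict_proba(X)[:, 1] - 0.5)             # low margin = uncertain
selected = np.argsort(margin)[:1000]                            # keep the 1000 hardest points
# A larger target model would then be trained on X[selected], y[selected].
```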
Coresets via bilevel optimization for continual learning and streaming
Coresets are small data summaries that are sufficient for model training. They can be
maintained online, enabling efficient handling of large data streams under resource …
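For intuition, a greedy k-center heuristic is one simple way to build such a small data summary; the sketch below is a generic illustration of coreset selection, not the bilevel-optimization method of the paper.

```python
import numpy as np

def k_center_coreset(X, k, seed=0):
    """Greedy k-center selection: repeatedly add the point farthest from the
    current centers. A simple, generic coreset heuristic."""
    rng = np.random.default_rng(seed)
    centers = [int(rng.integers(len(X)))]
    dists = np.linalg.norm(X - X[centers[0]], axis=1)
    for _ in range(k - 1):
        nxt = int(np.argmax(dists))                      # farthest remaining point
        centers.append(nxt)
        dists = np.minimum(dists, np.linalg.norm(X - X[nxt], axis=1))
    return centers

X = np.random.default_rng(1).normal(size=(2000, 16))
coreset_idx = k_center_coreset(X, k=50)   # 50-point summary of 2000 points
```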
Optimal experimental design: Formulations and computations
Questions of 'how best to acquire data' are essential to modelling and prediction in the
natural and social sciences, engineering applications, and beyond. Optimal experimental …
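For concreteness, one classical formulation is D-optimal design for a linear model; this particular criterion is included as an illustrative assumption, not necessarily the formulation developed in the paper.

```latex
% D-optimal design: choose weights w over candidate inputs x_1, ..., x_n to
% maximize the information gained about the parameters \theta of a linear
% model y = x^\top \theta + \varepsilon.
\begin{aligned}
\max_{w \in \mathbb{R}^n} \quad & \log\det\!\Big(\sum_{i=1}^{n} w_i\, x_i x_i^{\top}\Big) \\
\text{s.t.} \quad & w_i \ge 0, \qquad \sum_{i=1}^{n} w_i = 1 .
\end{aligned}
```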
Random features for kernel approximation: A survey on algorithms, theory, and beyond
The class of random features is one of the most popular techniques to speed up kernel
methods in large-scale problems. Related works have been recognized by the NeurIPS Test …
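As a concrete example of the technique, random Fourier features approximate a Gaussian (RBF) kernel with an explicit finite-dimensional feature map; the dimensions and unit bandwidth below are illustrative assumptions.

```python
import numpy as np

# Random Fourier features for the RBF kernel k(x, z) = exp(-||x - z||^2 / 2):
# z(x) = sqrt(2/D) * cos(W x + b) with W ~ N(0, I), b ~ Uniform[0, 2*pi),
# so that z(x)^T z(z') approximates k(x, z').
rng = np.random.default_rng(0)
d, D = 10, 2000                       # input dim, number of random features
W = rng.normal(size=(D, d))           # frequencies from the Gaussian spectral density
b = rng.uniform(0, 2 * np.pi, size=D)

def features(X):
    return np.sqrt(2.0 / D) * np.cos(X @ W.T + b)

x, z = rng.normal(size=(2, d))
approx = features(x[None]) @ features(z[None]).T   # approximate kernel value
exact = np.exp(-0.5 * np.sum((x - z) ** 2))        # exact RBF kernel value
```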
Bayesian batch active learning as sparse subset approximation
Leveraging the wealth of unlabeled data produced in recent years provides great potential
for improving supervised models. When the cost of acquiring labels is high, probabilistic …
Can Public Large Language Models Help Private Cross-device Federated Learning?
We study (differentially) private federated learning (FL) of language models. The language
models in cross-device FL are relatively small, which can be trained with meaningful formal …