Feature-wise attention based boosting ensemble method for fraud detection

R Cao, J Wang, M Mao, G Liu, C Jiang - Engineering Applications of …, 2023 - Elsevier
Transaction fraud detection is an essential topic in financial research, protecting customers
and financial institutions from significant financial losses. The existing ensemble …

Optimal weak to strong learning

K Green Larsen, M Ritzert - Advances in Neural Information …, 2022 - proceedings.neurips.cc
The classic AdaBoost algorithm converts a weak learner, that is, an algorithm that
produces a hypothesis slightly better than chance, into a strong learner, achieving …
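The weak-to-strong conversion this abstract refers to can be sketched with a minimal from-scratch AdaBoost over decision stumps. This is an illustrative toy (dataset, round count, and helper names are made up here), not the paper's construction:

```python
import numpy as np

def train_stump(X, y, w):
    """Best single-feature threshold classifier under sample weights w."""
    best = (0, 0.0, 1, np.inf)  # (feature, threshold, polarity, weighted error)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(X[:, j] <= t, pol, -pol)
                err = w[pred != y].sum()
                if err < best[3]:
                    best = (j, t, pol, err)
    return best

def adaboost(X, y, rounds=5):
    """AdaBoost: reweight examples so each new stump focuses on past mistakes."""
    w = np.full(len(y), 1.0 / len(y))
    ensemble = []
    for _ in range(rounds):
        j, t, pol, err = train_stump(X, y, w)
        err = max(err, 1e-12)                  # guard against log(0)
        alpha = 0.5 * np.log((1 - err) / err)  # weight of this stump in the vote
        pred = np.where(X[:, j] <= t, pol, -pol)
        w *= np.exp(-alpha * y * pred)         # up-weight misclassified points
        w /= w.sum()
        ensemble.append((alpha, j, t, pol))
    return ensemble

def predict(ensemble, X):
    score = sum(a * np.where(X[:, j] <= t, pol, -pol)
                for a, j, t, pol in ensemble)
    return np.sign(score)

# Toy "interval" labels that no single stump can fit, but five boosted stumps can:
X = np.array([[1.], [2.], [3.], [4.], [5.], [6.], [7.], [8.]])
y = np.array([-1, -1, 1, 1, 1, 1, -1, -1])
ensemble = adaboost(X, y, rounds=5)
print(np.mean(predict(ensemble, X) == y))  # 1.0 (a single stump caps at 0.75)
```

Each round the weak learner only needs weighted error below 1/2; the exponential reweighting then drives the training error of the weighted vote toward zero.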

Near-tight margin-based generalization bounds for support vector machines

A Grønlund, L Kamma… - … Conference on Machine …, 2020 - proceedings.mlr.press
Support Vector Machines (SVMs) are among the most fundamental tools for binary
classification. In its simplest formulation, an SVM produces a hyperplane separating two …
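For intuition, the geometric margin behind such bounds is just the label-signed distance to the separating hyperplane. A small sketch with a hand-picked hyperplane and points (all values illustrative, not from the paper):

```python
import numpy as np

def margins(w, b, X, y):
    # Label-signed distance of each point to the hyperplane w·x + b = 0.
    # All values positive means the data is separated; the minimum is the margin.
    return y * (X @ w + b) / np.linalg.norm(w)

w, b = np.array([1.0, 1.0]), -1.0        # hand-picked hyperplane
X = np.array([[2.0, 2.0], [0.0, 0.0]])   # one point per class
y = np.array([1, -1])
print(margins(w, b, X, y))               # [2.1213..., 0.7071...]
print(margins(w, b, X, y).min())         # geometric margin ≈ 0.7071
```

Margin-based generalization bounds of the kind this paper tightens are stated in terms of how large this minimum (or the whole distribution of these values) is relative to the scale of the data.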

Partial Multi-Label Optimal Margin Distribution Machine.

N Cao, T Zhang, H Jin - IJCAI, 2021 - ijcai.org
Partial multi-label learning deals with the circumstance in which the ground-truth labels are
not directly available but hidden in a candidate label set. Due to the presence of other …

Population risk improvement with model compression: An information-theoretic approach

Y Bu, W Gao, S Zou, VV Veeravalli - Entropy, 2021 - mdpi.com
It has been reported in many recent works on deep model compression that the population
risk of a compressed model can be even better than that of the original model. In this paper …

Adaboost is not an optimal weak to strong learner

MM Høgsgaard, KG Larsen… - … Conference on Machine …, 2023 - proceedings.mlr.press
AdaBoost is a classic boosting algorithm for combining multiple inaccurate classifiers
produced by a weak learner, to produce a strong learner with arbitrarily high accuracy when …

Margins are insufficient for explaining gradient boosting

A Grønlund, L Kamma… - Advances in Neural …, 2020 - proceedings.neurips.cc
Boosting is one of the most successful ideas in machine learning, achieving great practical
performance with little fine-tuning. The success of boosted classifiers is most often attributed …

Multi-objective evolutionary ensemble pruning guided by margin distribution

YC Wu, YX He, C Qian, ZH Zhou - … on Parallel Problem Solving from Nature, 2022 - Springer
Ensemble learning trains and combines multiple base learners for a single learning task,
and has been among the state-of-the-art learning techniques. Ensemble pruning tries to …

The impossibility of parallelizing boosting

A Karbasi, KG Larsen - International Conference on …, 2024 - proceedings.mlr.press
The aim of boosting is to convert a sequence of weak learners into a strong learner. At their
heart, these methods are fully sequential. In this paper, we investigate the possibility of …

Improving generalization of deep neural networks by leveraging margin distribution

SH Lyu, L Wang, ZH Zhou - Neural Networks, 2022 - Elsevier
Recent research has used margin theory to analyze the generalization performance of
deep neural networks (DNNs). The existing results are mostly based on the spectrally …
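Several of the entries above appeal to the margin *distribution* rather than just the minimum margin: for a weighted voting ensemble, the margin of example (x, y) is y times the normalized weighted vote, a value in [-1, 1], and the analyses look at statistics such as its mean and variance. A toy illustration (votes, weights, and labels made up):

```python
import numpy as np

def voting_margins(votes, alphas, y):
    # votes: (n_examples, n_learners) matrix of ±1 predictions;
    # alphas: per-learner vote weights. Normalizing by the total weight
    # puts each margin in [-1, 1]; positive means a correct majority vote.
    score = votes @ alphas / np.sum(np.abs(alphas))
    return y * score

votes = np.array([[ 1,  1, -1],
                  [-1,  1,  1],
                  [-1, -1, -1]])
alphas = np.array([0.5, 0.3, 0.2])
y = np.array([1, 1, -1])

m = voting_margins(votes, alphas, y)
print(m)                   # [0.6, 0.0, 1.0]
print(m.mean(), m.var())   # summary statistics of the margin distribution
```

Minimum-margin bounds only see the worst value (here 0.0); margin-distribution bounds also reward a large mean and small variance across examples.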