Feature-wise attention based boosting ensemble method for fraud detection
R Cao, J Wang, M Mao, G Liu, C Jiang - Engineering Applications of …, 2023 - Elsevier
Transaction fraud detection is an essential topic in financial research, protecting customers
and financial institutions from suffering significant financial losses. The existing ensemble …
Optimal weak to strong learning
K Green Larsen, M Ritzert - Advances in Neural Information …, 2022 - proceedings.neurips.cc
The classic AdaBoost algorithm allows one to convert a weak learner, that is, an algorithm that
produces a hypothesis slightly better than chance, into a strong learner, achieving …
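The weak-to-strong conversion described above can be illustrated with a minimal from-scratch AdaBoost over decision stumps. This is a generic textbook sketch, not the algorithm of any cited paper; the function names (`train_adaboost`, `predict`) and the toy interval dataset are illustrative assumptions.

```python
import numpy as np

def train_adaboost(X, y, n_rounds=20):
    """AdaBoost with axis-aligned decision stumps; labels y must be in {-1, +1}."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)  # uniform example weights to start
    ensemble = []
    for _ in range(n_rounds):
        best = None
        # Exhaustively pick the stump (feature, threshold, sign) with least weighted error.
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for s in (1, -1):
                    pred = s * np.where(X[:, j] <= thr, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, thr, s)
        err, j, thr, s = best
        err = max(err, 1e-12)                    # guard against log(0)
        alpha = 0.5 * np.log((1 - err) / err)    # vote weight of this round's stump
        pred = s * np.where(X[:, j] <= thr, 1, -1)
        w *= np.exp(-alpha * y * pred)           # upweight misclassified examples
        w /= w.sum()
        ensemble.append((alpha, j, thr, s))
    return ensemble

def predict(ensemble, X):
    """Sign of the weighted vote of all stumps."""
    score = sum(a * s * np.where(X[:, j] <= thr, 1, -1)
                for a, j, thr, s in ensemble)
    return np.sign(score)

# Toy 1-D task: positive inside the interval (0.3, 0.7). A single stump is only
# slightly better than chance here, but the boosted combination fits it well.
rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 1))
y = np.where((X[:, 0] > 0.3) & (X[:, 0] < 0.7), 1, -1)
ensemble = train_adaboost(X, y, n_rounds=30)
accuracy = (predict(ensemble, X) == y).mean()
```

The interval target is chosen deliberately: a sum of one-dimensional stumps can represent any union of intervals, so the boosted ensemble recovers a concept that no single stump can express.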
Near-tight margin-based generalization bounds for support vector machines
A Grønlund, L Kamma… - … Conference on Machine …, 2020 - proceedings.mlr.press
Abstract Support Vector Machines (SVMs) are among the most fundamental tools for binary
classification. In its simplest formulation, an SVM produces a hyperplane separating two …
[PDF] Partial Multi-Label Optimal Margin Distribution Machine
Partial multi-label learning deals with the circumstance in which the ground-truth labels are
not directly available but hidden in a candidate label set. Due to the presence of other …
[HTML] Population risk improvement with model compression: An information-theoretic approach
It has been reported in many recent works on deep model compression that the population
risk of a compressed model can be even better than that of the original model. In this paper …
AdaBoost is not an optimal weak to strong learner
MM Høgsgaard, KG Larsen… - … Conference on Machine …, 2023 - proceedings.mlr.press
AdaBoost is a classic boosting algorithm for combining multiple inaccurate classifiers
produced by a weak learner, to produce a strong learner with arbitrarily high accuracy when …
Margins are insufficient for explaining gradient boosting
A Grønlund, L Kamma… - Advances in Neural …, 2020 - proceedings.neurips.cc
Boosting is one of the most successful ideas in machine learning, achieving great practical
performance with little fine-tuning. The success of boosted classifiers is most often attributed …
Multi-objective evolutionary ensemble pruning guided by margin distribution
Ensemble learning trains and combines multiple base learners for a single learning task,
and has been among the state-of-the-art learning techniques. Ensemble pruning tries to …
The impossibility of parallelizing boosting
The aim of boosting is to convert a sequence of weak learners into a strong learner. At their
heart, these methods are fully sequential. In this paper, we investigate the possibility of …
Improving generalization of deep neural networks by leveraging margin distribution
Recent research has used margin theory to analyze the generalization performance of
deep neural networks (DNNs). The existing results are mostly based on the spectrally …