Generalization analysis of machine learning algorithms via the worst-case data-generating probability measure
In this paper, the worst-case probability measure over the data is introduced as a tool for
characterizing the generalization capabilities of machine learning algorithms. More …
Modeling stationary, periodic, and long memory processes by superposed jump-driven processes
H Yoshioka - Chaos, Solitons & Fractals, 2024 - Elsevier
The long memory process is a stochastic process with power-type autocorrelation. Such
processes are found worldwide, and those arising in the environmental sciences often have …
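For context, "long memory" here refers to long-range dependence: the autocorrelation function decays as a power law and is not summable. A standard statement of this property (generic notation, not necessarily Yoshioka's) is
$\rho(k) \sim C\, k^{-\alpha}$ as $k \to \infty$, with $0 < \alpha < 1$, so that $\sum_{k=1}^{\infty} \rho(k) = \infty$.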
The worst-case data-generating probability measure in statistical learning
The worst-case data-generating (WCDG) probability measure is introduced as a tool for
characterizing the generalization capabilities of machine learning algorithms. Such a WCDG …
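A sketch of the variational fact that typically underlies such worst-case (tilted) measures, stated in generic notation and not as the paper's exact definition: maximizing an expected loss penalized by relative entropy to a reference measure $Q$ on the data space,
$\max_{P} \; \int \ell\, dP \;-\; \lambda\, D(P \,\|\, Q), \qquad \lambda > 0,$
is achieved, whenever the exponential moment below is finite, by the exponentially tilted measure
$\frac{dP^{\star}}{dQ}(z) \;=\; \frac{\exp\!\big(\tfrac{1}{\lambda}\,\ell(z)\big)}{\int \exp\!\big(\tfrac{1}{\lambda}\,\ell(u)\big)\, dQ(u)}.$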
On the validation of Gibbs algorithms: Training datasets, test datasets and their aggregation
The dependence on training data of the Gibbs algorithm (GA) is analytically characterized.
By adopting the expected empirical risk as the performance metric, the sensitivity of the GA …
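For reference, the Gibbs algorithm (GA) is the randomized rule that, given a training dataset $z$, samples a model from the Gibbs posterior; in standard notation (the temperature convention may differ from the paper),
$P_{\Theta \mid Z = z}(d\theta) \;\propto\; \exp\!\big(-\tfrac{1}{\lambda}\, \hat{L}(\theta, z)\big)\, Q(d\theta),$
where $\hat{L}(\theta, z)$ is the empirical risk on $z$, $Q$ is a reference (prior) measure, and $\lambda > 0$ is a temperature parameter.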
Information-theoretic characterizations of generalization error for the Gibbs algorithm
Various approaches have been developed to upper bound the generalization error of a
supervised learning algorithm. However, existing bounds are often loose and even vacuous …
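As context for the bounds this line of work tightens (and not a result of the cited paper itself), the classical mutual-information bound of Xu and Raginsky states that if the loss is $\sigma$-sub-Gaussian for every model, then the expected generalization error of an algorithm with output $W$ trained on $n$ i.i.d. samples $S$ satisfies
$\big|\, \mathbb{E}\!\left[\mathrm{gen}(W, S)\right] \big| \;\le\; \sqrt{\tfrac{2\sigma^{2}\, I(W; S)}{n}}.$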
The generalization error of machine learning algorithms
In this paper, the method of gaps, a technique for deriving closed-form expressions, in terms of information measures, for the generalization error of machine learning algorithms, is …
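The quantity being characterized is the generalization error, i.e., the gap between population risk and empirical risk; in generic notation (the paper's own definitions may differ),
$\mathrm{gen}(\theta, z) \;=\; \mathbb{E}_{P_{Z}}\!\big[\ell(\theta, Z)\big] \;-\; \frac{1}{n}\sum_{i=1}^{n} \ell(\theta, z_{i}),$
typically studied in expectation over the algorithm's output and the training data.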
Empirical risk minimization with relative entropy regularization: Optimality and sensitivity analysis
The optimality and sensitivity of the empirical risk minimization problem with relative entropy
regularization (ERM-RER) are investigated for the case in which the reference is a σ-finite …
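A sketch of the ERM-RER problem and its well-known solution, assuming a $\sigma$-finite reference measure $Q$ and regularization parameter $\lambda > 0$ (notation may differ from the paper): the problem is
$\min_{P} \; \int \hat{L}(\theta, z)\, dP(\theta) \;+\; \lambda\, D(P \,\|\, Q),$
and, whenever the normalization integral below is finite, its unique solution is the Gibbs measure
$\frac{dP^{\star}}{dQ}(\theta) \;=\; \frac{\exp\!\big(-\tfrac{1}{\lambda}\, \hat{L}(\theta, z)\big)}{\int \exp\!\big(-\tfrac{1}{\lambda}\, \hat{L}(\nu, z)\big)\, dQ(\nu)}.$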
Empirical risk minimization with f-divergence regularization in statistical learning
This report presents the solution to the empirical risk minimization with $f$-divergence
regularization, under mild conditions on $f$. Under such conditions, the optimal measure is …
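A sketch of the general shape of such a solution, obtained from first-order stationarity of the Lagrangian (regularity conditions on $f$ are glossed over, and the paper's statement may differ): for
$\min_{P} \; \int \hat{L}\, dP \;+\; \lambda\, D_{f}(P \,\|\, Q), \qquad D_{f}(P \,\|\, Q) = \int f\!\Big(\tfrac{dP}{dQ}\Big)\, dQ,$
the optimal measure satisfies
$\frac{dP^{\star}}{dQ}(\theta) \;=\; \big(f'\big)^{-1}\!\Big(-\tfrac{\hat{L}(\theta, z) + \beta}{\lambda}\Big),$
with $\beta$ chosen so that $P^{\star}$ is a probability measure; for $f(x) = x \log x$ this recovers the Gibbs measure above.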
Asymmetry of the relative entropy in the regularization of empirical risk minimization
The effect of relative entropy asymmetry is analyzed in the context of empirical risk
minimization (ERM) with relative entropy regularization (ERM-RER). Two regularizations are …
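The two regularizations being contrasted are, paraphrasing in the notation above, a Type-I penalty $D(P \,\|\, Q)$ and a Type-II penalty $D(Q \,\|\, P)$:
$\min_{P} \int \hat{L}\, dP + \lambda\, D(P \,\|\, Q) \qquad \text{versus} \qquad \min_{P} \int \hat{L}\, dP + \lambda\, D(Q \,\|\, P).$
Because relative entropy is not symmetric, the two problems induce solutions of markedly different form.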
Empirical risk minimization with relative entropy regularization type-II
The effect of the relative entropy asymmetry is analyzed in the empirical risk minimization
with relative entropy regularization (ERM-RER) problem. A novel regularization is …
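A sketch of the solution obtained from first-order stationarity for the Type-II problem $\min_{P} \int \hat{L}\, dP + \lambda\, D(Q \,\|\, P)$ (regularity conditions omitted; the paper's exact characterization may differ):
$\frac{dP^{\star}}{dQ}(\theta) \;=\; \frac{\lambda}{\hat{L}(\theta, z) + \beta},$
where $\beta$ is a normalization constant chosen so that $P^{\star}$ is a probability measure, which in particular requires $\beta > -\inf_{\theta} \hat{L}(\theta, z)$.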