Diagnosis for multiple faults of chiller using ELM-KNN model enhanced by multi-label learning and specific feature combinations
P Li, Z Liu, B Anduv, X Zhu, X **, Z Du - Building and Environment, 2022 - Elsevier
Existing fault detection and diagnosis methods for chillers are usually very effective on
single-fault diagnosis, but perform poorly when diagnosing multiple faults, and these fault …
Dynamic k determination in k-NN classifier: A literature review
M Papanikolaou, G Evangelidis… - … & Applications (IISA), 2021 - ieeexplore.ieee.org
One of the widely used classification algorithms is k-Nearest Neighbours (k-NN). Its
popularity is mainly due to its simplicity, effectiveness, ease of implementation and ability to …
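For context, the plain k-NN rule that the entry above surveys (before any dynamic choice of k) can be sketched in a few lines of Python; the function name and toy data below are illustrative, not from any of the listed papers:

```python
from collections import Counter

def knn_predict(train, labels, query, k=3):
    """Classify `query` by majority vote among its k nearest training points.
    `train` is a list of feature tuples, `labels` the matching class labels."""
    # Rank training points by squared Euclidean distance to the query.
    order = sorted(
        range(len(train)),
        key=lambda i: sum((a - b) ** 2 for a, b in zip(train[i], query)),
    )
    # Majority vote over the k closest neighbours.
    votes = Counter(labels[i] for i in order[:k])
    return votes.most_common(1)[0][0]

# Toy example: two small clusters in 2-D.
X = [(0.0, 0.0), (0.1, 0.2), (0.9, 1.0), (1.0, 0.8)]
y = ["a", "a", "b", "b"]
print(knn_predict(X, y, (0.05, 0.1), k=3))  # prints "a"
```

The choice of a fixed k is exactly what the surveyed "dynamic k" approaches relax, adapting k per query point.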
Clustering evaluation in high-dimensional data
Clustering evaluation plays an important role in unsupervised learning systems, as it is often
necessary to automatically quantify the quality of generated cluster configurations. This is …
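One common internal measure for the kind of automatic cluster-quality scoring described above is the silhouette coefficient; a minimal pure-Python sketch (assuming every cluster has at least two points, and not tied to the paper's specific evaluation method) is:

```python
def silhouette(points, labels):
    """Mean silhouette coefficient over all points.
    For each point: a = mean distance to its own cluster,
    b = mean distance to the nearest other cluster, s = (b - a) / max(a, b)."""
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

    scores = []
    for i, p in enumerate(points):
        # Mean distance to other members of the same cluster.
        same = [dist(p, q) for j, q in enumerate(points)
                if j != i and labels[j] == labels[i]]
        a = sum(same) / len(same)
        # Smallest mean distance to any other cluster.
        b = min(
            sum(dist(p, q) for j, q in enumerate(points) if labels[j] == lab)
            / sum(1 for j in range(len(points)) if labels[j] == lab)
            for lab in set(labels) if lab != labels[i]
        )
        scores.append((b - a) / max(a, b))
    return sum(scores) / len(scores)

pts = [(0.0, 0.0), (0.0, 1.0), (10.0, 10.0), (10.0, 11.0)]
print(silhouette(pts, [0, 0, 1, 1]))  # well-separated clusters: close to 1
```

Scores near 1 indicate tight, well-separated clusters; mixing the two clusters in the labels drives the score negative.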
Class imbalance and the curse of minority hubs
Most machine learning tasks involve learning from high-dimensional data, which is often
quite difficult to handle. Hubness is an aspect of the curse of dimensionality that was shown …
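Hubness, as used in this line of work, is usually quantified via the k-occurrence N_k(x): how often point x appears among the k nearest neighbours of the other points. A minimal illustrative sketch (names and data are mine, not the paper's):

```python
def k_occurrences(points, k=2):
    """Return N_k for each point: the number of other points whose
    k nearest neighbours include it. Skewed N_k counts signal hubness."""
    n = len(points)
    counts = [0] * n
    for i in range(n):
        # Rank all other points by squared Euclidean distance to point i.
        others = sorted(
            (j for j in range(n) if j != i),
            key=lambda j: sum((a - b) ** 2 for a, b in zip(points[i], points[j])),
        )
        for j in others[:k]:
            counts[j] += 1
    return counts

# Three nearby points and one outlier: the outlier is never a neighbour.
print(k_occurrences([(0, 0), (1, 0), (0, 1), (5, 5)], k=2))  # prints [2, 3, 3, 0]
```

Points with unusually large N_k are hubs; points with N_k near zero are the "anti-hubs" that, per the abstract above, interact badly with minority classes under class imbalance.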
Feature and instance reduction for PNN classifiers based on fuzzy rough sets
ECC Tsang, Q Hu, D Chen - … Journal of Machine Learning and Cybernetics, 2016 - Springer
Instance reduction for K-nearest-neighbor classification rules (KNN) has attracted much
attention in recent years, and most of the existing approaches lose the semantics of probability …
Hubness-aware classification, instance selection and feature construction: Survey and extensions to time-series
Time-series classification is the common denominator in many real-world pattern recognition
tasks. In the last decade, the simple nearest neighbor classifier, in combination with dynamic …
GPU-SME-kNN: Scalable and memory efficient kNN and lazy learning using GPUs
The k nearest neighbor (kNN) rule is one of the most used techniques in data mining and
pattern recognition due to its simplicity and low identification error. However, the …
Hubness-aware kNN classification of high-dimensional data in presence of label noise
Learning with label noise is an important issue in classification, since it is not always
possible to obtain reliable data labels. In this paper we explore and evaluate a new …
High dimensional nearest neighbor classification based on mean absolute differences of inter-point distances
Traditional nearest neighbor classifiers based on usual distance functions (e.g., Euclidean
distance) often suffer in high dimension low sample size (HDLSS) situations, where …
Fine-grained document clustering via ranking and its application to social media analytics
Extracting valuable insights from a large volume of unstructured data such as texts through
clustering analysis is paramount to many big data applications. However, document …