Diagnosis for multiple faults of chiller using ELM-KNN model enhanced by multi-label learning and specific feature combinations

P Li, Z Liu, B Anduv, X Zhu, X **, Z Du - Building and Environment, 2022 - Elsevier
Existing fault detection and diagnosis methods for chillers are usually very effective for
single-fault diagnosis, but perform poorly when diagnosing multiple faults, and these fault …

Dynamic k determination in k-NN classifier: A literature review

M Papanikolaou, G Evangelidis… - … & Applications (IISA), 2021 - ieeexplore.ieee.org
One of the widely used classification algorithms is k-Nearest Neighbours (k-NN). Its
popularity is mainly due to its simplicity, effectiveness, ease of implementation and ability to …
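As a point of reference for the surveyed dynamic-k methods, the standard fixed-k rule they build on can be sketched as follows. This is a toy illustration (data, labels, and the Euclidean distance choice are assumptions, not from the paper); it shows how the prediction can flip as k grows, which is the sensitivity that motivates dynamic k determination:

```python
from collections import Counter

def knn_predict(train, labels, query, k):
    """Classify `query` by majority vote among its k nearest training
    points (squared Euclidean distance). Toy sketch of the plain k-NN
    rule with a fixed k; the surveyed methods choose k per query."""
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(p, query)), y)
        for p, y in zip(train, labels)
    )
    votes = Counter(y for _, y in dists[:k])
    return votes.most_common(1)[0][0]

train = [(0.0, 0.0), (0.1, 0.2), (1.0, 1.0), (0.9, 1.1), (1.1, 0.9)]
labels = ["a", "a", "b", "b", "b"]
print(knn_predict(train, labels, (0.2, 0.1), k=1))  # nearest point → "a"
print(knn_predict(train, labels, (0.2, 0.1), k=5))  # majority of all 5 → "b"
```

The same query is labelled "a" or "b" depending only on k, so a poorly chosen fixed k can dominate the classifier's behaviour.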

Clustering evaluation in high-dimensional data

N Tomašev, M Radovanović - Unsupervised learning algorithms, 2016 - Springer
Clustering evaluation plays an important role in unsupervised learning systems, as it is often
necessary to automatically quantify the quality of generated cluster configurations. This is …

Class imbalance and the curse of minority hubs

N Tomašev, D Mladenić - Knowledge-Based Systems, 2013 - Elsevier
Most machine learning tasks involve learning from high-dimensional data, which is often
quite difficult to handle. Hubness is an aspect of the curse of dimensionality that was shown …
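The hubness phenomenon the paper refers to can be made concrete by counting k-occurrences, i.e. how often each point appears among the k nearest neighbours of the others. A minimal sketch (toy data and squared-Euclidean distance are illustrative assumptions):

```python
def k_occurrences(points, k):
    """For each point, count how many other points have it among their k
    nearest neighbours (its k-occurrence). A heavily skewed distribution
    of these counts is 'hubness': a few points become neighbours of many,
    which distorts k-NN-based learning in high dimensions."""
    n = len(points)
    counts = [0] * n
    for i in range(n):
        dists = sorted(
            (sum((a - b) ** 2 for a, b in zip(points[i], points[j])), j)
            for j in range(n) if j != i
        )
        for _, j in dists[:k]:
            counts[j] += 1
    return counts

# The central point is everyone's nearest neighbour, so it becomes a hub.
pts = [(0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)]
print(k_occurrences(pts, 1))  # index 0 accumulates a count of 4
```

When hubs carry the wrong (e.g. minority-class) label, their many neighbourhood appearances propagate that label widely, which is the interaction with class imbalance studied here.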

Feature and instance reduction for PNN classifiers based on fuzzy rough sets

ECC Tsang, Q Hu, D Chen - … Journal of Machine Learning and Cybernetics, 2016 - Springer
Instance reduction for K-nearest-neighbor classification rules (KNN) has attracted much
attention in recent years, and most of the existing approaches lose the semantics of probability …

Hubness-aware classification, instance selection and feature construction: Survey and extensions to time-series

N Tomašev, K Buza, K Marussy, PB Kis - Feature selection for data and …, 2015 - Springer
Time-series classification is the common denominator in many real-world pattern recognition
tasks. In the last decade, the simple nearest neighbor classifier, in combination with dynamic …

GPU-SME-kNN: Scalable and memory efficient kNN and lazy learning using GPUs

PD Gutiérrez, M Lastra, J Bacardit, JM Benítez… - Information …, 2016 - Elsevier
The k nearest neighbor (kNN) rule is one of the most used techniques in data mining and
pattern recognition due to its simplicity and low identification error. However, the …

Hubness-aware kNN classification of high-dimensional data in presence of label noise

N Tomašev, K Buza - Neurocomputing, 2015 - Elsevier
Learning with label noise is an important issue in classification, since it is not always
possible to obtain reliable data labels. In this paper we explore and evaluate a new …

High dimensional nearest neighbor classification based on mean absolute differences of inter-point distances

AK Pal, PK Mondal, AK Ghosh - Pattern Recognition Letters, 2016 - Elsevier
Traditional nearest neighbor classifiers based on usual distance functions (e.g., Euclidean
distance) often suffer in high dimension low sample size (HDLSS) situations, where …

Fine-grained document clustering via ranking and its application to social media analytics

T Sutanto, R Nayak - Social Network Analysis and Mining, 2018 - Springer
Extracting valuable insights from a large volume of unstructured data such as texts through
clustering analysis is paramount to many big data applications. However, document …