On inter-dataset code duplication and data leakage in large language models
Motivation. Large language models (LLMs) have exhibited remarkable proficiency in diverse
software engineering (SE) tasks, such as code summarization, code translation, and code …

A class-incremental learning approach for learning feature-compatible embeddings
Humans have the ability to constantly learn new knowledge. However, for artificial
intelligence, trying to continuously learn new knowledge usually results in catastrophic …

Comprehensive Sensitivity Analysis Framework for Transfer Learning Performance Assessment for Time Series Forecasting: Basic Concepts and Selected Case …
Recently, transfer learning has gained popularity in the machine learning community.
Transfer Learning (TL) has emerged as a promising paradigm that leverages knowledge …

A dynamic routing CapsNet based on increment prototype clustering for overcoming catastrophic forgetting
M Wang, Z Guo, H Li - IET Computer Vision, 2022 - Wiley Online Library
In continual learning, previously learnt knowledge tends to be overwritten by the
subsequent training tasks. This bottleneck, known as catastrophic forgetting, has recently …

Incremental Domain Learning for Surface Quality Inspection of Automotive High Voltage Battery
M Shirazi, G Safronov, A Risk - 2023 International Conference …, 2023 - ieeexplore.ieee.org
Catastrophic forgetting refers to a neural network's detrimental loss of previously learned
information upon acquiring new knowledge. Recent continual learning methodologies grant …

Multi-task continuous learning model
Z Guo, M Wang - Journal of Physics: Conference Series, 2021 - iopscience.iop.org
In continual learning, previously learned knowledge tends to be overwritten by the
subsequent training tasks. This bottleneck, known as catastrophic forgetting (CF), has …

A quantitative analysis of how the Variational Continual Learning method handles catastrophic forgetting
C Larsen, E Ryman - 2020 - diva-portal.org
Catastrophic forgetting is a problem that occurs when an artificial neural network in the
continual learning setting replaces historic information as additional information is acquired …