Data-efficient Fine-tuning for LLM-based Recommendation
Leveraging Large Language Models (LLMs) for recommendation has recently garnered
considerable attention, where fine-tuning plays a key role in LLMs' adaptation. However, the …
IDEAL: Influence-driven selective annotations empower in-context learners in large language models
In-context learning is a promising paradigm that utilizes in-context examples as prompts for
the predictions of large language models. These prompts are crucial for achieving strong …
Spanning training progress: Temporal dual-depth scoring (TDDS) for enhanced dataset pruning
Dataset pruning aims to construct a coreset capable of achieving performance comparable
to the original full dataset. Most existing dataset pruning methods rely on snapshot-based …
Refined Coreset Selection: Towards Minimal Coreset Size under Model Performance Constraints
Coreset selection is powerful in reducing computational costs and accelerating data
processing for deep learning algorithms. It strives to identify a small subset from large-scale …
Coreset selection with prioritized multiple objectives
Coreset selection is powerful in reducing computational costs and accelerating data
processing for deep learning algorithms. It strives to identify a small subset from large-scale …
Mind the Boundary: Coreset Selection via Reconstructing the Decision Boundary
Existing paradigms of pushing the state of the art require exponentially more training data in
many fields. Coreset selection seeks to mitigate this growing demand by identifying the most …
Effective pruning of web-scale datasets based on complexity of concept clusters
Utilizing massive web-scale datasets has led to unprecedented performance gains in
machine learning models, but also imposes outlandish compute requirements for their …
Efficient architecture search via bi-level data pruning
Improving the efficiency of Neural Architecture Search (NAS) is a challenging but significant
task that has received much attention. Previous studies mainly adopt the Differentiable …
Are Sparse Neural Networks Better Hard Sample Learners?
While deep learning has demonstrated impressive progress, it remains a daunting
challenge to learn from hard samples as these samples are usually noisy and intricate …
Dynamic data pruning for automatic speech recognition
The recent success of Automatic Speech Recognition (ASR) is largely attributed to the ever-
growing amount of training data. However, this trend has made model training prohibitively …