Parameter-efficient fine-tuning methods for pretrained language models: A critical review and assessment
With the continuous growth in the number of parameters of transformer-based pretrained
language models (PLMs), particularly the emergence of large language models (LLMs) with …
A survey on deep neural network pruning: Taxonomy, comparison, analysis, and recommendations
Modern deep neural networks, particularly recent large language models, come with
massive model sizes that require significant computational and storage resources. To …
ResMLP: Feedforward networks for image classification with data-efficient training
We present ResMLP, an architecture built entirely upon multi-layer perceptrons for image
classification. It is a simple residual network that alternates (i) a linear layer in which image …
More ConvNets in the 2020s: Scaling up kernels beyond 51x51 using sparsity
Transformers have quickly shined in the computer vision world since the emergence of
Vision Transformers (ViTs). The dominant role of convolutional neural networks (CNNs) …
Machine learning in aerodynamic shape optimization
Machine learning (ML) has been increasingly used to aid aerodynamic shape
optimization (ASO), thanks to the availability of aerodynamic data and continued …
Sparsity in deep learning: Pruning and growth for efficient inference and training in neural networks
The growing energy and performance costs of deep learning have driven the community to
reduce the size of neural networks by selectively pruning components. Similarly to their …
Carbon emission prediction models: A review
Y **, A Sharifi, Z Li, S Chen, S Zeng, S Zhao - Science of The Total …, 2024 - Elsevier
Amidst growing concerns over the greenhouse effect, especially its consequential impacts,
establishing effective Carbon Emission Prediction Models (CEPMs) to comprehend and …
Embracing change: Continual learning in deep neural networks
Artificial intelligence research has seen enormous progress over the past few decades, but it
predominantly relies on fixed datasets and stationary environments. Continual learning is an …
Pruning neural networks without any data by iteratively conserving synaptic flow
Pruning the parameters of deep neural networks has generated intense interest due to
potential savings in time, memory and energy both during training and at test time. Recent …
A survey on efficient inference for large language models
Large Language Models (LLMs) have attracted extensive attention due to their remarkable
performance across various tasks. However, the substantial computational and memory …