From word embeddings to pre-trained language models: A state-of-the-art walkthrough
M Mars - Applied Sciences, 2022 - mdpi.com
With the recent advances in deep learning, different approaches to improving pre-trained
language models (PLMs) have been proposed. PLMs have advanced state-of-the-art …
A survey on transformer compression
Transformer plays a vital role in the realms of natural language processing (NLP) and
computer vision (CV), especially for constructing large language models (LLMs) and large …
KronA: Parameter-efficient tuning with Kronecker adapter
Fine-tuning a Pre-trained Language Model (PLM) on a specific downstream task has been a
well-known paradigm in Natural Language Processing. However, with the ever-growing size …
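For context on the Kronecker-adapter idea: instead of a low-rank product, the trainable update to a frozen weight is a Kronecker product of two small factors, which can reach full rank with very few parameters. Below is a minimal PyTorch sketch; the factor shapes, the zero initialization of A, and the scale hyperparameter are illustrative assumptions, not the paper's exact recipe.

```python
import torch
import torch.nn as nn

class KroneckerAdapterLinear(nn.Module):
    """A frozen linear layer plus a trainable Kronecker-product update:
    W_eff = W + scale * kron(A, B)."""
    def __init__(self, base: nn.Linear, a_out: int, a_in: int, scale: float = 1.0):
        super().__init__()
        d_out, d_in = base.weight.shape
        assert d_out % a_out == 0 and d_in % a_in == 0
        self.base = base
        for p in self.base.parameters():        # freeze the pretrained weights
            p.requires_grad_(False)
        # Trainable factors; kron(A, B) has exactly W's shape.
        self.A = nn.Parameter(torch.zeros(a_out, a_in))   # zero-init: start at W
        self.B = nn.Parameter(torch.randn(d_out // a_out, d_in // a_in) * 0.02)
        self.scale = scale

    def forward(self, x):
        delta = torch.kron(self.A, self.B)      # (d_out, d_in) weight update
        return self.base(x) + self.scale * (x @ delta.T)

layer = KroneckerAdapterLinear(nn.Linear(768, 768), a_out=16, a_in=16)
y = layer(torch.randn(4, 768))                  # only A and B receive gradients
```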
Beyond efficiency: A systematic survey of resource-efficient large language models
The burgeoning field of Large Language Models (LLMs), exemplified by sophisticated
models like OpenAI's ChatGPT, represents a significant advancement in artificial …
LUT-GEMM: Quantized matrix multiplication based on LUTs for efficient inference in large-scale generative language models
Recent advances in self-supervised learning and the Transformer architecture have
significantly improved natural language processing (NLP), achieving remarkably low …
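The core trick suggested by the title can be shown with a toy binary-coding quantization (BCQ) kernel: the weight matrix is approximated as a sum of scaled {-1, +1} matrices, and the dot products of each activation sub-vector against all 2^μ sign patterns are precomputed into a lookup table shared by every output row. The NumPy sketch below is purely illustrative (group size, bit count, and all names are assumptions); the actual kernel fuses these lookups in GPU code.

```python
import numpy as np

MU = 4  # sub-vector length; each group of MU activations shares a 2**MU-entry table

def lut_gemv(binary_w, scales, x):
    """y = sum_k scales[:, k] * (binary_w[k] @ x), binary_w[k] in {-1, +1},
    computed with per-group lookup tables instead of full dot products."""
    n_bits, d_out, d_in = binary_w.shape
    groups = d_in // MU
    # Partial sums of every MU-long activation group under all 2**MU sign
    # patterns; the table is shared across output rows and bit planes.
    patterns = np.array([[1.0 if (p >> i) & 1 else -1.0 for i in range(MU)]
                         for p in range(2 ** MU)])            # (16, MU)
    lut = patterns @ x.reshape(groups, MU).T                  # (16, groups)
    y = np.zeros(d_out)
    for k in range(n_bits):
        # Encode each row's {-1,+1} group as a table index, then look up.
        bits01 = (binary_w[k].reshape(d_out, groups, MU) > 0).astype(int)
        idx = (bits01 * (1 << np.arange(MU))).sum(axis=-1)    # (d_out, groups)
        partial = lut[idx, np.arange(groups)].sum(axis=-1)    # (d_out,)
        y += scales[:, k] * partial
    return y

# Tiny check against the direct computation (shapes are illustrative).
rng = np.random.default_rng(0)
bw = rng.choice([-1.0, 1.0], size=(3, 8, 16))   # 3-bit BCQ of an 8x16 weight
sc = rng.random((8, 3)); x = rng.random(16)
direct = sum(sc[:, k] * (bw[k] @ x) for k in range(3))
assert np.allclose(lut_gemv(bw, sc, x), direct)
```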
Information retrieval meets large language models: A strategic report from Chinese IR community
The research field of Information Retrieval (IR) has evolved significantly, expanding beyond
traditional search to meet diverse user information needs. Recently, Large Language …
Parameter-efficient model adaptation for vision transformers
In computer vision, great transfer learning performance has been achieved by adapting large-scale pretrained vision models (e.g., vision transformers) to downstream tasks. Common …
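One common flavor of parameter-efficient adaptation in this line of work inserts small trainable bottleneck modules into a frozen backbone. A minimal sketch follows, assuming a generic transformer block and an arbitrary bottleneck width; this is not the specific method of the paper.

```python
import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Bottleneck adapter: down-project, nonlinearity, up-project, residual add."""
    def __init__(self, dim: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)
        self.up = nn.Linear(bottleneck, dim)
        nn.init.zeros_(self.up.weight)   # zero-init: adapter starts as identity
        nn.init.zeros_(self.up.bias)

    def forward(self, x):
        return x + self.up(torch.relu(self.down(x)))

class AdaptedBlock(nn.Module):
    """Frozen pretrained block followed by a small trainable adapter."""
    def __init__(self, block: nn.Module, dim: int):
        super().__init__()
        for p in block.parameters():
            p.requires_grad_(False)      # the backbone stays fixed
        self.block, self.adapter = block, Adapter(dim)

    def forward(self, x):
        return self.adapter(self.block(x))
```

With dim=768 and bottleneck=64 the adapter adds roughly 0.1M parameters per block, a small fraction of the ~7M in a ViT-Base block.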
Compression of generative pre-trained language models via quantization
The increasing size of generative Pre-trained Language Models (PLMs) has greatly
increased the demand for model compression. Despite various methods to compress BERT …
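For a mechanical picture of what quantizing a PLM involves, here is a generic sketch of symmetric per-channel weight quantization, the basic round-and-rescale step; it does not model the token-aware scheme this particular paper proposes.

```python
import torch

def quantize_per_channel(w: torch.Tensor, n_bits: int = 8):
    """Map each output channel (row) of w to signed integers with its own scale."""
    qmax = 2 ** (n_bits - 1) - 1
    scale = w.abs().amax(dim=1, keepdim=True) / qmax   # one scale per row
    q = torch.clamp(torch.round(w / scale), min=-qmax - 1, max=qmax)
    return q.to(torch.int8), scale

def dequantize(q: torch.Tensor, scale: torch.Tensor) -> torch.Tensor:
    return q.float() * scale

w = torch.randn(768, 3072)                 # e.g. one FFN weight matrix
q, s = quantize_per_channel(w)
err = (w - dequantize(q, s)).abs().max()   # per-entry error is bounded by scale/2
```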
A survey on model compression and acceleration for pretrained language models
Despite achieving state-of-the-art performance on many NLP tasks, the high energy cost and
long inference delay prevent Transformer-based pretrained language models (PLMs) from …
What matters in the structured pruning of generative language models?
Auto-regressive large language models such as GPT-3 require enormous computational
resources to use. Traditionally, structured pruning methods are employed to reduce …
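As a concrete instance of structured pruning, the sketch below drops whole hidden units of an MLP by weight-row magnitude, leaving smaller dense matrices that need no sparse kernels; the criterion and keep ratio are illustrative assumptions, and the paper's question is precisely which such choices matter.

```python
import torch
import torch.nn as nn

def prune_mlp_neurons(fc1: nn.Linear, fc2: nn.Linear, keep_ratio: float = 0.5):
    """Return smaller layers that keep only the highest-norm hidden units."""
    scores = fc1.weight.norm(dim=1)                 # one score per hidden unit
    k = max(1, int(keep_ratio * scores.numel()))
    keep = scores.topk(k).indices.sort().values     # indices of surviving units
    new1 = nn.Linear(fc1.in_features, k)
    new2 = nn.Linear(k, fc2.out_features)
    with torch.no_grad():
        new1.weight.copy_(fc1.weight[keep]); new1.bias.copy_(fc1.bias[keep])
        new2.weight.copy_(fc2.weight[:, keep]); new2.bias.copy_(fc2.bias)
    return new1, new2

fc1, fc2 = nn.Linear(768, 3072), nn.Linear(3072, 768)
p1, p2 = prune_mlp_neurons(fc1, fc2)   # dense, hardware-friendly smaller layers
```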