Document-level machine translation with large language models
Large language models (LLMs) such as ChatGPT can produce coherent, cohesive, relevant,
and fluent answers for various natural language processing (NLP) tasks. Taking document …
Incremental transformer structure enhanced image inpainting with masking positional encoding
Image inpainting has made significant advances in recent years. However, it is still
challenging to recover corrupted images with both vivid textures and reasonable structures …
Break the sequential dependency of LLM inference using lookahead decoding
Autoregressive decoding of large language models (LLMs) is memory bandwidth bounded,
resulting in high latency and significant waste of the parallel processing power of modern …
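For context, a minimal sketch of the plain autoregressive loop whose sequential dependency this paper targets, assuming a Hugging Face-style causal LM interface; the lookahead mechanism itself (parallel n-gram generation and verification) is not reproduced here, and the function and argument names are illustrative.

```python
import torch

@torch.no_grad()
def greedy_decode(model, input_ids, max_new_tokens=32):
    """Plain autoregressive decoding: one full forward pass per generated token."""
    for _ in range(max_new_tokens):
        logits = model(input_ids).logits                         # memory-bandwidth-bound step
        next_token = logits[:, -1, :].argmax(dim=-1, keepdim=True)
        input_ids = torch.cat([input_ids, next_token], dim=-1)   # token t+1 depends on token t
    return input_ids
```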
On the effectiveness of adapter-based tuning for pretrained language model adaptation
Adapter-based tuning has recently arisen as an alternative to fine-tuning. It works by adding
light-weight adapter modules to a pretrained language model (PrLM) and only updating the …
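A minimal PyTorch sketch of the idea, assuming a bottleneck-style adapter; the class name, dimensions, and freezing helper below are illustrative, not taken from the paper.

```python
import torch
import torch.nn as nn

class AdapterBlock(nn.Module):
    """Bottleneck adapter: down-project, non-linearity, up-project, residual add."""
    def __init__(self, hidden_dim: int, bottleneck_dim: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.up = nn.Linear(bottleneck_dim, hidden_dim)
        self.act = nn.ReLU()

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # Residual connection keeps the pretrained representation intact.
        return hidden_states + self.up(self.act(self.down(hidden_states)))

def freeze_backbone_except_adapters(model: nn.Module) -> None:
    # Only parameters whose names contain "adapter" receive gradient updates.
    for name, param in model.named_parameters():
        param.requires_grad = "adapter" in name.lower()
```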
A survey on non-autoregressive generation for neural machine translation and beyond
Non-autoregressive (NAR) generation, which was first proposed in neural machine translation
(NMT) to speed up inference, has attracted much attention in both machine learning and …
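A toy sketch of the decoding pattern that distinguishes NAR from autoregressive generation: all target positions are predicted in a single forward pass. The decoder interface below is an illustrative placeholder, not code from the survey.

```python
import torch
import torch.nn as nn

def nar_decode(decoder: nn.Module, encoder_out: torch.Tensor, target_length: int) -> torch.Tensor:
    """Non-autoregressive decoding: emit every target token in one parallel step."""
    positions = torch.arange(target_length).unsqueeze(0)   # (1, T) target positions
    logits = decoder(encoder_out, positions)                # (1, T, vocab) in one pass
    return logits.argmax(dim=-1)                            # no token-by-token dependency
```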
Diffusion language models are versatile protein learners
This paper introduces diffusion protein language model (DPLM), a versatile protein
language model that demonstrates strong generative and predictive capabilities for protein …
UFC-BERT: Unifying multi-modal controls for conditional image synthesis
Conditional image synthesis aims to create an image according to some multi-modal
guidance in the forms of textual descriptions, reference images, and image blocks to …
How to design translation prompts for ChatGPT: An empirical study
ChatGPT, a chatbot based on the GPT models, has demonstrated surprising abilities in
natural language understanding and generation tasks. Given that machine translation …
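As a reference point, a minimal sketch of the kind of translation prompt template such studies compare; the wording and example sentence are illustrative, not a template taken from the paper.

```python
def build_translation_prompt(src_lang: str, tgt_lang: str, sentence: str) -> str:
    # Simple instruction-style template; empirical studies typically vary this wording.
    return f"Please translate the following {src_lang} sentence into {tgt_lang}:\n{sentence}"

print(build_translation_prompt("Chinese", "English", "今天天气很好。"))
```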
Fast nearest neighbor machine translation
Though nearest neighbor Machine Translation ($k$NN-MT) \citep{khandelwal2020nearest} has proved to introduce significant performance boosts over standard neural MT systems, it …
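For reference, a minimal NumPy sketch of the standard $k$NN-MT interpolation step from Khandelwal et al. (2020) that this line of work speeds up; the lambda and temperature values below are illustrative placeholders.

```python
import numpy as np

def knn_interpolate(mt_probs, neighbor_dists, neighbor_tokens, vocab_size, lam=0.5, temperature=10.0):
    """Blend the base MT distribution with a distribution over retrieved neighbor tokens."""
    # Turn neighbor distances into weights over their target tokens.
    weights = np.exp(-np.asarray(neighbor_dists, dtype=float) / temperature)
    weights /= weights.sum()
    knn_probs = np.zeros(vocab_size)
    for tok, w in zip(neighbor_tokens, weights):
        knn_probs[tok] += w
    # p(y | x) = lam * p_kNN + (1 - lam) * p_MT
    return lam * knn_probs + (1.0 - lam) * np.asarray(mt_probs, dtype=float)
```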
MSP: Multi-stage prompting for making pre-trained language models better translators
Prompting has recently been shown to be a promising approach for applying pre-trained
language models to perform downstream tasks. We present Multi-Stage Prompting (MSP), a …