Dataset distillation: A comprehensive review
Recent success of deep learning is largely attributed to the sheer amount of data used for
training deep neural networks. Despite the unprecedented success, the massive data …
Current and emerging trends in medical image segmentation with deep learning
In recent years, the segmentation of anatomical or pathological structures using deep
learning has attracted widespread interest in medical image analysis. Remarkably …
Knowledge distillation: A survey
In recent years, deep neural networks have been successful in both industry and academia,
especially for computer vision tasks. The great success of deep learning is mainly due to its …
Efficient medical image segmentation based on knowledge distillation
Recent advances have been made in applying convolutional neural networks to achieve
more precise prediction results for medical image segmentation problems. However, the …
A sentence speaks a thousand images: Domain generalization through distilling clip with language guidance
Abstract Domain generalization studies the problem of training a model with samples from
several domains (or distributions) and then testing the model with samples from a new …
Visual tuning
Fine-tuning visual models has been widely shown to yield promising performance on many
downstream visual tasks. With the rapid development of pre-trained visual foundation …
Block selection method for using feature norm in out-of-distribution detection
Detecting out-of-distribution (OOD) inputs during the inference stage is crucial for deploying
neural networks in the real world. Previous methods commonly relied on the output of a …
Better generative replay for continual federated learning
Federated learning is a technique that enables a centralized server to learn from distributed
clients via communication without accessing the clients' local data. However, existing …
Computation-efficient deep learning for computer vision: A survey
Over the past decade, deep learning models have exhibited considerable advancements,
reaching or even exceeding human-level performance in a range of visual perception tasks …
Collaborative knowledge distillation via multiknowledge transfer
Knowledge distillation (KD), as an efficient and effective model compression technique, has
received considerable attention in deep learning. The key to its success lies in transferring …