Cross-View Consistency Regularisation for Knowledge Distillation
Knowledge distillation (KD) is an established paradigm for transferring privileged knowledge
from a cumbersome model to a more lightweight and efficient one. In recent years, logit …
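The snippet above is truncated, but as general background, logit-based KD usually minimises a temperature-softened KL divergence between teacher and student outputs. Below is a minimal sketch of that standard loss, not the cross-view regularisation method of this paper; the function name, the use of PyTorch, and the temperature value are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def logit_distillation_loss(student_logits: torch.Tensor,
                            teacher_logits: torch.Tensor,
                            temperature: float = 4.0) -> torch.Tensor:
    """Classic logit distillation: KL divergence between temperature-softened
    teacher and student class distributions (Hinton-style KD)."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=-1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    # Scale by T^2 so gradient magnitudes stay comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2
```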
Semantic Distillation from Neighborhood for Composed Image Retrieval
Y Wang, W Huang, L Li, C Yuan - Proceedings of the 32nd ACM …, 2024 - dl.acm.org
The challenging task of composed image retrieval targets identifying the matched image
from a multi-modal query consisting of a reference image and a textual modifier. Most existing …
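For readers unfamiliar with the task setup, a composed-image-retrieval query fuses the reference-image and modifier-text embeddings and ranks gallery images against the fused vector. The sketch below shows only a generic additive late-fusion baseline, not the paper's neighborhood-based semantic distillation; the function names and the fusion choice are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def compose_query(ref_image_emb: torch.Tensor, text_emb: torch.Tensor) -> torch.Tensor:
    """Fuse reference-image and modifier-text embeddings into one query vector
    (simple additive fusion; real CIR models typically learn this step)."""
    return F.normalize(ref_image_emb + text_emb, dim=-1)

def retrieve(query: torch.Tensor, gallery: torch.Tensor, k: int = 5) -> torch.Tensor:
    """Rank gallery image embeddings by cosine similarity to the composed query."""
    scores = F.normalize(gallery, dim=-1) @ query  # (num_gallery,)
    return scores.topk(k).indices
```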
D3-YOLOv10: Improved YOLOv10-Based Lightweight Tomato Detection Algorithm Under Facility Scenario
A Li, C Wang, T Ji, Q Wang, T Zhang - Agriculture, 2024 - mdpi.com
Accurate and efficient tomato detection is one of the key techniques for intelligent automatic
picking in the area of precision agriculture. However, under the facility scenario, existing …