Sparse invariant risk minimization

X Zhou, Y Lin, W Zhang… - … Conference on Machine …, 2022 - proceedings.mlr.press
Invariant Risk Minimization (IRM) is an emerging technique for extracting invariant
features to help generalization under distributional shift. However, we find that there exists a …

DomainDrop: Suppressing domain-sensitive channels for domain generalization

J Guo, L Qi, Y Shi - Proceedings of the IEEE/CVF …, 2023 - openaccess.thecvf.com
Deep Neural Networks have exhibited considerable success in various visual tasks.
However, when applied to unseen test datasets, state-of-the-art models often suffer …

Model agnostic sample reweighting for out-of-distribution learning

X Zhou, Y Lin, R Pi, W Zhang, R Xu… - International …, 2022 - proceedings.mlr.press
Distributionally robust optimization (DRO) and invariant risk minimization (IRM) are two
popular methods proposed to improve out-of-distribution (OOD) generalization performance …

Sparse mixture-of-experts are domain generalizable learners

B Li, Y Shen, J Yang, Y Wang, J Ren, T Che… - arXiv preprint arXiv …, 2022 - arxiv.org
Domain generalization (DG) aims at learning generalizable models under distribution shifts
to avoid redundantly overfitting massive training data. Previous works with complex loss …

Modular design automation of the morphologies, controllers, and vision systems for intelligent robots: a survey

W Li, Z Wang, R Mai, P Ren, Q Zhang, Y Zhou, N Xu… - Visual Intelligence, 2023 - Springer
Design automation is a core technology in industrial design software and an
important branch of knowledge-worker automation. For example, electronic design …

Graph neural architecture search under distribution shifts

Y Qin, X Wang, Z Zhang, P Xie… - … Conference on Machine …, 2022 - proceedings.mlr.press
Graph neural architecture search has shown great potential for automatically designing
graph neural network (GNN) architectures for graph classification tasks. However, when …

Explore and exploit the diverse knowledge in model zoo for domain generalization

Y Chen, T Hu, F Zhou, Z Li… - … Conference on Machine …, 2023 - proceedings.mlr.press
The proliferation of pretrained models, as a result of advancements in pretraining
techniques, has led to the emergence of a vast zoo of publicly available models. Effectively …

ZooD: Exploiting model zoo for out-of-distribution generalization

Q Dong, A Muhammad, F Zhou, C Xie… - Advances in …, 2022 - proceedings.neurips.cc
Recent advances in large-scale pre-training have shown great potential for leveraging a
large set of Pre-Trained Models (PTMs) for improving Out-of-Distribution (OoD) …

Feature-based style randomization for domain generalization

Y Wang, L Qi, Y Shi, Y Gao - … on Circuits and Systems for Video …, 2022 - ieeexplore.ieee.org
As a recently prominent topic, domain generalization (DG) aims to first learn a generic model
on multiple source domains and then directly generalize to an arbitrary unseen target …

Context-aware robust fine-tuning

X Mao, Y Chen, X Jia, R Zhang, H Xue, Z Li - International Journal of …, 2024 - Springer
Contrastive language-image pre-trained (CLIP) models have the zero-shot ability to classify
an image belonging to “[CLASS]” by using similarity between the image and the prompt …