Articles with public access mandates - Dilin Wang
Available somewhere: 13
Stein variational gradient descent: A general purpose Bayesian inference algorithm
Q Liu, D Wang
Advances in Neural Information Processing Systems (NeurIPS), 2378-2386, 2016
Mandates: US National Science Foundation
KeepAugment: A Simple Information-Preserving Data Augmentation Approach
C Gong, D Wang, M Li, V Chandra, Q Liu
Conference on Computer Vision and Pattern Recognition (CVPR), 2021
Mandates: US National Science Foundation
Improving neural language modeling via adversarial training
D Wang, C Gong, Q Liu
Proceedings of the 36th International Conference on Machine Learning (ICML), 2019
Mandates: US National Science Foundation
NASViT: Neural architecture search for efficient vision transformers with gradient conflict-aware supernet training
C Gong, D Wang
International Conference on Learning Representations (ICLR), 2022
Mandates: US National Science Foundation
Stein variational gradient descent with matrix-valued kernels
D Wang, Z Tang, C Bajaj, Q Liu
Advances in Neural Information Processing Systems (NeurIPS), 7836-7846, 2019
Mandates: US National Science Foundation, US National Institutes of Health
Variational inference with tail-adaptive f-divergence
D Wang, H Liu, Q Liu
Advances in Neural Information Processing Systems (NeurIPS), 5737-5747, 2018
Mandates: US National Science Foundation
AlphaMatch: Improving Consistency for Semi-supervised Learning with Alpha-divergence
C Gong, D Wang, Q Liu
Conference on Computer Vision and Pattern Recognition (CVPR), 2021
Mandates: US National Science Foundation
Splitting steepest descent for growing neural architectures
Q Liu, L Wu, D Wang
Advances in Neural Information Processing Systems (NeurIPS), 10656-10666, 2019
Mandates: US National Science Foundation
Stein Variational Message Passing for Continuous Graphical Models
D Wang, Z Zeng, Q Liu
International Conference on Machine Learning (ICML), 2018
Mandates: US National Science Foundation
Nonlinear Stein variational gradient descent for learning diversified mixture models
D Wang, Q Liu
International Conference on Machine Learning (ICML), 6576-6585, 2019
Mandates: US National Science Foundation
Energy-aware neural architecture optimization with fast splitting steepest descent
D Wang, M Li, L Wu, V Chandra, Q Liu
arXiv preprint arXiv:1910.03103, 2019
Mandates: US National Science Foundation
Efficient Observation Selection in Probabilistic Graphical Models Using Bayesian Lower Bounds
D Wang, JW Fisher III, Q Liu
Conference on Uncertainty in Artificial Intelligence (UAI), 2016
Mandates: US National Science Foundation
Sparse Cocktail: Every Sparse Pattern Every Sparse Ratio All At Once
Z Li, S Liu, T Chen, AK Jaiswal, Z Zhang, D Wang, ...
Forty-first International Conference on Machine Learning (ICML), 2024
Mandates: US National Science Foundation
Publication and funding information is determined automatically by a computer program