Haoran Geng
PhD Student at UC Berkeley
Verified email at berkeley.edu - Homepage
Title | Cited by | Year
UniDexGrasp: Universal robotic dexterous grasping via learning diverse proposal generation and goal-conditioned policy
Y Xu, W Wan, J Zhang, H Liu, Z Shan, H Shen, R Wang, H Geng, Y Weng, ...
CVPR 2023, 2023
95 | 2023
GAPartNet: Cross-Category Domain-Generalizable Object Perception and Manipulation via Generalizable and Actionable Parts
H Geng, H Xu, C Zhao, C Xu, L Yi, S Huang, H Wang
CVPR 2023 Highlight, 2022
79 | 2022
UniDexGrasp++: Improving Dexterous Grasping Policy Learning via Geometry-aware Curriculum and Iterative Generalist-Specialist Learning
W Wan*, H Geng*, Y Liu, Z Shan, Y Yang, L Yi, H Wang
ICCV 2023 Oral & Best Paper Award Finalist, 2023
73 | 2023
RLAfford: End-to-End Affordance Learning for Robotic Manipulation
Y Geng, B An, H Geng, Y Chen, Y Yang, H Dong
ICRA 2023, 2022
64 | 2022
ShapeLLM: Universal 3D Object Understanding for Embodied Interaction
Z Qi, R Dong, S Zhang, H Geng, C Han, Z Ge, L Yi, K Ma
ECCV 2024, 2025
44 | 2025
ARNOLD: A Benchmark for Language-Grounded Task Learning With Continuous States in Realistic 3D Scenes
Ran Gong, Jiangyong Huang, Yizhou Zhao, Haoran Geng, Xiaofeng Gao, Qingyang ...
ICCV 2023, 2023
41* | 2023
ManipLLM: Embodied Multimodal Large Language Model for Object-Centric Robotic Manipulation
X Li, M Zhang, Y Geng, H Geng, Y Long, Y Shen, R Zhang, J Liu, H Dong
CVPR 2024, 2023
40 | 2023
PartManip: Learning Cross-Category Generalizable Part Manipulation Policy from Point Cloud Observations
H Geng, Z Li, Y Geng, J Chen, H Dong, H Wang
CVPR 2023, 2023
35 | 2023
SAGE: Bridging Semantic and Actionable parts for GEneralizable articulated-object manipulation under language instructions
H Geng, S Wei, C Deng, B Shen, H Wang, L Guibas
RSS 2024, 2023
16* | 2023
RAM: Retrieval-Based Affordance Transfer for Generalizable Zero-Shot Robotic Manipulation
Y Kuang*, J Ye*, H Geng*, J Mao, C Deng, L Guibas, H Wang, Y Wang
CoRL 2024, Oral Presentation, 2024
13 | 2024
Ag2Manip: Learning Novel Manipulation Skills with Agent-Agnostic Visual and Action Representations
P Li, T Liu, Y Li, M Han, H Geng, S Wang, Y Zhu, SC Zhu, S Huang
IROS 2024, 2024
9 | 2024
Open6DOR: Benchmarking Open-Instruction 6-DoF Object Rearrangement and a VLM-Based Approach
Y Ding, H Geng, C Xu, X Fang, J Zhang, S Wei, Q Dai, Z Zhang, H Wang
IROS 2024, 2024
6 | 2024
Make a Donut: Language-Guided Hierarchical EMD-Space Planning for Zero-shot Deformable Object Manipulation
Y You, B Shen, C Deng, H Geng, H Wang, L Guibas
IEEE Robotics and Automation Letters (RA-L), 2023
3 | 2023
DexGraspNet 2.0: Learning Generative Dexterous Grasping in Large-scale Synthetic Cluttered Scenes
J Zhang, H Liu, D Li, XQ Yu, H Geng, Y Ding, J Chen, H Wang
CoRL 2024, 2024
2 | 2024
D3RoMa: Disparity Diffusion-based Depth Sensing for Material-Agnostic Robotic Manipulation
S Wei, H Geng, J Chen, C Deng, C Wenbo, C Zhao, X Fang, L Guibas, ...
CoRL 2024, 2024
2 | 2024
PhysPart: Physically Plausible Part Completion for Interactable Objects
R Luo*, H Geng*, C Deng, P Li, Z Wang, B Jia, L Guibas, S Huang
ICRA 2025, 2024
2 | 2024
Learning from Massive Human Videos for Universal Humanoid Pose Control
J Mao, S Zhao, S Song, T Shi, J Ye, M Zhang, H Geng, J Malik, V Guizilini, ...
arXiv preprint arXiv:2412.14172, 2024
1 | 2024
GAPartManip: A Large-scale Part-centric Dataset for Material-Agnostic Articulated Object Manipulation
W Cui, C Zhao, S Wei, J Zhang, H Geng, Y Chen, H Wang
ICRA 2025, 2024
2024
FreeCG: Free the Design Space of Clebsch-Gordan Transform for Machine Learning Force Fields
S Shao, H Geng, Z Wang, Q Cui
ICLR 2025, 2024
2024
Generative Models for Robot Learning
Z Wang, C Deng, C Liu, Z Jiang, H Geng, H Xu, Y Tang, P Torr, Z Liu, ...
ICLR 2025 Workshop Proposals
Articles 1–20