Contact2Grasp: 3D Grasp Synthesis via Hand-Object Contact Constraint
Haoming Li, Xinzhuo Lin, Yang Zhou, Xiang Li, Yuchi Huo, Jiming Chen, Qi Ye
Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence
Main Track. Pages 1053-1061.
https://doi.org/10.24963/ijcai.2023/117
3D grasp synthesis generates grasping poses given an input object. Existing works tackle the problem by learning a direct mapping from objects to the distributions of grasping poses. However, because physical contact is sensitive to small changes in pose, the highly nonlinear mapping from 3D object representations to valid poses is considerably non-smooth, leading to poor generation efficiency and restricted generality. To tackle this challenge, we introduce an intermediate variable representing grasp contact areas to constrain grasp generation; in other words, we factorize the mapping into two sequential stages by assuming that grasping poses are fully constrained given contact maps: 1) we first learn contact map distributions to generate potential contact maps for grasps; 2) we then learn a mapping from the contact maps to the grasping poses. Further, we propose a penetration-aware optimization that uses the generated contacts as a consistency constraint for grasp refinement. Extensive validations on two public datasets show that our method outperforms state-of-the-art methods on grasp generation across various metrics.
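The factorization described above can be illustrated with a minimal sketch: a contact-map generator followed by a contact-to-pose regressor. The architecture choices below (PointNet-style shared MLPs, a CVAE latent size of 64, a MANO-like 61-D pose vector) are illustrative assumptions for exposition only, not the authors' released implementation.

```python
# Hypothetical sketch of the two-stage factorization: object points -> contact map -> grasp pose.
import torch
import torch.nn as nn

class ContactMapCVAE(nn.Module):
    """Stage 1: generate a per-point contact probability map conditioned on the object."""
    def __init__(self, feat_dim=128, latent_dim=64):
        super().__init__()
        # Per-point object encoder (PointNet-style shared MLP over xyz) -- an assumption.
        self.obj_enc = nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, feat_dim))
        # Posterior q(z | object, contact) used in training, decoder p(contact | object, z).
        self.post = nn.Sequential(nn.Linear(feat_dim + 1, 128), nn.ReLU(), nn.Linear(128, 2 * latent_dim))
        self.dec = nn.Sequential(nn.Linear(feat_dim + latent_dim, 128), nn.ReLU(), nn.Linear(128, 1))
        self.latent_dim = latent_dim

    def forward(self, obj_pts, contact_gt=None):
        # obj_pts: (B, N, 3); contact_gt: (B, N, 1) in [0, 1], or None at test time.
        f = self.obj_enc(obj_pts)                      # (B, N, feat_dim) per-point features
        g = f.max(dim=1).values                        # (B, feat_dim) global object feature
        if contact_gt is not None:                     # training: sample z from the posterior
            cond = torch.cat([g, contact_gt.mean(dim=1)], dim=-1)
            mu, logvar = self.post(cond).chunk(2, dim=-1)
            z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()
        else:                                          # testing: sample z from the prior
            mu = logvar = None
            z = torch.randn(obj_pts.shape[0], self.latent_dim, device=obj_pts.device)
        z_exp = z.unsqueeze(1).expand(-1, f.shape[1], -1)
        contact = torch.sigmoid(self.dec(torch.cat([f, z_exp], dim=-1)))  # (B, N, 1)
        return contact, mu, logvar

class ContactToGrasp(nn.Module):
    """Stage 2: regress a hand pose (MANO-like 61-D vector assumed) from the contact map."""
    def __init__(self, feat_dim=128, pose_dim=61):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(3 + 1, 64), nn.ReLU(), nn.Linear(64, feat_dim))
        self.head = nn.Sequential(nn.Linear(feat_dim, 128), nn.ReLU(), nn.Linear(128, pose_dim))

    def forward(self, obj_pts, contact):
        f = self.enc(torch.cat([obj_pts, contact], dim=-1)).max(dim=1).values
        return self.head(f)                            # (B, pose_dim)

# Usage: factorized generation at test time.
obj = torch.rand(2, 2048, 3)                           # toy object point clouds
stage1, stage2 = ContactMapCVAE(), ContactToGrasp()
contact_map, _, _ = stage1(obj)                        # sample a plausible contact map
grasp_pose = stage2(obj, contact_map)                  # map the contacts to a grasp pose
```

The generated contact map can then also serve as the consistency target in the paper's penetration-aware refinement, i.e. the predicted pose is optimized so that the hand touches the generated contact regions while penalizing hand-object penetration; the details of that objective follow the paper, not this sketch.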
Keywords:
Computer Vision: CV: 3D computer vision
Computer Vision: CV: Applications