Precise Object Placement with Pose Distance Estimations for Different Objects and Grippers

To grasp and manipulate objects in undefined poses, robots must perceive their environment and plan their actions accordingly.

Industrial robot. Image credit: jarmoluk via Pixabay (Free Pixabay licence)

A recent study on arXiv.org focuses on robotic bin-picking, where multiple rigid objects of different types are stored chaotically in a bin. The robot has to pick the objects and place them at a given target pose. This task is challenging because of occlusions, varying lighting conditions, and collisions.

The researchers propose a multi-gripper approach that executes grasping trials in simulation and transfers the experience to the real world. It jointly solves 6D object pose estimation, object classification, and grasp quality prediction, and automatically decides which object, which gripper, and which grasp pose are best suited for execution.
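The selection step can be sketched roughly as follows: from the network's per-grasp predictions, discard grasps unlikely to succeed, then prefer the one promising the most precise placement. The class, field names, and threshold below are illustrative assumptions, not the authors' actual code or outputs.

```python
from dataclasses import dataclass

@dataclass
class GraspCandidate:
    """One automatically obtained grasp pose with its predicted scores.

    All fields are hypothetical stand-ins for the network outputs
    described in the paper (class, grasp quality, pose distances).
    """
    object_id: int        # which object instance in the bin
    object_class: str     # predicted object type
    gripper: str          # which gripper would execute this grasp
    grasp_quality: float  # predicted likelihood of a successful grasp
    pose_distance: float  # predicted distance from the target placement pose

def select_best_grasp(candidates, min_quality=0.5):
    """Filter out risky grasps, then minimize predicted placement error."""
    feasible = [c for c in candidates if c.grasp_quality >= min_quality]
    if not feasible:
        return None  # nothing safe to grasp in this scene
    return min(feasible, key=lambda c: c.pose_distance)

# Toy scene: three candidate grasps over two object classes and grippers.
candidates = [
    GraspCandidate(0, "shaft", "parallel-jaw", 0.9, 4.0),
    GraspCandidate(1, "ring", "suction", 0.8, 1.5),
    GraspCandidate(2, "shaft", "suction", 0.3, 0.2),  # precise but too risky
]
best = select_best_grasp(candidates)
```

In this toy example the third grasp promises the most precise placement but falls below the quality threshold, so the second one is chosen; the actual system makes this trade-off from learned predictions rather than hand-set numbers.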

The approach can also be used for tasks like shelf picking, depalletizing, or conveyor belt picking.

This paper introduces a novel approach for the grasping and precise placement of various known rigid objects using multiple grippers within highly cluttered scenes. Using a single depth image of the scene, our method estimates multiple 6D object poses together with an object class, a pose distance for object pose estimation, and a pose distance from a target pose for object placement for each automatically obtained grasp pose with a single forward pass of a neural network. By incorporating model knowledge into the system, our approach has higher success rates for grasping than state-of-the-art model-free approaches. Furthermore, our method chooses grasps that result in significantly more precise object placements than prior model-based work.

Research paper: Kleeberger, K., Schnitzler, J., Usman Khalid, M., Bormann, R., Kraus, W., and Huber, M. F., “Precise Object Placement with Pose Distance Estimations for Different Objects and Grippers”, 2021. Link: https://arxiv.org/abs/2110.00992
