Reactive Human-to-Robot Handovers of Arbitrary Objects

Human-to-robot handover is a useful capability that can be especially helpful for older people or people with limited mobility. Such a system may be constrained in its approach directions by the human's pose, and it requires motions that are intuitive and feel safe to the human. A recent paper proposes a vision-based system for human-to-robot handovers that works with unknown objects.

Industrial robot. Image credit: jarmoluk via Pixabay (Free Pixabay licence)

Given RGB-D images and body tracking, the system crops a point cloud containing the hand and the held object. GraspNet, a grasp planner designed for static objects, is adapted to work with humans, who may move during the handover. Grasps are generated in real time and are temporally consistent. In the experiments, robotic grasping was evaluated on 26 household objects, such as a cell phone, a mug, and a newspaper. The robot was able to grasp arbitrary objects from naive users, who rated it as safe and able to adjust to human motions.
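To make the first step of the pipeline concrete, below is a minimal Python sketch of the point-cloud cropping stage: back-project the depth image into 3D and keep only points near the tracked hand joint. This is not the authors' code; the helper names, the camera-intrinsics format, and the 0.25 m crop radius are assumptions for illustration only.

```python
# Hedged sketch: crop an RGB-D point cloud around the tracked hand, producing
# the hand-plus-object cloud that a grasp generator could consume each frame.
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (meters) into an (N, 3) point cloud."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]  # drop invalid (zero-depth) pixels

def crop_hand_object_cloud(depth, intrinsics, hand_xyz, radius=0.25):
    """Keep only points within `radius` meters of the tracked hand joint."""
    fx, fy, cx, cy = intrinsics          # assumed pinhole-camera parameters
    pts = depth_to_points(depth, fx, fy, cx, cy)
    dist = np.linalg.norm(pts - np.asarray(hand_xyz), axis=1)
    return pts[dist < radius]
```

Because the cropped cloud is regenerated at every frame and passed to the grasp planner, this step has to be cheap enough to run in real time while the human moves.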

Human-robot object handovers have been an actively studied area of robotics over the past decade; however, very few techniques and systems have addressed the challenge of handing over diverse objects with arbitrary appearance, size, shape, and rigidity. In this paper, we present a vision-based system that enables reactive human-to-robot handovers of unknown objects. Our approach combines closed-loop motion planning with real-time, temporally-consistent grasp generation to ensure reactivity and motion smoothness. Our system is robust to different object positions and orientations, and can grasp both rigid and non-rigid objects. We demonstrate the generalizability, usability, and robustness of our approach on a novel benchmark set of 26 diverse household objects, a user study with naive users (N=6) handing over a subset of 15 objects, and a systematic evaluation examining different ways of handing objects. More results and videos can be found on the project page.
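The "temporally-consistent grasp generation" mentioned in the abstract can be illustrated with a simple selection rule: among the grasps proposed for the current frame, prefer the one closest to the grasp chosen in the previous frame, so the planned grasp does not jump around while the human moves. The sketch below is a hedged illustration of that idea, assuming grasps given as 4x4 homogeneous poses and an ad-hoc translation-plus-rotation distance; it is not the paper's exact scoring.

```python
# Hedged sketch of temporally-consistent grasp selection: keep the candidate
# closest to the previously executed grasp so the target stays stable over time.
# The distance metric and rot_weight are illustrative assumptions.
import numpy as np

def grasp_distance(g_a, g_b, rot_weight=0.1):
    """Distance between two grasp poses given as 4x4 homogeneous transforms."""
    trans = np.linalg.norm(g_a[:3, 3] - g_b[:3, 3])
    # Geodesic rotation angle between the two orientations on SO(3).
    cos_angle = (np.trace(g_a[:3, :3].T @ g_b[:3, :3]) - 1.0) / 2.0
    angle = np.arccos(np.clip(cos_angle, -1.0, 1.0))
    return trans + rot_weight * angle

def select_consistent_grasp(candidates, previous_grasp):
    """Pick the candidate grasp closest to the previously chosen one."""
    if previous_grasp is None:  # first frame: no history, take the first grasp
        return candidates[0]
    dists = [grasp_distance(g, previous_grasp) for g in candidates]
    return candidates[int(np.argmin(dists))]
```

Coupled with closed-loop motion planning, a rule like this keeps the robot's approach smooth even though a fresh set of grasps is generated every frame.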

Link: https://arxiv.org/abs/2011.08961

Source