Ab Initio Particle-based Object Manipulation

Robotic manipulation has traditionally relied on two different techniques. Model-based approaches capture the object’s properties in an analytic model, while data-driven methods learn directly from prior experience. A recent study proposes Particle-based Object Manipulation (PROMPT), which combines the advantages of both approaches.


A particle representation is constructed from a set of multi-view RGB images. Each particle represents a point in the object, together with its local features and its relations to other particles. For each camera view, the particles are projected into the image plane, and the reconstructed particle set then serves as an approximate representation of the object.
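To make the projection step concrete, below is a minimal sketch of projecting a set of 3-D particles into a camera's image plane using a standard pinhole model. The intrinsics `K`, extrinsics `R`, `t`, and the toy particle set are illustrative placeholders, not values or code from the PROMPT paper.

```python
# Sketch: project 3-D particles into one camera view with a pinhole model.
import numpy as np

def project_particles(particles, K, R, t):
    """Project Nx3 world-frame particle positions to Nx2 pixel coordinates."""
    cam_points = particles @ R.T + t          # world frame -> camera frame
    uvw = cam_points @ K.T                    # apply pinhole intrinsics
    return uvw[:, :2] / uvw[:, 2:3]           # perspective divide -> pixels

# Toy camera looking down the +z axis, 1 m from the object.
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)
t = np.array([0.0, 0.0, 1.0])
particles = np.random.uniform(-0.05, 0.05, size=(256, 3))  # 5 cm cube of points
pixels = project_particles(particles, K, R, t)
print(pixels.shape)  # (256, 2)
```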

A particle-based dynamics simulation then predicts the effects of candidate manipulation actions. The experimental results show that PROMPT enables robots to perform dynamic manipulation across a variety of tasks, including grasping, pushing, and placing.
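The sketch below illustrates the general idea of predicting the effect of an action by stepping a particle state forward in time. It is a generic explicit-Euler particle integrator with a crude ground contact, written only to show the flavor of particle-based prediction; it is not the simulator used in the paper.

```python
# Sketch: advance N particles one time step under gravity plus an applied force.
import numpy as np

def step(positions, velocities, forces, mass=0.01, dt=1e-3, ground_z=0.0):
    """One explicit-Euler update of particle positions and velocities."""
    gravity = np.array([0.0, 0.0, -9.81]) * mass
    accel = (forces + gravity) / mass
    velocities = velocities + accel * dt
    positions = positions + velocities * dt
    # Crude ground contact: clamp particles at the support surface.
    below = positions[:, 2] < ground_z
    positions[below, 2] = ground_z
    velocities[below, 2] = 0.0
    return positions, velocities

# Predict the effect of a small push along +x over 100 steps.
pos = np.random.uniform(0.0, 0.05, size=(256, 3))
vel = np.zeros_like(pos)
push = np.tile([0.05, 0.0, 0.0], (256, 1))    # applied force per particle (N)
for _ in range(100):
    pos, vel = step(pos, vel, push)
```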

This paper presents Particle-based Object Manipulation (Prompt), a new approach to robot manipulation of novel objects ab initio, without prior object models or pre-training on a large object data set. The key element of Prompt is a particle-based object representation, in which each particle represents a point in the object, the local geometric, physical, and other features of the point, and also its relation with other particles. Like the model-based analytic approaches to manipulation, the particle representation enables the robot to reason about the object’s geometry and dynamics in order to choose suitable manipulation actions. Like the data-driven approaches, the particle representation is learned online in real-time from visual sensor input, specifically, multi-view RGB images. The particle representation thus connects visual perception with robot control. Prompt combines the benefits of both model-based reasoning and data-driven learning. We show empirically that Prompt successfully handles a variety of everyday objects, some of which are transparent. It handles various manipulation tasks, including grasping, pushing, etc. Our experiments also show that Prompt outperforms a state-of-the-art data-driven grasping method on daily objects, even though it does not use any offline training data.
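As a reading aid for the representation described in the abstract, here is an illustrative data structure for a single particle: a 3-D point plus local features and links to related particles. The field names are assumptions made for this sketch, not identifiers from the paper or its code.

```python
# Sketch: one possible layout for a particle in the object representation.
from dataclasses import dataclass, field
from typing import List
import numpy as np

@dataclass
class Particle:
    position: np.ndarray                                  # 3-D point in the object
    features: np.ndarray                                  # local geometric/physical features
    neighbors: List[int] = field(default_factory=list)    # indices of related particles

# A toy object represented as 256 particles with 8-D feature vectors.
object_particles = [
    Particle(position=np.random.rand(3), features=np.zeros(8))
    for _ in range(256)
]
```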

Research paper: Chen, S., Ma, X., Lu, Y., and Hsu, D., “Ab Initio Particle-based Object Manipulation”, 2021. Link: https://arxiv.org/abs/2107.08865
