Event-Driven Visual-Tactile Sensing and Learning for Robots

Human beings perform many actions using multiple sensory modalities, yet consume far less energy than the multi-modal deep neural networks used in current artificial systems. A recent study on arXiv.org proposes an asynchronous, event-driven visual-tactile perception system inspired by biological systems.

The researchers created a novel fingertip tactile sensor and developed a visual-tactile spiking neural network. In contrast to conventional neural networks, it processes discrete spikes asynchronously. The robots had to determine the type of container being handled and the amount of liquid held within, and to detect rotational slip. The spiking neural networks achieved performance competitive with artificial neural networks while consuming approximately 1,900 times less power than a GPU in a real-time scenario. This study opens the door to a next generation of power-efficient, real-time autonomous robots.
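To make the contrast with conventional, frame-based networks concrete, the sketch below implements a single leaky integrate-and-fire (LIF) neuron that only performs work when a discrete spike event arrives. This is an illustrative toy, not code from the paper; the timestamps, synaptic weight, time constant, and threshold are arbitrary assumptions.

```python
# A minimal sketch (not the paper's implementation) of an event-driven
# leaky integrate-and-fire (LIF) neuron: the membrane potential is only
# updated when a spike event arrives, not at a fixed frame rate.
import math

def lif_event_driven(events, tau=20.0, threshold=1.0, weight=0.6):
    """events: list of (timestamp_ms, polarity) tuples, assumed sorted by time."""
    v = 0.0            # membrane potential
    last_t = None      # time of the previous event
    out_spikes = []    # timestamps at which the neuron fires

    for t, polarity in events:
        if last_t is not None:
            # Decay the membrane potential over the elapsed interval;
            # no computation happens between events.
            v *= math.exp(-(t - last_t) / tau)
        v += weight * polarity          # integrate the incoming spike
        if v >= threshold:              # fire and reset
            out_spikes.append(t)
            v = 0.0
        last_t = t
    return out_spikes

# Hypothetical input: event timestamps (ms) with ON polarity (+1).
print(lif_event_driven([(1, 1), (2, 1), (3, 1), (30, 1), (31, 1)]))  # -> [2, 31]
```

Because the neuron stays idle between events, the amount of computation scales with input activity rather than with a fixed frame rate, which is what makes event-driven processing attractive for power-constrained robots.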

This work contributes an event-driven visual-tactile perception system, comprising a novel biologically-inspired tactile sensor and multi-modal spike-based learning. Our neuromorphic fingertip tactile sensor, NeuTouch, scales well with the number of taxels thanks to its event-based nature. Likewise, our Visual-Tactile Spiking Neural Network (VT-SNN) enables fast perception when coupled with event sensors. We evaluate our visual-tactile system (using the NeuTouch and Prophesee event camera) on two robot tasks: container classification and rotational slip detection. On both tasks, we observe good accuracies relative to standard deep learning methods. We have made our visual-tactile datasets freely available to encourage research on multi-modal event-driven robot perception, which we believe is a promising approach towards intelligent power-efficient robot systems.
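As a rough illustration of the multi-modal, spike-based design described above, the following PyTorch sketch encodes tactile and visual spike trains with separate spiking branches and fuses them for classification. It is not the authors' implementation; the layer sizes, the time-binned input format, and the simplified LIF dynamics are assumptions made only for illustration.

```python
# A hedged sketch of a two-branch, multi-modal spiking classifier in the
# spirit of VT-SNN. All sizes and dynamics below are illustrative.
import torch
import torch.nn as nn

class LIFLayer(nn.Module):
    """Linear projection followed by simple leaky integrate-and-fire dynamics."""
    def __init__(self, in_features, out_features, decay=0.9, threshold=1.0):
        super().__init__()
        self.fc = nn.Linear(in_features, out_features)
        self.decay, self.threshold = decay, threshold

    def forward(self, x):                      # x: (batch, time, in_features)
        batch, steps, _ = x.shape
        mem = torch.zeros(batch, self.fc.out_features, device=x.device)
        spikes = []
        for t in range(steps):
            mem = self.decay * mem + self.fc(x[:, t])
            spk = (mem >= self.threshold).float()
            mem = mem * (1.0 - spk)            # reset neurons that fired
            spikes.append(spk)
        return torch.stack(spikes, dim=1)      # (batch, time, out_features)

class VTSNNSketch(nn.Module):
    """Tactile and vision spike trains are encoded separately, then fused."""
    def __init__(self, n_taxels=39, n_pixels=32 * 32, n_classes=20):
        super().__init__()
        self.tactile = LIFLayer(n_taxels, 32)
        self.vision = LIFLayer(n_pixels, 32)
        self.head = LIFLayer(64, n_classes)

    def forward(self, tactile_spikes, vision_spikes):
        fused = torch.cat([self.tactile(tactile_spikes),
                           self.vision(vision_spikes)], dim=-1)
        out = self.head(fused)
        return out.sum(dim=1)                  # output spike counts per class

# Hypothetical binary spike tensors: (batch, time bins, input channels).
model = VTSNNSketch()
scores = model(torch.rand(4, 50, 39).round(), torch.rand(4, 50, 32 * 32).round())
print(scores.shape)                            # torch.Size([4, 20])
```

Note that this sketch covers only the forward pass; training such a network end-to-end requires a surrogate gradient or another spike-based learning rule, because the hard firing threshold is not differentiable.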

Link: https://arxiv.org/abs/2009.07083
