New Approach to Neural Network Training: Teaching with Commentaries

As humans, we make remarkable use of our visual senses. The biological "supercomputer" in our heads effortlessly lets us distinguish visual scenes and remember them, with millions of interconnected neurons working together. This network of neurons is the inspiration for artificial neural networks.

Machine-based neural networks resemble us in the sense that they, too, must learn how to process information in order to accomplish a given task. This teaching-learning process is not always straightforward: the difficulty of designing an artificial network, the growing demand for computational resources, and a multitude of other factors limit how effective training can be.

Image credit: chenspec via Pixabay, free licence

Training deep neural networks effectively is therefore a challenging task that raises many questions. Nevertheless, scientists continue to make gradual advances in this area. One of the latest works on improving the teaching-learning process, recently published on arXiv, presents a detailed analysis of a promising new approach – teaching with commentaries.

What are Commentaries?

According to the authors, a commentary is meta-learned information that comes from outside the neural network itself and is used to adjust the training process before the final model for a particular task is produced.

The researchers explain: “We define a commentary to be learned information helpful for training a model on a task or providing insights on the learning process. <…> The commentary may be represented in a tabular fashion for every combination of input arguments, or using a neural network that takes these arguments as inputs”.
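As a rough illustration of that definition, a commentary parameterized as a neural network might look like the following sketch. The class name, architecture, and the choice to pass the training iteration as a normalized scalar alongside the example are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code) of a commentary represented as a small
# neural network: given an input example and the current training iteration,
# it returns a scalar that the main model's training loop can use.
import torch
import torch.nn as nn

class CommentaryNet(nn.Module):
    def __init__(self, input_dim: int, hidden_dim: int = 64):
        super().__init__()
        # +1 input for the normalized training iteration passed alongside the example
        self.net = nn.Sequential(
            nn.Linear(input_dim + 1, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, x: torch.Tensor, iteration_frac: torch.Tensor) -> torch.Tensor:
        # x: (batch, input_dim) flattened examples; iteration_frac: (batch, 1) in [0, 1]
        out = self.net(torch.cat([x, iteration_frac], dim=1))
        return torch.sigmoid(out)  # commentary value per example, e.g. a weight in (0, 1)
```

In the paper, the parameters of such a commentary are themselves meta-learned, so that the commentary encodes reusable information about how to train on the task.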

The article notes that training with meta-learned commentaries speeds up learning, offers useful insight into how neural networks learn, and could potentially be applied elsewhere. The team also proposes a unifying framework for examining model learning and further improving network training.

Commentaries for Example Weighting Curricula

To analyse the performance gains from teaching with commentaries in practice, the researchers use a synthetic MNIST binary classification problem, train on the CIFAR10/100 datasets, and explore the possibility of using commentaries for few-shot learning.

Here, the commentary neural network assigns a separate weight to each individual training example, and these weights are specified at every iteration of training. Sets of commentaries are learned first on rotated MNIST digits, then on CIFAR10 and CIFAR100, and finally for few-shot learning.
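The inner training step of such an example-weighting curriculum could look roughly like the sketch below. It assumes the hypothetical CommentaryNet above and a standard classifier; the outer meta-learning loop that actually trains the commentary (the core of the paper) is omitted here.

```python
# Illustrative sketch of one inner training step where commentary-produced
# per-example weights re-weight the classification loss. Names are assumptions.
import torch
import torch.nn.functional as F

def weighted_training_step(model, commentary, x, y, step, total_steps, optimizer):
    optimizer.zero_grad()
    logits = model(x)
    per_example_loss = F.cross_entropy(logits, y, reduction="none")   # (batch,)
    iter_frac = torch.full((x.shape[0], 1), step / total_steps)       # normalized iteration
    weights = commentary(x.flatten(1), iter_frac).squeeze(1)          # (batch,) weights in (0, 1)
    loss = (weights * per_example_loss).mean()  # commentary re-weights each example
    loss.backward()
    optimizer.step()
    return loss.item()
```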

Learning to Blend Training Examples

In this application, a commentary-based augmentation scheme blends pairs of images together, with the commentary controlling the blending factor. Example-label pairs are sampled from the training set, a blending proportion is produced, and a new (blended) image is generated; the training loss is then computed on the blended example-label pair. This study is also performed on the MNIST and CIFAR10/100 datasets.
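In the spirit of mixup-style augmentation, the blending step might be sketched as follows. Here the commentary is assumed to take the two flattened images concatenated together and return a blending proportion in (0, 1); this signature and the function names are assumptions for illustration, not the paper's exact setup.

```python
# Sketch of commentary-driven blending of example pairs: the commentary picks
# the blending proportion, and the loss mixes the two labels accordingly.
import torch
import torch.nn.functional as F

def blended_loss(model, commentary, x1, y1, x2, y2):
    # Blending proportion per pair, shape (batch, 1)
    lam = commentary(torch.cat([x1.flatten(1), x2.flatten(1)], dim=1))
    lam_img = lam.view(-1, *([1] * (x1.dim() - 1)))     # broadcastable over image dims
    x_blend = lam_img * x1 + (1.0 - lam_img) * x2       # pixel-wise blended image
    logits = model(x_blend)
    loss1 = F.cross_entropy(logits, y1, reduction="none")
    loss2 = F.cross_entropy(logits, y2, reduction="none")
    # Blend the two label losses with the same proportion used for the pixels
    return (lam.squeeze(1) * loss1 + (1.0 - lam.squeeze(1)) * loss2).mean()
```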

Attention Mask Commentaries for Insights and Robustness

As the authors note, getting commentaries to identify the key factors in a dataset is an important and challenging task. In this part of the study, commentaries are defined as ‘attention masks’ that learn to highlight the important regions of the provided images. A qualitative and quantitative study across different image datasets shows that such masks lead to qualitatively sensible classification outcomes and improve effectiveness over the baseline.
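A minimal sketch of the idea, under assumed names and a toy architecture rather than the paper's exact one: a small commentary network produces a per-pixel mask, and the classifier only sees the masked input.

```python
# Sketch of an attention-mask commentary: a small convolutional network outputs
# a mask in (0, 1) per pixel, which gates the image before classification.
import torch
import torch.nn as nn

class MaskCommentary(nn.Module):
    def __init__(self, channels: int = 3):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(channels, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(16, 1, kernel_size=3, padding=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.sigmoid(self.conv(x))  # (batch, 1, H, W) attention mask

def masked_forward(model, mask_commentary, x):
    mask = mask_commentary(x)
    return model(x * mask)  # classifier only sees the regions the mask keeps
```

Inspecting the learned masks is what gives the qualitative insight the authors describe: the mask shows which regions the commentary considers important for the task.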

Concluding remarks

In the conclusion, the authors note that commentaries prove to be an effective way of steering neural network training toward its objectives. Although hands-on experience with teaching with commentaries is still hard to come by, since the concept is relatively new, the method shows promise of improved training speed and performance compared to currently used neural network training techniques.

“Empirically, we show that the commentaries can provide insights and result in improved learning speed and/or performance on a variety of datasets. Teaching with commentaries is a proof-of-concept idea, and we hope that this approach will inspire related ways of automatically re-using training insights across tasks and datasets”, the researchers conclude.

Link to the research article: https://arxiv.org/abs/2011.03037

