Siamese Labels Auxiliary Learning
Abstract
In deep learning, auxiliary training is widely used to assist the training of models. During the training phase, auxiliary modules attached to the network can improve its performance; during the testing phase, these modules can be removed, so the number of test-time parameters does not increase. In this paper, we propose a novel auxiliary training method, Siamese Labels Auxiliary Learning (SiLa). Unlike Deep Mutual Learning (DML), SiLa emphasizes auxiliary learning and can be easily combined with DML. The main contributions of this paper are: (1) we propose SiLa Learning, which improves the performance of common models without increasing test-time parameters; (2) we compare SiLa with DML and show that SiLa improves the generalization of the model; (3) we apply SiLa to Dynamic Neural Networks, demonstrating that SiLa can be used with various types of network structures.
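The abstract describes the general auxiliary-training pattern: an auxiliary module contributes an extra loss term during training and is discarded at inference, so test-time parameters are unchanged. The sketch below illustrates this generic pattern in PyTorch only; the class name `AuxWrapper`, the linear auxiliary head, and the `aux_weight` coefficient are illustrative assumptions, not the SiLa method itself.

```python
import torch
import torch.nn as nn

class AuxWrapper(nn.Module):
    """Backbone with a main head plus an auxiliary head used only during training.

    Minimal sketch of the generic auxiliary-training idea: the auxiliary
    branch adds a loss term while training and is ignored at test time,
    so inference-time parameters are not increased.
    """
    def __init__(self, backbone: nn.Module, feat_dim: int, num_classes: int):
        super().__init__()
        self.backbone = backbone                          # shared feature extractor
        self.main_head = nn.Linear(feat_dim, num_classes)
        self.aux_head = nn.Linear(feat_dim, num_classes)  # discarded at test time

    def forward(self, x):
        feats = self.backbone(x)
        if self.training:
            return self.main_head(feats), self.aux_head(feats)
        return self.main_head(feats)                      # auxiliary branch unused

def training_step(model, x, y, criterion, aux_weight=0.5):
    """One training step: main loss plus a weighted auxiliary loss (weight is illustrative)."""
    main_logits, aux_logits = model(x)
    return criterion(main_logits, y) + aux_weight * criterion(aux_logits, y)
```

At deployment, only `backbone` and `main_head` are exercised, so the auxiliary head can simply be dropped from the saved model.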
- Publication:
- arXiv e-prints
- Pub Date:
- February 2021
- DOI:
- 10.48550/arXiv.2103.00200
- arXiv:
- arXiv:2103.00200
- Bibcode:
- 2021arXiv210300200G
- Keywords:
- Computer Science - Artificial Intelligence