Beyond Dropout: Feature Map Distortion to Regularize Deep Neural Networks

Authors

  • Yehui Tang, Peking University
  • Yunhe Wang, Huawei Noah’s Ark Lab
  • Yixing Xu, Huawei Noah’s Ark Lab
  • Boxin Shi, Peking University
  • Chao Xu, Peking University
  • Chunjing Xu, Huawei Noah’s Ark Lab
  • Chang Xu, The University of Sydney

DOI:

https://doi.org/10.1609/aaai.v34i04.6057

Abstract

Deep neural networks often contain a great number of trainable parameters for extracting powerful features from given datasets. On the one hand, massive trainable parameters significantly enhance the performance of these deep networks. On the other hand, they bring the problem of over-fitting. To this end, dropout-based methods disable some elements in the output feature maps during the training phase to reduce the co-adaptation of neurons. Although these approaches can enhance the generalization ability of the resulting models, conventional binary dropout is not the optimal solution. We therefore investigate the empirical Rademacher complexity related to intermediate layers of deep neural networks and propose a feature map distortion method to address the aforementioned problem. During training, randomly selected elements in the feature maps are replaced with specific values derived by exploiting the generalization error bound. The superiority of the proposed feature map distortion for producing deep neural networks with higher testing performance is analyzed and demonstrated on several benchmark image datasets.
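To make the contrast with binary dropout concrete, below is a minimal PyTorch sketch of the general idea: rather than zeroing randomly selected activations, selected elements are shifted by a data-dependent distortion. The `alpha` hyperparameter and the sign-of-activation distortion term are illustrative assumptions standing in for the bound-derived values described in the paper, not the authors' actual formulation.

```python
import torch
import torch.nn as nn

class FeatureMapDistortion(nn.Module):
    """Illustrative sketch of feature map distortion vs. binary dropout.

    Binary dropout zeroes randomly selected activations; this sketch instead
    replaces them with perturbed values. The distortion term here (a scaled
    sign of the activation) is a hypothetical placeholder for the values the
    paper derives from the empirical Rademacher complexity bound.
    """

    def __init__(self, p: float = 0.5, alpha: float = 0.1):
        super().__init__()
        self.p = p          # probability of distorting each element
        self.alpha = alpha  # distortion magnitude (assumed hyperparameter)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Like dropout, the layer is active only during training.
        if not self.training or self.p == 0.0:
            return x
        mask = torch.rand_like(x) < self.p
        # Binary dropout would do: x.masked_fill(mask, 0.0) / (1 - self.p)
        # Here, selected elements are shifted instead of zeroed.
        distortion = self.alpha * x.detach().sign()
        return torch.where(mask, x - distortion, x)
```

In use, the module drops into a network wherever `nn.Dropout` would appear, e.g. `nn.Sequential(nn.Linear(512, 512), nn.ReLU(), FeatureMapDistortion(p=0.5))`; at evaluation time (`model.eval()`) it is an identity map, matching dropout's train/test behavior.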

Published

2020-04-03

How to Cite

Tang, Y., Wang, Y., Xu, Y., Shi, B., Xu, C., Xu, C., & Xu, C. (2020). Beyond Dropout: Feature Map Distortion to Regularize Deep Neural Networks. Proceedings of the AAAI Conference on Artificial Intelligence, 34(04), 5964-5971. https://doi.org/10.1609/aaai.v34i04.6057

Section

AAAI Technical Track: Machine Learning