SparseNet: Deep Convolutional Network with Sparse Connections between Blocks
Conference: ICMLCA 2021 - 2nd International Conference on Machine Learning and Computer Application
12/17/2021 - 12/19/2021 at Shenyang, China
Proceedings: ICMLCA 2021
Pages: 5
Language: English
Type: PDF
Authors:
Wang, Honglin; Yang, Jinping; Kou, Wanting; Chen, Xinru; Li, Jun; Li, Yunfan (Shenyang Ligong University, School of Information Science and Engineering, Shenyang, China)
Abstract:
With the development of deep learning technology, optimizing the structure of deep convolutional neural networks has become one of the most active research topics. Building on previous work, this paper proposes a new architecture called SparseNet. Compared with traditional network models, the key characteristic of SparseNet is that only a subset of the model's blocks, rather than the whole model, is updated at training time. While preserving model accuracy, this greatly reduces the number of parameters actually trained and lowers the computational power required for training. In this paper, the model is composed of microcell convolution blocks, each consisting of a convolutional layer with a small kernel and a normalization layer. For selecting which blocks to train, an Evolutionary Blocking Screening Algorithm is proposed: it screens the blocks to be trained by maintaining an array that records a weight for each block. In the experiments, three SparseNet models of different depths are constructed and compared against DenseNet and ResNet models of different depths. Results on the CIFAR dataset confirm the effectiveness of the optimized model. Compared with current classic network structures, SparseNet combined with the Evolutionary Blocking Screening Algorithm shows clear advantages in the number of training parameters and in training speed, meeting the stated requirements.
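
The paper itself gives no code, but a minimal PyTorch sketch of the microcell convolution block described above might look as follows; the 3x3 kernel size, the use of BatchNorm as the normalization layer, the ReLU activation, and the class name MicrocellBlock are all illustrative assumptions, not details taken from the paper:

import torch
import torch.nn as nn

class MicrocellBlock(nn.Module):
    """One microcell block: a small-kernel convolution followed by normalization."""
    def __init__(self, in_channels: int, out_channels: int):
        super().__init__()
        # Small convolution kernel, as the abstract specifies; 3x3 is an assumption.
        self.conv = nn.Conv2d(in_channels, out_channels, kernel_size=3,
                              padding=1, bias=False)
        # Normalization layer; BatchNorm2d is an assumed choice.
        self.norm = nn.BatchNorm2d(out_channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # ReLU is an assumed activation, not stated in the abstract.
        return torch.relu(self.norm(self.conv(x)))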
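Similarly, the abstract only says that the Evolutionary Blocking Screening Algorithm maintains an array recording a weight per block. The sketch below is one plausible reading under stated assumptions, not the paper's algorithm: blocks are sampled for training in proportion to their recorded weights, the remaining blocks are frozen for that step, and the weights of the trained blocks are updated from the observed loss. The subset size k, the sampling rule, and the weight-update rule are all hypothetical:

import torch

def select_blocks(weights, k):
    # Sample k distinct block indices with probability proportional
    # to their recorded weights (the "array recording weight").
    idx = torch.multinomial(torch.tensor(weights), k, replacement=False)
    return set(idx.tolist())

def sparse_train_step(blocks, weights, batch, loss_fn, optimizer, k=2):
    selected = select_blocks(weights, k)
    # Freeze every block that was not screened in for this step, so only
    # the selected subset contributes trainable parameters.
    for i, block in enumerate(blocks):
        for p in block.parameters():
            p.requires_grad = i in selected
    x, y = batch
    for block in blocks:
        x = block(x)
    loss = loss_fn(x, y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    # Update the recorded weights of the trained blocks; rewarding a low
    # loss is an assumption about the evolutionary update rule.
    for i in selected:
        weights[i] += 1.0 / (1.0 + loss.item())
    return loss.item()

In this reading, freezing unselected blocks is what reduces the number of actually trained parameters per step, which matches the abstract's claim about lower training cost.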