UBC Theses and Dissertations


Exploring internally dense but externally sparse deep convolutional neural networks
Duan, Yiqun


Recent years have witnessed two seemingly opposite developments in deep convolutional neural networks (CNNs). On the one hand, increasing the density of connections (e.g., by adding cross-layer connections) improves performance on basic computer vision tasks. On the other hand, introducing sparsity (e.g., through pruning) yields slimmer network structures. Inspired by the modular structure of the human brain, this thesis bridges the two trends by proposing a new network structure with internally dense yet externally sparse connections. Experimental results demonstrate that the proposed method obtains competitive performance on benchmark tasks (CIFAR10, CIFAR100, and ImageNet) while keeping the network slim. Moreover, a network-damage experiment measures the contribution of individual connections, suggesting design principles for hierarchical deep neural networks. The proposed method achieves a 22% top-1 error on ImageNet classification at a computational budget of 3.38 GFLOPs; that is, roughly 2% higher recognition accuracy than ResNet-50 at only about 80% of its computational cost.
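The "internally dense, externally sparse" idea can be illustrated with a toy edge count. The sketch below is not the thesis implementation; the module counts, layer counts, and function names are illustrative assumptions. It models a network's layer graph as fully connected inside each module (DenseNet-style cross-layer links) with a single external connection between consecutive modules, and compares the number of connections against a fully dense graph over all layers.

```python
# Toy sketch (illustrative only, not the thesis architecture): count edges in a
# layer graph that is dense within modules and sparse between them.

def edges_dense_sparse(num_modules, layers_per_module):
    """Edges when each module is internally fully connected and consecutive
    modules are joined by one external connection."""
    k = layers_per_module
    intra = num_modules * k * (k - 1) // 2  # dense cross-layer links per module
    inter = num_modules - 1                 # one sparse link between modules
    return intra + inter

def edges_fully_dense(num_modules, layers_per_module):
    """Edges if every layer were connected to every other layer in the net."""
    n = num_modules * layers_per_module
    return n * (n - 1) // 2

if __name__ == "__main__":
    modular = edges_dense_sparse(4, 8)   # 4 dense modules of 8 layers: 112 + 3
    total = edges_fully_dense(4, 8)      # 32 layers, fully dense: 496
    print(modular, total, round(modular / total, 3))  # 115 496 0.232
```

With these illustrative numbers, the modular graph keeps under a quarter of the fully dense graph's connections while every layer pair inside a module remains directly linked, which is the structural trade-off the abstract describes.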



Attribution-NonCommercial-NoDerivatives 4.0 International