Open Collections
UBC Theses and Dissertations
Exploring internally dense but externally sparse deep convolutional neural networks
Duan, Yiqun
Abstract
Recent years have witnessed two seemingly opposite developments in deep convolutional neural networks (CNNs). On one hand, increasing the density of connections (e.g., by adding cross-layer connections) improves performance on basic computer vision tasks. On the other hand, introducing sparsity (e.g., through pruning) yields slimmer network structures. Inspired by the modular structure of the human brain, this thesis bridges the two trends by proposing a network structure with internally dense yet externally sparse connections. Experimental results demonstrate that the proposed method obtains competitive performance on benchmark tasks (CIFAR10, CIFAR100, and ImageNet) while keeping the network slim. A network-damage experiment further quantifies the contribution of individual connections, suggesting concise principles for hierarchical deep neural network design. The proposed method achieves 22% top-1 error on ImageNet classification at a computational budget of 3.38 GFLOPs, i.e., about 2% higher recognition accuracy than ResNet50 at only 80% of its computation cost.
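The "internally dense, externally sparse" idea can be illustrated with a minimal NumPy sketch: each block is densely connected inside (DenseNet-style concatenation of all earlier features), while blocks are linked externally by only a single projection each. This is an illustrative toy, not the thesis's actual architecture; all class names, dimensions, and the choice of a plain chain for the sparse inter-block wiring are assumptions made for the example.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

class DenseBlock:
    """Internally dense: each layer consumes the concatenation of the
    block input and every preceding layer's output (DenseNet-style)."""
    def __init__(self, in_dim, growth, n_layers, rng):
        self.weights = []
        dim = in_dim
        for _ in range(n_layers):
            self.weights.append(0.1 * rng.standard_normal((dim, growth)))
            dim += growth          # feature width grows with each concatenation
        self.out_dim = dim

    def forward(self, x):
        feats = [x]
        for w in self.weights:
            feats.append(relu(np.concatenate(feats, axis=-1) @ w))
        return np.concatenate(feats, axis=-1)

class SparselyConnectedNet:
    """Externally sparse: blocks are chained by one projection each,
    rather than densely cross-connected to every other block."""
    def __init__(self, in_dim, growth, n_layers, n_blocks, rng):
        self.blocks, self.projections = [], []
        for _ in range(n_blocks):
            block = DenseBlock(in_dim, growth, n_layers, rng)
            self.blocks.append(block)
            # single sparse link: project back down to the block input width
            self.projections.append(0.1 * rng.standard_normal((block.out_dim, in_dim)))

    def forward(self, x):
        for block, proj in zip(self.blocks, self.projections):
            x = relu(block.forward(x) @ proj)
        return x

rng = np.random.default_rng(0)
net = SparselyConnectedNet(in_dim=8, growth=4, n_layers=3, n_blocks=2, rng=rng)
out = net.forward(rng.standard_normal((5, 8)))
print(out.shape)  # (5, 8)
```

Within a block the feature width grows from 8 to 8 + 3 × 4 = 20, while each inter-block projection squeezes it back to 8 — the sparse bottleneck between dense modules.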
Item Metadata
Title | Exploring internally dense but externally sparse deep convolutional neural networks
Creator | Duan, Yiqun
Publisher | University of British Columbia
Date Issued | 2019
Genre |
Type |
Language | eng
Date Available | 2019-11-30
Provider | Vancouver : University of British Columbia Library
Rights | Attribution-NonCommercial-NoDerivatives 4.0 International
DOI | 10.14288/1.0384513
URI |
Degree |
Program |
Affiliation |
Degree Grantor | University of British Columbia
Graduation Date | 2019-11
Campus |
Scholarly Level | Graduate
Rights URI |
Aggregated Source Repository | DSpace