BEHAVIORAL ANALYSIS OF DEEP CONVOLUTIONAL NEURAL NETWORKS FOR IMAGE CLASSIFICATION
| Field | Value |
|---|---|
| Title | BEHAVIORAL ANALYSIS OF DEEP CONVOLUTIONAL NEURAL NETWORKS FOR IMAGE CLASSIFICATION |
| Name(s) | Clark, James Alex (author); Barenholtz, Elan (thesis advisor); Florida Atlantic University (degree grantor); Center for Complex Systems and Brain Sciences, Charles E. Schmidt College of Science |
| Type of Resource | text |
| Genre | Electronic Thesis or Dissertation |
| Date Created | 2022 |
| Date Issued | 2022 |
| Publisher | Florida Atlantic University |
| Place of Publication | Boca Raton, Fla. |
| Physical Form | application/pdf |
| Extent | 151 p. |
| Language(s) | English |
| Abstract/Description | Within the field of deep CNNs, there is great excitement over breakthroughs in network performance on benchmark datasets such as ImageNet. Around the world, competing teams work to modify existing networks or create new ones that reach ever-higher accuracy levels. We believe this important research must be supplemented with research into the computational dynamics of the networks themselves. We present research into network behavior as it is affected by variations in the number of filters per layer; pruning filters during and after training; collapsing the weight space of the trained network using a basic quantization; and the effect of image size and input-layer stride on training time and test accuracy. We provide insights into how the total number of updatable parameters can affect training time and accuracy, and how “time per epoch” and “number of epochs” affect network training time. We conclude with statistically significant models that allow us to predict training time as a function of the total number of updatable parameters in the network. |
| Identifier | FA00013940 (IID) |
| Degree granted | Dissertation (Ph.D.)--Florida Atlantic University, 2022. |
| Collection | FAU Electronic Theses and Dissertations Collection |
| Note(s) | Includes bibliography. |
| Subject(s) | Neural networks (Computer science); Image processing |
| Persistent Link to This Record | http://purl.flvc.org/fau/fd/FA00013940 |
| Use and Reproduction | Copyright © is held by the author with permission granted to Florida Atlantic University to digitize, archive and distribute this item for non-profit research and educational purposes. Any reuse of this item in excess of fair use or other copyright exemptions requires permission of the copyright holder. |
| Use and Reproduction | http://rightsstatements.org/vocab/InC/1.0/ |
| Host Institution | FAU |
| Is Part of Series | Florida Atlantic University Digital Library Collections. |
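
The abstract above refers to counting a network's updatable (trainable) parameters and modeling training time as a function of that count. The sketch below is not taken from the dissertation; it is a minimal Python/Keras illustration, under assumed settings, of the general kind of analysis described: varying the number of filters per layer (and an input-layer stride), counting trainable parameters, and fitting a simple linear model of seconds per epoch against parameter count. The `build_cnn` helper, the filter configurations, and the timing values are all hypothetical placeholders.

```python
# Illustrative sketch only (not code from the dissertation).
# Requires: numpy, scikit-learn, tensorflow.
import numpy as np
from sklearn.linear_model import LinearRegression
from tensorflow import keras
from tensorflow.keras import layers


def build_cnn(filters_per_layer, input_shape=(32, 32, 3), stride=1, num_classes=10):
    """Small CNN whose filter counts, input image size, and input-layer stride
    are the kinds of knobs the abstract describes varying."""
    model = keras.Sequential([keras.Input(shape=input_shape)])
    for i, f in enumerate(filters_per_layer):
        s = stride if i == 0 else 1  # stride applied only to the input layer
        model.add(layers.Conv2D(f, 3, strides=s, padding="same", activation="relu"))
        model.add(layers.MaxPooling2D(2))
    model.add(layers.Flatten())
    model.add(layers.Dense(num_classes, activation="softmax"))
    return model


# Count updatable (trainable) parameters for a few hypothetical filter configurations.
configs = [(8, 16), (16, 32), (32, 64), (64, 128)]
param_counts = [build_cnn(c).count_params() for c in configs]

# Placeholder seconds-per-epoch values; in practice these would come from timing
# real training runs of each configuration (total training time being
# time per epoch multiplied by the number of epochs).
seconds_per_epoch = np.array([4.1, 5.0, 7.3, 12.8])

# Ordinary least-squares fit: time per epoch as a function of parameter count.
X = np.array(param_counts, dtype=float).reshape(-1, 1)
fit = LinearRegression().fit(X, seconds_per_epoch)
print(f"seconds/epoch ~ {fit.coef_[0]:.2e} * params + {fit.intercept_:.2f}")
```

The same scaffold could be extended to time actual training runs and to separate “time per epoch” from “number of epochs,” a distinction the abstract draws; the dissertation itself should be consulted for the actual experimental designs and fitted models.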