Deep Maxout Networks for Classification Problems Across Multiple Domains
Date Issued: 2019

Abstract/Description:
Machine learning techniques such as deep neural networks have become an indispensable tool for a wide range of applications, including image classification, speech recognition, and sentiment analysis of text. An activation function is a mathematical function that determines the output of each neuron in a neural network. In deep learning architectures, the choice of activation function is central to the network's performance: it affects the model's output, its computational efficiency, and its ability to train and converge over many training epochs. Selecting an activation function is therefore critical to building and training an effective and efficient neural network. In real-world applications of deep neural networks, the activation function is treated as a hyperparameter. We observe a lack of consensus on how to select a good activation function for a deep neural network, and find that no single function is suitable for every domain-specific application.
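The title refers to maxout networks, in which each unit outputs the maximum over several learned affine transformations rather than applying a fixed nonlinearity. As a rough illustration of that idea only, here is a minimal NumPy sketch of a maxout layer; the layer sizes, number of pieces, initialization, and names (`MaxoutLayer`, `k`) are illustrative assumptions and are not taken from the dissertation.

```python
import numpy as np

class MaxoutLayer:
    """Minimal maxout layer: each output is the max over k learned affine pieces.

    Sizes, initialization, and naming are illustrative assumptions,
    not details taken from the dissertation.
    """

    def __init__(self, in_features, out_features, k=3, seed=None):
        rng = np.random.default_rng(seed)
        # One weight matrix and one bias vector per affine piece.
        self.W = rng.normal(0.0, 0.1, size=(k, in_features, out_features))
        self.b = np.zeros((k, out_features))

    def forward(self, x):
        # x: (batch, in_features) -> z: (k, batch, out_features)
        z = np.einsum("bi,kio->kbo", x, self.W) + self.b[:, None, :]
        # Maxout activation: elementwise maximum across the k affine pieces.
        return z.max(axis=0)


# Toy forward pass: batch of 4 examples with 8 input features.
x = np.random.default_rng(0).normal(size=(4, 8))
layer = MaxoutLayer(in_features=8, out_features=16, k=3, seed=1)
y = layer.forward(x)
print(y.shape)  # (4, 16)
```
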
| Field | Value |
|---|---|
| Title | Deep Maxout Networks for Classification Problems Across Multiple Domains |
| Name(s) | Castaneda, Gabriel (author); Khoshgoftaar, Taghi M. (thesis advisor); Florida Atlantic University (degree grantor); Department of Computer and Electrical Engineering and Computer Science, College of Engineering and Computer Science |
| Type of Resource | text |
| Genre | Electronic Thesis or Dissertation |
| Date Created | 2019 |
| Date Issued | 2019 |
| Publisher | Florida Atlantic University |
| Place of Publication | Boca Raton, Fla. |
| Physical Form | application/pdf |
| Extent | 233 p. |
| Language(s) | English |
| Identifier | FA00013362 (IID) |
| Degree Granted | Dissertation (Ph.D.)--Florida Atlantic University, 2019 |
| Collection | FAU Electronic Theses and Dissertations Collection |
| Note(s) | Includes bibliography. |
| Subject(s) | Classification; Machine learning--Technique; Neural networks (Computer science) |
| Held by | Florida Atlantic University Libraries |
| Sublocation | Digital Library |
| Persistent Link to This Record | http://purl.flvc.org/fau/fd/FA00013362 |
| Use and Reproduction | Copyright © is held by the author, with permission granted to Florida Atlantic University to digitize, archive, and distribute this item for non-profit research and educational purposes. Any reuse of this item in excess of fair use or other copyright exemptions requires permission of the copyright holder. |
| Use and Reproduction | http://rightsstatements.org/vocab/InC/1.0/ |
| Host Institution | FAU |
| Is Part of Series | Florida Atlantic University Digital Library Collections |