Maximum entropy-based optimization of artificial neural networks: An application to ATM telecommunication parameter predictions
Title: Maximum entropy-based optimization of artificial neural networks: An application to ATM telecommunication parameter predictions.
Name(s): Sundaram, Karthik; Florida Atlantic University (degree grantor); De Groff, Dolores F. (thesis advisor); Neelakanta, Perambur S. (thesis advisor); College of Engineering and Computer Science; Department of Computer and Electrical Engineering and Computer Science
Type of Resource: text
Genre: Electronic Thesis or Dissertation
Issuance: monographic
Date Issued: 1999
Publisher: Florida Atlantic University
Place of Publication: Boca Raton, Fla.
Physical Form: application/pdf
Extent: 160 p.
Language(s): English
Summary: This thesis addresses studies on cost-functions developed on the basis of the maximum entropy principle, for applications in artificial neural network (ANN) optimization endeavors. The maximization of entropy refers to maximizing the Shannon information pertinent to the difference between the output and the teacher value of an ANN. Apart from the Shannon format of the negative entropy formulation, a set of Csiszár-family functions is also considered. The error-measures obtained via these maximum entropy formulations are adopted as cost-functions in the training and prediction schedules of a test perceptron. A comparative study is done on the performance of these cost-functions in facilitating the test network towards optimization so as to predict a standard teacher function sin(·). The study is also extended to predict a parameter (such as cell delay variation) in a practical ATM telecommunication system. Concluding remarks and scope for an extended study are also indicated. (An illustrative training sketch, not drawn from the thesis itself, follows this record.)
Identifier: 9780599218833 (isbn), 15660 (digitool), FADT15660 (IID), fau:12732 (fedora)
Collection: FAU Electronic Theses and Dissertations Collection
Note(s): College of Engineering and Computer Science; Thesis (M.S.)--Florida Atlantic University, 1999.
Subject(s): Neural network (Computer science); Asynchronous transfer mode
Held by: Florida Atlantic University Libraries
Persistent Link to This Record: http://purl.flvc.org/fcla/dt/15660
Sublocation: Digital Library
Use and Reproduction: Copyright © is held by the author, with permission granted to Florida Atlantic University to digitize, archive and distribute this item for non-profit research and educational purposes. Any reuse of this item in excess of fair use or other copyright exemptions requires permission of the copyright holder.
Use and Reproduction: http://rightsstatements.org/vocab/InC/1.0/
Host Institution: FAU
Is Part of Series: Florida Atlantic University Digital Library Collections.
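
The summary above describes training a test perceptron with entropy-based cost-functions, using sin(·) as the teacher function, and comparing those costs against conventional error measures. The Python sketch below is a minimal, hypothetical illustration of that general idea only: the network size, learning rate, data range, and the particular entropy-style cost (a cross-entropy between the network output and a rescaled teacher) are all assumptions for demonstration and are not the thesis's Shannon or Csiszár formulations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Teacher function sin(.) on one period, rescaled into (0, 1) so a
# sigmoid output and a cross-entropy style cost are well defined.
# (The rescaling and every size below are illustrative assumptions.)
X = np.linspace(0.0, 2.0 * np.pi, 200).reshape(-1, 1)
T = 0.5 * (np.sin(X) + 1.0) * 0.9 + 0.05          # teacher values in (0.05, 0.95)

H = 16                                            # hidden units (assumed)
W1 = rng.normal(0.0, 0.5, (1, H))
b1 = np.zeros(H)
w2 = rng.normal(0.0, 0.5, (H, 1))
b2 = np.zeros(1)

def forward(X):
    A1 = np.tanh(X @ W1 + b1)                     # hidden-layer activations
    Y = 1.0 / (1.0 + np.exp(-(A1 @ w2 + b2)))     # sigmoid output in (0, 1)
    return A1, Y

lr = 0.05
for epoch in range(20001):
    A1, Y = forward(X)
    # Entropy-style cost: mean cross-entropy between output and teacher.
    # This stands in for the thesis's maximum-entropy error measures;
    # the Shannon/Csiszar formulations themselves are not reproduced here.
    loss = -np.mean(T * np.log(Y) + (1.0 - T) * np.log(1.0 - Y))
    dZ2 = (Y - T) / len(X)                        # dLoss/d(pre-sigmoid output)
    gw2 = A1.T @ dZ2
    gb2 = dZ2.sum(axis=0)
    dZ1 = (dZ2 @ w2.T) * (1.0 - A1 ** 2)          # backprop through tanh
    gW1 = X.T @ dZ1
    gb1 = dZ1.sum(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1                # plain gradient descent
    w2 -= lr * gw2; b2 -= lr * gb2
    if epoch % 5000 == 0:
        print(f"epoch {epoch:5d}  cost {loss:.4f}")

_, Y = forward(X)
print("max abs prediction error:", np.abs(Y - T).max())
```

To mimic, in spirit, the comparative study the summary mentions, the same loop can be rerun with the conventional squared-error baseline by replacing the two cost lines with `loss = np.mean(0.5 * (Y - T) ** 2)` and `dZ2 = (Y - T) * Y * (1.0 - Y) / len(X)` and comparing convergence of the printed cost; again, this is only an assumed stand-in for the thesis's experiments.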