Information-theoretics based genetic algorithm: Application to Hopfield's associative memory model of neural networks
Title: | Information-theoretics based genetic algorithm: Application to Hopfield's associative memory model of neural networks. |
---|---|---|
Name(s): | Arredondo, Tomas Vidal; Florida Atlantic University (degree grantor); Neelakanta, Perambur S. (thesis advisor) | |
Type of Resource: | text | |
Genre: | Electronic Thesis Or Dissertation | |
Issuance: | monographic | |
Date Issued: | 1997 | |
Publisher: | Florida Atlantic University | |
Place of Publication: | Boca Raton, Fla. | |
Physical Form: | application/pdf | |
Extent: | 145 p. | |
Language(s): | English | |
Summary: | This thesis addresses the use of information-theoretic techniques in optimizing an artificial neural network (ANN) via a genetic selection algorithm. The study emulates relevant experiments on a test ANN, based on Hopfield's associative memory model, in which the optimization is attempted with different sets of control parameters. These parameters include a new entity based on the concept of entropy as conceived in information theory: the mutual entropy (Shannon entropy), or the information-distance (Kullback-Leibler-Jensen distance) measure, between a pair of candidates is evaluated during the reproduction process of the genetic algorithm (GA) and adopted as a selection-constraint parameter. The research further includes a comparative analysis of the test results, which indicates the importance of proper parameter selection in realizing optimal network performance and demonstrates the ability of the proposed concepts to support a new neural-network approach to pattern-recognition problems. (An illustrative sketch of such an entropy-constrained selection step follows the record table below.) | |
Identifier: | 9780591333909 (isbn), 15397 (digitool), FADT15397 (IID), fau:12164 (fedora) | |
Collection: | FAU Electronic Theses and Dissertations Collection | |
Note(s): | College of Engineering and Computer Science; Thesis (M.S.)--Florida Atlantic University, 1997. | |
Subject(s): | Neural networks (Computer science); Genetic algorithms | |
Held by: | Florida Atlantic University Libraries | |
Persistent Link to This Record: | http://purl.flvc.org/fcla/dt/15397 | |
Sublocation: | Digital Library | |
Use and Reproduction: | Copyright © is held by the author, with permission granted to Florida Atlantic University to digitize, archive and distribute this item for non-profit research and educational purposes. Any reuse of this item in excess of fair use or other copyright exemptions requires permission of the copyright holder. | |
Use and Reproduction: | http://rightsstatements.org/vocab/InC/1.0/ | |
Host Institution: | FAU | |
Is Part of Series: | Florida Atlantic University Digital Library Collections. |
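
The summary above describes a genetic algorithm whose reproduction step is constrained by an information distance (mutual entropy or Kullback-Leibler-Jensen distance) between candidate pairs. The sketch below is a minimal, hypothetical illustration of that idea on a toy Hopfield associative memory, written in Python with NumPy. The population size, the 2-bit block probability model used to estimate each candidate's distribution, the constraint threshold `J_THRESHOLD`, and the recall-overlap fitness are illustrative assumptions and are not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(42)

N_BITS = 64          # neurons per pattern (illustrative size)
POP_SIZE = 40
GENERATIONS = 60
MUTATION_RATE = 0.02
J_THRESHOLD = 0.05   # hypothetical selection-constraint value, not from the thesis


def hopfield_weights(patterns):
    """Hebbian outer-product weights for bipolar (+1/-1) patterns, zero diagonal."""
    w = patterns.T @ patterns / patterns.shape[1]
    np.fill_diagonal(w, 0.0)
    return w


def recall(w, probe, steps=5):
    """Synchronous Hopfield updates starting from a probe pattern."""
    s = probe.copy()
    for _ in range(steps):
        s = np.where(w @ s >= 0.0, 1, -1)
    return s


def fitness(candidate, w, target):
    """Fraction of bits recalled correctly when the candidate is used as a probe."""
    return float(np.mean(recall(w, candidate) == target))


def block_distribution(bits, eps=1e-6):
    """Empirical distribution of non-overlapping 2-bit blocks (a crude probability model)."""
    b = (bits > 0).astype(int).reshape(-1, 2)
    idx = b[:, 0] * 2 + b[:, 1]
    p = np.bincount(idx, minlength=4).astype(float) + eps
    return p / p.sum()


def j_divergence(x, y):
    """Symmetric Kullback-Leibler (J) divergence between two candidates' block distributions."""
    p, q = block_distribution(x), block_distribution(y)
    return float(np.sum((p - q) * np.log(p / q)))


def crossover(a, b):
    cut = rng.integers(1, a.size)
    return np.concatenate([a[:cut], b[cut:]])


def mutate(x):
    flips = rng.random(x.size) < MUTATION_RATE
    return np.where(flips, -x, x)


# One target memory stored in the Hopfield net (illustrative data).
target = rng.choice([-1, 1], size=N_BITS)
w = hopfield_weights(target[None, :])

# Initial population of random bipolar candidate patterns.
population = rng.choice([-1, 1], size=(POP_SIZE, N_BITS))

for gen in range(GENERATIONS):
    scores = np.array([fitness(c, w, target) for c in population])
    parents = population[np.argsort(scores)[::-1][: POP_SIZE // 2]]

    children, attempts = [], 0
    while len(children) < POP_SIZE:
        a, b = parents[rng.integers(len(parents), size=2)]
        attempts += 1
        # Entropy-based selection constraint: prefer "information-distant" parent pairs;
        # fall back to unconstrained mating if too few pairs satisfy the constraint.
        if j_divergence(a, b) < J_THRESHOLD and attempts < 10 * POP_SIZE:
            continue
        children.append(mutate(crossover(a, b)))
    population = np.array(children)

best = population[np.argmax([fitness(c, w, target) for c in population])]
print("best recall overlap:", fitness(best, w, target))
```

In this toy setup the constraint simply discourages mating between nearly identical candidates, which is one plausible reading of using an information distance as a selection-constraint parameter; the thesis's actual formulation of the measure and of the network optimization may differ.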