Date of Award
1989
Document Type
Dissertation
Degree Name
Doctor of Philosophy (PhD)
Department
Electrical and Computer Engineering
First Advisor
Subhash C. Kak
Abstract
Over the past several years, a number of papers have been published on the capacity of Hopfield neural networks. It has been shown that, under the Hebbian rule, the capacity of the Hopfield model is approximately N/(4 log N). The number of patterns that can be stored in a neural network, however, can be greatly increased by using learning algorithms other than the Hebbian rule, such as the delta rule. The motivation behind this dissertation is to study, both analytically and experimentally, the information capacity of various neural network models using a modified version of the delta-rule algorithm. This modified algorithm allows one to store significantly more patterns than previously thought possible. Analytical and experimental results are presented on the number of patterns one can expect to store with this algorithm. The analytical results suggest that the probability of separating m patterns of N bits each is about 50% for m = 2N; experimental results show that the probability of storing m patterns of N bits each is about 50% for m = 1.5N. Modifications of the Hopfield model, including a non-binary model, a shift-invariant model, and models that use higher-order terms, are also discussed. Learning rules for these models are presented, along with a discussion of their capacity. Finally, the trade-off between capacity and performance of neural networks is discussed, together with a further modification of the delta rule that leads to a significant improvement in performance.
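For context on the m = 2N threshold quoted in the abstract, here is a minimal sketch of the classical function-counting argument (Cover, 1965) for random patterns in general position. This is assumed background rather than the dissertation's own derivation, which may take a different route. The number of linearly separable dichotomies of m points in general position in N dimensions, and the probability that a random dichotomy is separable, are

\[
C(m,N) \;=\; 2\sum_{k=0}^{N-1}\binom{m-1}{k},
\qquad
P_{\mathrm{sep}}(m,N) \;=\; \frac{C(m,N)}{2^{m}} .
\]

At \(m = 2N\) the sum captures exactly half of the total mass \(2^{2N-1}\), i.e. \(2^{2N-2}\), so \(P_{\mathrm{sep}}(2N,N) = 2^{2N-1}/2^{2N} = 1/2\), matching the 50% figure cited in the abstract.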
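As a concrete illustration of the two learning rules the abstract contrasts, the following Python sketch stores bipolar patterns in a Hopfield-style network, first with the outer-product (Hebbian) rule and then with an iterative delta rule that adjusts weights until every stored pattern is a fixed point. This is a hypothetical reconstruction for illustration only, not the dissertation's exact algorithm; the function names, the learning rate eta, and the epoch budget are all invented for this example.

    # Illustrative sketch only; not the dissertation's algorithm.
    import numpy as np

    def hebbian_weights(patterns):
        # Outer-product (Hebbian) rule; capacity roughly N / (4 log N).
        n = patterns.shape[1]
        w = patterns.T @ patterns / n
        np.fill_diagonal(w, 0.0)             # no self-connections
        return w

    def delta_rule_weights(patterns, eta=0.1, epochs=200):
        # Iterative delta rule: nudge the weights until every stored
        # pattern is a fixed point of the network, or the budget runs out.
        m, n = patterns.shape
        w = np.zeros((n, n))
        for _ in range(epochs):
            stable = True
            for x in patterns:               # x is a bipolar (+1/-1) vector
                y = np.sign(w @ x)
                y[y == 0] = 1                # break ties toward +1
                err = x - y                  # zero wherever a bit is already right
                if np.any(err):
                    stable = False
                    w += eta * np.outer(err, x)   # delta-rule weight update
                    np.fill_diagonal(w, 0.0)
            if stable:                       # every pattern is a fixed point
                break
        return w

    rng = np.random.default_rng(0)
    N, m = 32, 40                            # m > N: already past the Hebbian regime
    pats = rng.choice([-1, 1], size=(m, N))
    w = delta_rule_weights(pats)
    recalled = np.sign(w @ pats.T).T
    recalled[recalled == 0] = 1
    print("fraction stored as fixed points:",
          np.mean(np.all(recalled == pats, axis=1)))

Swapping in hebbian_weights at the same load m = 40 leaves far fewer patterns stored as fixed points, which is the gap between the two rules that the abstract quantifies.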
Recommended Citation
Prados, Donald Louis, "The Capacity of Artificial Neural Networks Using the Delta Rule." (1989). LSU Historical Dissertations and Theses. 4869.
https://repository.lsu.edu/gradschool_disstheses/4869
Pages
87
DOI
10.31390/gradschool_disstheses.4869