Open Collections
UBC Theses and Dissertations
The development of VLSI implementation-related neural network training algorithms
Li, Gang
Abstract
Artificial neural networks are systems of interconnected simple computing units, known as artificial neurons, that simulate some properties of their biological counterparts. They have been developed and studied both to understand how brains function and for computational purposes. Two architectures of neural network models (NNMs) are the most popular: recurrent and feed-forward. The recurrent model (the Hopfield network) is one of the simplest NNMs and is specially designed as a content-addressable memory (CAM). Feed-forward models include the perceptron and multilayer perceptrons, which have proved useful in many applications.

This thesis has two parts. In Part 1, recurrent models are investigated and a novel digital CMOS VLSI implementation scheme is proposed. Synaptic-matrix construction rules (training rules) for the proposed model were simulated on Sun workstations. Three widely accepted training rules are simulated and compared: Hebb's rule, the projection rule, and the simplex method. A coding scheme named "dilution" is applied to all three training rules; both the dilution-coded Hebb's rule and the dilution-coded projection rule were verified to exhibit good performance. In Part 2, feed-forward models are introduced, and variations of the backpropagation (BP) algorithm are developed with hardware implementation in mind. The proposed Delta Pre-Training (DPT) method can speed up BP training and reduce the probability of becoming trapped in a local minimum, without increasing the complexity of the VLSI design.
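Of the three training rules compared in Part 1, Hebb's rule is the standard textbook one: the synaptic matrix is the sum of outer products of the stored bipolar patterns, with self-connections zeroed. The sketch below illustrates that rule and CAM-style recall in the usual way; it is a minimal illustration of the classical algorithm, not the thesis's proposed VLSI scheme or its dilution coding.

```python
import numpy as np

def hebb_weights(patterns):
    """Hebb's rule: W is the sum of outer products of the stored
    bipolar (+1/-1) patterns, scaled by n, with a zeroed diagonal."""
    p, n = patterns.shape
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)  # no self-connections
    return W

def recall(W, probe, steps=10):
    """Content-addressable recall: iterate s <- sign(W s) until stable."""
    s = probe.copy()
    for _ in range(steps):
        nxt = np.where(W @ s >= 0, 1, -1)
        if np.array_equal(nxt, s):
            break
        s = nxt
    return s

# Store two orthogonal 8-bit patterns, then recover the first
# from a probe with one corrupted bit.
pats = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                 [1, 1, 1, 1, -1, -1, -1, -1]])
W = hebb_weights(pats)
noisy = pats[0].copy()
noisy[0] = -noisy[0]  # flip one bit
print(np.array_equal(recall(W, noisy), pats[0]))  # True
```

The projection rule, the simplex method, and the dilution coding studied in the thesis modify how W is constructed, not this recall dynamic.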
Item Metadata
Title | The development of VLSI implementation-related neural network training algorithms
Creator | Li, Gang
Publisher | University of British Columbia
Date Issued | 1994
Extent | 4283047 bytes
Genre |
Type |
File Format | application/pdf
Language | eng
Date Available | 2009-03-03
Provider | Vancouver : University of British Columbia Library
Rights | For non-commercial purposes only, such as research, private study and education. Additional conditions apply, see Terms of Use https://open.library.ubc.ca/terms_of_use.
DOI | 10.14288/1.0087514
URI |
Degree |
Program |
Affiliation |
Degree Grantor | University of British Columbia
Graduation Date | 1994-11
Campus |
Scholarly Level | Graduate
Aggregated Source Repository | DSpace