
Unconstrained Optimization
and Neural Networks

Lectured by Ha Hoang Kha, Ph.D.
Ho Chi Minh City University of Technology
Faculty of Electrical and Electronics Engineering
Department of Telecommunications
Email: hahoangkha@gmail.com
Content
Introduction
Single neuron training
Backpropagation algorithm
Character recognition
Neural Networks 2 H. H. Kha
References
E. K. P. Chong and S. H. Zak, An Introduction to
Optimization, John Wiley & Sons, 2001
1. Introduction
Neural networks have found numerous practical
applications, such as telephone echo cancellation and
EEG data interpretation.
The essence of neural networks lies in the connection
weights between neurons. The selection of these
weights is referred to as training or learning.
A popular method for training a neural network is the
backpropagation algorithm, which formulates training
as an unconstrained optimization problem and applies
a gradient algorithm to it.


1. Introduction
An artificial neural network is a circuit composed of
interconnected simple circuit elements called neurons.
Each neuron represents a map, typically with multiple
inputs and a single output.
The output of the neuron is a function of the sum of the
inputs.


1. Introduction
The function producing the output of the neuron is
called the activation function.
The single output of a neuron may be applied as an
input to several other neurons.
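As a minimal sketch of this description (the function names and weight values below are illustrative, not from the lecture), a single neuron computes a weighted sum of its inputs and passes it through an activation function, here the sigmoid:

```python
import math

def sigmoid(v):
    """Sigmoid activation: maps the weighted sum into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-v))

def neuron(inputs, weights, bias=0.0):
    """A single neuron: activation applied to the weighted sum of the inputs."""
    v = sum(w * x for w, x in zip(weights, inputs)) + bias
    return sigmoid(v)

# Three inputs, one output; the weighted sum here is 0.5*1.0 - 0.3*0.0 + 0.8*1.0 = 1.3.
y = neuron([1.0, 0.0, 1.0], [0.5, -0.3, 0.8])
```

The single output value y could then be fed as an input to several other neurons.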

1. Introduction
Feedforward neural network: neurons are
interconnected in layers, so that data flow in only
one direction.
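The layered, one-directional flow can be sketched by applying one layer at a time (the layer sizes and weight values below are illustrative assumptions):

```python
import math

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

def layer(inputs, weight_matrix):
    """One layer: each row of the weight matrix drives one neuron."""
    return [sigmoid(sum(w * x for w, x in zip(row, inputs)))
            for row in weight_matrix]

def feedforward(x, layers):
    """Propagate x through the layers in order; no feedback connections."""
    for W in layers:
        x = layer(x, W)
    return x

# Two inputs -> two hidden neurons -> one output neuron.
W_hidden = [[0.5, -0.2], [0.3, 0.8]]
W_output = [[1.0, -1.0]]
y = feedforward([1.0, 0.5], [W_hidden, W_output])
```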

2. Single-Neuron Training
Consider a single neuron
A gradient method
2. Single-Neuron Training: Adaline

Adaptive linear element
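The Adaline (adaptive linear element) is commonly trained with the Widrow-Hoff (LMS) rule, a fixed-step gradient method on the squared error of the linear output. A minimal sketch, assuming an illustrative step size and training set (neither is from the lecture):

```python
def train_adaline(samples, eta=0.1, epochs=200):
    """LMS rule: w <- w + eta * (d - w.x) * x for each training pair (x, d)."""
    n = len(samples[0][0])
    w = [0.0] * n                       # start from the zero weight vector
    for _ in range(epochs):
        for x, d in samples:
            y = sum(wi * xi for wi, xi in zip(w, x))   # linear (Adaline) output
            e = d - y                                   # output error
            w = [wi + eta * e * xi for wi, xi in zip(w, x)]
    return w

# Consistent linear data generated by w* = (2, -1), i.e. d = 2*x1 - x2.
samples = [([1.0, 0.0], 2.0), ([0.0, 1.0], -1.0), ([1.0, 1.0], 1.0)]
w = train_adaline(samples)
```

For this consistent data and a small enough step size, the weights converge to the generating vector (2, -1).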
3. Backpropagation Algorithm

To solve the resulting unconstrained optimization
problem, we use a gradient algorithm with a fixed step size.
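A minimal sketch of this fixed-step gradient (backpropagation) update for a two-layer network with sigmoid units; the hidden-layer size, step size, random initialization, and XOR training data are all illustrative assumptions, not taken from the lecture:

```python
import math
import random

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

def forward(x, W1, W2):
    """Forward pass; returns hidden activations and the scalar output."""
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in W1]
    y = sigmoid(sum(w * hi for w, hi in zip(W2, h)))
    return h, y

def train(data, n_hidden=3, eta=0.5, epochs=2000, seed=0):
    """Fixed-step gradient descent on the squared output error."""
    rng = random.Random(seed)
    n_in = len(data[0][0])
    W1 = [[rng.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_hidden)]
    W2 = [rng.uniform(-1, 1) for _ in range(n_hidden)]
    for _ in range(epochs):
        for x, d in data:
            h, y = forward(x, W1, W2)
            delta_out = (y - d) * y * (1 - y)   # output-layer error term
            # Hidden-layer error terms, propagated back through the old W2.
            delta_h = [delta_out * W2[j] * h[j] * (1 - h[j])
                       for j in range(n_hidden)]
            W2 = [W2[j] - eta * delta_out * h[j] for j in range(n_hidden)]
            W1 = [[W1[j][i] - eta * delta_h[j] * x[i] for i in range(n_in)]
                  for j in range(n_hidden)]
    return W1, W2

xor = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
W1, W2 = train(xor)
```

After training, the total squared error over the training set is smaller than at the random initialization; how small it gets depends on the step size and the starting point.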

4. Applications: pattern recognition
A pattern is an object, process or event that can be
given a name.
A pattern class (or category) is a set of patterns
sharing common attributes and usually originating
from the same source.
During recognition (or classification), given objects
are assigned to prescribed classes.
A classifier is a machine which performs
classification.
Examples of applications
Optical Character Recognition (OCR): handwritten text:
sorting letters by postal code, input device for PDAs;
printed texts: reading machines for blind people,
digitalization of text documents.
Biometrics: face recognition, verification, retrieval;
fingerprint recognition; speech recognition.
Diagnostic systems: medical diagnosis: X-ray, EKG
analysis; machine diagnostics, waste detection.
Military applications: Automated Target Recognition
(ATR); image segmentation and analysis (recognition
from aerial or satellite photographs).
Basic concepts
Pattern: a feature vector x = (x1, x2, ..., xn)^T
- A vector of observations (measurements).
- x ∈ X is a point in the feature space X.
Hidden state y ∈ Y
- Cannot be directly measured.
- Patterns with equal hidden state belong to the same
class.
Task
- To design a classifier (decision rule) q: X → Y
which decides about a hidden state based on an observation.
Components of a PR system
Pattern → Sensors and preprocessing → Feature extraction → Classifier → Class assignment
Sensors and preprocessing.
Feature extraction aims to create discriminative features good for
classification.
A classifier.
A teacher provides information about the hidden state -- supervised learning.
A learning algorithm sets up the PR system from training examples.
Character recognition
Recognition of both printed and handwritten characters
is a typical domain where neural networks have been
successfully applied.
Optical character recognition systems were among the
first commercial applications of neural networks.
For simplicity, we can limit our task to the recognition
of digits from 0 to 9. Each digit is represented by a 5×9
bit map.
In commercial applications, where better resolution is
required, at least 16×16 bit maps are used.


Bit maps for digit recognition

[Figure: each digit is drawn on a 5×9 grid whose 45 pixels
are numbered 1 to 45, row by row.]
Architecture of a neural network


The number of neurons in the input layer is determined by
the number of pixels in the bit map. The bit map in our
example consists of 45 pixels, so we need 45 input
neurons.
The output layer has 10 neurons, one for each digit to be
recognised.
Complex patterns cannot be detected by a small number of
hidden neurons; however, too many of them can
dramatically increase the computational burden.
Another problem is overfitting. The greater the number of
hidden neurons, the greater the ability of the network to
recognise existing patterns. However, if the number of
hidden neurons is too big, the network might simply
memorise all training examples.
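Under the dimensions described above (45 input pixels, 10 output digits), the network shape can be sketched as follows; the hidden-layer size of 5 and the weight-initialization range are illustrative assumptions, not values from the lecture:

```python
import random

def make_network(n_in=45, n_hidden=5, n_out=10, seed=0):
    """Random initial weights for a digit-recognition network.
    Input and output sizes follow the text; the hidden size is an
    illustrative choice balancing pattern capacity against overfitting."""
    rng = random.Random(seed)
    W1 = [[rng.uniform(-0.5, 0.5) for _ in range(n_in)]
          for _ in range(n_hidden)]            # input -> hidden weights
    W2 = [[rng.uniform(-0.5, 0.5) for _ in range(n_hidden)]
          for _ in range(n_out)]               # hidden -> output weights
    return W1, W2

# One input neuron per pixel, one output neuron per digit.
W1, W2 = make_network()
```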


Architecture of a neural network

[Figure: the 45 pixel values of the bit map feed the 45 input
neurons, which connect through a hidden layer to 10 output
neurons, one per digit; for a presented digit, the corresponding
output neuron outputs 1 and the others output 0.]