- Feature extraction
- Clustering
- Separation
- Classification/Recognition
- Prediction
- Motor control
- and more
Introduction
Biological Neuron Models
Soo-Young Lee
Department of Electrical Engineering /
Brain Science Research Center
KAIST
EE538 Neural Networks
Fall 2015
Related Topics
- Learning-based Systems
- Neural Networks (Connectionist Model)
- Probabilistic Learning
- Machine Learning
- Language Understanding
- Knowledge Development
- Big Data / Data Mining
- Term-Project
Course Emphasis
- Artificial Intelligence
- Rule-based Systems
- Pattern/Image Recognition
- Speech Recognition
Approaches
- Lecture
- Homework (20%) and Quiz (10%)
- Mid-Term Exam (20%)
- Course Participation (10%)
- Term Project (Proposal 15%, Report 35%)

Contents
- Background
- Course Overview
- Biological Neural Networks
- Artificial Neural Networks
International CES
- 3D
- Bigger
- More pixels
- Faster
- Smaller
- Distinct color
Historical Sketch
- Dangun and Pygmalion
- Pre-1940: von Helmholtz, Mach, Pavlov, etc.
  - General theories of learning, vision, conditioning
  - No specific mathematical models of neuron operation
Historical Sketch
[Diagram: sensors feed feature extraction, followed by pattern recognition and inference; application examples are vision (selective attention, recognition/tracking) and speech recognition.]
[Diagram: a brain-like system combining sensory modalities (vision, auditory, smell/olfactory, touch) through sensor fusion, with selective attention, recognition/understanding, cognition & inference, learning/memory/language, perception/planning, body control, and individual/group behavior.]
- Classification/Recognition
- Prediction
- Audio/Video Top-Down Attention
Frequency Domain
[Audio examples (.wav) illustrating: recording environment, time onset/offset, multifrequency tone (timbre), frequency modulation, and mel-frequency analysis over time frames.]
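The mel-frequency axis mentioned above warps linear frequency to match auditory resolution. A minimal sketch of that warping (the 2595 and 700 constants are the standard O'Shaughnessy formula, not taken from the slide):

```python
import numpy as np

# Hz <-> mel conversion (O'Shaughnessy formula)
def hz_to_mel(f):
    return 2595.0 * np.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

# Center frequencies of 20 mel bands between 0 and 8 kHz:
# equally spaced on the mel axis, increasingly spread out in Hz.
centers = mel_to_hz(np.linspace(hz_to_mel(0.0), hz_to_mel(8000.0), 22))
```

Because the mel scale is logarithmic in frequency, the band centers cluster at low frequencies and spread out at high frequencies, mimicking cochlear resolution.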
Top-Down Attention
Classification/Recognition
- Speech Recognition (mel-frequency features)
- Image Recognition
Prediction
Motor Control
Contents
- Background
- Course Overview
- Biological Neural Networks
- Artificial Neural Networks

We need to introduce:
- Neuron models
- Network architectures
- Learning algorithms
- Mathematical models
Top of the World
- Neural Networks
- Biologically-Oriented Electronic Systems
- Neuroinformatic DB, BioMedical Imaging
- Cognitive Science / Neuroscience
- Bionic Life
- IT (EECS)
BiS554
Research Modality
- Data Measurement
- Mathematical Modeling
- Applications / Implementation

Research Scope (levels of study)
- Behavior
- System
- Circuits
- Neuron
- Membrane
- Protein
- Gene
- Molecules

[Diagram: each level is studied with matching tools, from Measurement Technology and Neuroscience Data through Mathematical Models and Analysis Software up to Neural Networks, Systems, and Neuromorphic Chips.]
Research Tools
Contents
- Background
- Course Overview
- Biological Neural Networks
- Artificial Neural Networks

Mathematics
- Optimization
- Linear algebra
- Differential equations
- Statistics / Information theory

Computer Programming

Hardware Devices
- Silicon-based VLSI
- MEMS
Cortical Parameters
Human Brain
- about 10^10 to 10^11 neurons
- about 10^3 to 10^4 synapses per neuron
- about 10^14 synapses in total
- about 1.5 kg (2% of body weight)
- about 1-10 msec time scale
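A quick consistency check of these numbers (taking the 10^10 neuron estimate and the 10^4 synapses-per-neuron estimate):

```python
# ~10^10 neurons, each with ~10^4 synapses, gives ~10^14 synapses,
# matching the total quoted on the slide.
neurons = 10**10             # lower end of the neuron-count estimate
synapses_per_neuron = 10**4  # upper end of the per-neuron estimate
total_synapses = neurons * synapses_per_neuron
print(total_synapses == 10**14)
```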
Neuron and Synapses
Action Potential
Contents
- Background
- Course Overview
- Biological Neural Networks
- Artificial Neural Networks
Neuron model
- Integrate-and-fire
- Saturating non-linearity
- Rectified Linear Unit
- Knowledge on neuroscience

Network architecture
- Feedforward (Layered, Convolutive, Max-Pooling)
- Recurrent

Learning algorithm
- Unsupervised learning
- Supervised learning
- Reinforcement learning

Evaluation
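The two non-spiking neuron nonlinearities listed above (saturating non-linearity and Rectified Linear Unit) can be written in a few lines; a minimal illustration, not tied to any particular framework:

```python
import numpy as np

def relu(v):
    # Rectified Linear Unit: passes positive input, clips negative to 0
    return np.maximum(0.0, v)

def sigmoid(v, a=1.0):
    # Saturating non-linearity (logistic), slope parameter a
    return 1.0 / (1.0 + np.exp(-a * v))

v = np.array([-2.0, 0.0, 2.0])
print(relu(v))      # clipped at zero for negative inputs
print(sigmoid(v))   # saturates toward 0 and 1
```

ReLU is unbounded above and cheap to compute; the sigmoid squashes any input into (0, 1), which is what "saturating" refers to.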
Action Potential
Basic BioPhysics
Hodgkin-Huxley model
- Diffusion flux: $J_{\rm diffusion} = -D\,\dfrac{dn}{dx}$
- Drift flux: $J_{\rm drift} = -\mu Z n\,\dfrac{dV}{dx}$
- Einstein relationship: $D = \dfrac{kT}{q}\,\mu$
- Equilibrium: $J = J_{\rm diffusion} + J_{\rm drift} = 0$, which gives
  $\dfrac{kT}{q}\dfrac{dn}{dx} + Z n\,\dfrac{dV}{dx} = 0$
Ion Parameters
Nernst Potential
At equilibrium, $dV = -\dfrac{kT}{qZ}\dfrac{dn}{n}$, so integrating across the membrane:
$E = V_i - V_o = \int_{V_o}^{V_i} dV = -\dfrac{kT}{qZ}\int_{n_o}^{n_i}\dfrac{dn}{n} = \dfrac{kT}{qZ}\ln\dfrac{n_o}{n_i}$
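Plugging typical numbers into the Nernst potential makes the result concrete. A sketch assuming textbook mammalian K+ concentrations (5 mM outside, 140 mM inside) and body temperature; these values are not given on the slide:

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K
q = 1.602177e-19   # elementary charge, C
T = 310.0          # body temperature, K

def nernst(n_out, n_in, Z=1):
    # E = (kT/qZ) * ln(n_o / n_i), in volts
    return (k * T / (q * Z)) * math.log(n_out / n_in)

# Typical K+ concentrations (assumed): [K]o = 5 mM, [K]i = 140 mM
E_K = nernst(5.0, 140.0)
print(1000.0 * E_K)   # roughly -89 mV, near the K+ reversal potential
```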
In steady state the total current density is
$J = -qZD\left(\dfrac{dn}{dx} + \dfrac{qZ}{kT}\,n\,\dfrac{dV}{dx}\right)$
Under the constant-field assumption $\dfrac{dV}{dx} = \dfrac{V_m}{d}$ (membrane thickness $d$, $V_m = V_i - V_o$) and with permeability $P = \dfrac{D}{d}$, solving for $n(x)$ between the boundary concentrations $n_i$ and $n_o$ gives
$J = \dfrac{(qZ)^2 P V_m}{kT}\cdot\dfrac{n_i \exp(qZV_m/kT) - n_o}{\exp(qZV_m/kT) - 1}$
For $K^+$ ($Z = +1$):
$J_K = \dfrac{q^2 P_K V_m}{kT}\cdot\dfrac{[K]_i \exp(qV_m/kT) - [K]_o}{\exp(qV_m/kT) - 1}$
and similarly $J_{Cl}$ with $Z = -1$. Note that $J = 0$ exactly at the Nernst potential $V_m = \dfrac{kT}{qZ}\ln\dfrac{n_o}{n_i}$, consistent with the equilibrium result above.
Hodgkin-Huxley Experiment
Resting potential (Goldman equation with $K^+$ and $Cl^-$):
$V_m = \dfrac{kT}{q}\,\ln\dfrac{P_K [K]_o + P_{Cl} [Cl]_i}{P_K [K]_i + P_{Cl} [Cl]_o}$
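The resting-potential formula can be evaluated numerically. A sketch assuming typical squid-axon textbook values for the concentrations and the permeability ratio; none of these numbers appear on the slide:

```python
import math

kT_q = 0.0253             # kT/q at ~293 K, in volts
P_K, P_Cl = 1.0, 0.45     # relative permeabilities (assumed ratio)
K_o, K_i = 20.0, 400.0    # K+ concentrations, mM (assumed)
Cl_o, Cl_i = 560.0, 52.0  # Cl- concentrations, mM (assumed)

# Goldman equation with K+ and Cl- only, as on the slide
Vm = kT_q * math.log((P_K * K_o + P_Cl * Cl_i) /
                     (P_K * K_i + P_Cl * Cl_o))
print(1000.0 * Vm)   # a resting potential of roughly -70 mV
```

Because $P_K$ dominates, the result sits close to the K+ Nernst potential but is pulled up slightly by the Cl- terms.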
Voltage-clamping
Integrate-and-Fire Neuron
- Leaky integrator: $\tau_m \dfrac{du(t)}{dt} = -u(t) + R\,I(t)$
- Input current: $I(t) = \sum_j w_j \sum_i \delta(t - t_j^{(i)})$
- Firing threshold $\vartheta$: a spike is emitted when $u(t) = \vartheta$
- Reset voltage after each spike
- Refractory period

Response of IF Neurons
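The leaky integrate-and-fire dynamics above can be simulated with simple Euler integration; all parameter values below are illustrative assumptions:

```python
import numpy as np

tau_m, R = 10.0, 1.0       # membrane time constant (ms), resistance
theta, u_reset = 1.0, 0.0  # firing threshold and reset voltage
dt, T = 0.1, 200.0         # Euler step and duration (ms)
I = 1.5                    # constant suprathreshold input (R*I > theta)

u, spikes = u_reset, []
for step in range(int(T / dt)):
    # Euler step of tau_m du/dt = -u + R*I
    u += dt / tau_m * (-u + R * I)
    if u >= theta:              # threshold crossed: fire and reset
        spikes.append(step * dt)
        u = u_reset

print(len(spikes))   # regular spiking at a constant interval
```

With constant input and no noise the neuron fires perfectly periodically, which is what the analytic interspike-interval formula on the next slide describes.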
For constant input with $RI > \vartheta$ (and reset to $u_{\rm rest}$), the membrane reaches threshold again after a fixed interval:
$t^{(f+1)} = t^{(f)} + t_{\rm ref} + \tau_m \ln\dfrac{RI}{u_{\rm rest} + RI - \vartheta}$
so the mean firing rate is
$\nu = \left[\,t_{\rm ref} + \tau_m \ln\dfrac{RI}{u_{\rm rest} + RI - \vartheta}\,\right]^{-1}$

Noise sources
- Stochastic threshold
- Random reset
- Noisy integration (noisy input)
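The interspike-interval expression (as reconstructed here from the leaky-integrator solution) can be evaluated directly; the parameter values are illustrative assumptions:

```python
import math

tau_m, t_ref = 10.0, 2.0   # membrane time constant, refractory period (ms)
theta, u_rest = 1.0, 0.0   # firing threshold and resting/reset voltage

def interval(RI):
    # T = t_ref + tau_m * ln( RI / (u_rest + RI - theta) )
    return t_ref + tau_m * math.log(RI / (u_rest + RI - theta))

for RI in (1.2, 1.5, 2.0):
    # interval in ms, mean rate in Hz
    print(RI, interval(RI), 1000.0 / interval(RI))
```

Stronger input shortens the interval (the log term shrinks), so the firing rate grows with $RI$, saturating toward $1/t_{\rm ref}$.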
Neural Coding
$v_k = \sum_{j=1}^{m} w_{kj}\, x_j + b_k, \qquad y_k = \varphi(v_k)$
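The weighted-sum-plus-activation neuron above takes only a few lines; the tanh activation here is just one possible choice of $\varphi$:

```python
import numpy as np

def neuron(x, w, b, phi=np.tanh):
    # v_k = sum_j w_kj x_j + b_k,  y_k = phi(v_k)
    v = np.dot(w, x) + b
    return phi(v)

x = np.array([0.5, -1.0, 2.0])   # inputs x_j
w = np.array([0.2, 0.4, 0.1])    # synaptic weights w_kj
b = 0.1                          # bias b_k
print(neuron(x, w, b))
```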
Activation Function
- Deterministic neuron (hard threshold): $y_k = 1$ if $v_k \ge 0$, $y_k = 0$ if $v_k < 0$
- Saturating non-linearity (sigmoid): $\varphi(v_k) = \dfrac{1}{1 + \exp(-a v_k)}$
- Stochastic neuron: $P(v) = \dfrac{1}{1 + \exp(-v/T)}$

Network Architecture
- Fully-connected
- Feed-forward
- Layered Feed-forward
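The deterministic and stochastic neurons differ only in whether the sigmoid output is used as a value or as a firing probability; a minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

def deterministic(v):
    # Hard threshold: output fixed by the sign of v
    return 1 if v >= 0 else 0

def stochastic(v, T=1.0):
    # Fire with probability P(v) = 1 / (1 + exp(-v/T))
    p = 1.0 / (1.0 + np.exp(-v / T))
    return 1 if rng.random() < p else 0

v = 0.8
print(deterministic(v))                                 # always 1
print(np.mean([stochastic(v) for _ in range(10000)]))   # near sigmoid(0.8)
```

As the "temperature" T goes to zero, the stochastic neuron's probability curve sharpens into the deterministic threshold.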
Learning Algorithms
- Unsupervised Learning
  - Learns from the environment without a teacher
  - Active (self-searching for what needs to be learned)
  - Needs objective(s)
- Reinforcement Learning
  - Learns from a critic (evaluative reinforcement signal)

Summary
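As a concrete instance of unsupervised learning with a built-in objective, Oja's rule (a stabilized Hebbian update, named here as one example rather than taken from the slide) extracts the first principal component of its input without any teacher:

```python
import numpy as np

# Oja's rule: w <- w + eta * y * (x - y * w), with y = w . x.
# The input covariance below is an assumed toy example.
rng = np.random.default_rng(1)
C = np.array([[3.0, 1.0],
              [1.0, 1.0]])          # input covariance
L = np.linalg.cholesky(C)           # to draw correlated samples

w = rng.normal(size=2)              # random initial weights
eta = 0.01
for _ in range(5000):
    x = L @ rng.normal(size=2)      # zero-mean input with covariance C
    y = w @ x                       # neuron output (linear)
    w += eta * y * (x - y * w)      # Hebbian term minus decay

print(np.linalg.norm(w))   # converges toward unit length
```

The decay term $-\eta y^2 w$ keeps the weights bounded, and the fixed point is the unit-norm leading eigenvector of the input covariance: unsupervised feature extraction in two lines of update rule.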