EE538 Neural Networks
Lecture 1: Introduction / Biological Neuron Models

Soo-Young Lee
Department of Electrical Engineering / Brain Science Research Center
KAIST

Fall 2015

Course Objectives

Understanding engineering models (artificial neural networks / machine learning) of cognitive functions for smart machines
  Feature extraction
  Clustering
  Separation
  Classification/Recognition
  Prediction
  Motor control
  and more

Related Topics

[Diagram: Neural Networks (Connectionist Model) within Learning-based Systems, overlapping Probabilistic Learning, Machine Learning, Big Data / Data Mining, and rule-based Artificial Intelligence, with application areas Pattern/Image Recognition, Speech Recognition, Language Understanding, and Knowledge Development]

Course Emphasis

Connections between biological and artificial neural networks
  Related to computational models of brain functions (Computational Neuroscience, not to heuristic rules)
  Learn from hints
Utilize neural network models for benchmark/competition problems
  Term Project

Contents

Background
Course Overview
Biological Neural Networks
Artificial Neural Networks

Approaches

Lecture
Homework (20%) and Quiz (10%)
Mid-Term Exam (20%)
Course Participation (10%)
Term Project (Proposal 15%, Report 35%)
  Repeat what others have done
  Add your own ideas

International CES

3D
Bigger
More pixels
Faster
Smaller
Distinct color

SMART, SMART, and SMART!

Historical Sketch

Dankoon and Pygmalion
Pre-1940: von Helmholtz, Mach, Pavlov, etc.
  General theories of learning, vision, conditioning
  No specific mathematical models of neuron operation
1940s: Hebb, McCulloch and Pitts
  Mechanism for learning in biological neurons
  Neural-like networks can compute any arithmetic function
1950s: Rosenblatt, Widrow and Hoff
  First practical networks and learning rules

Historical Sketch

1960s-1970s: Dark age
  Minsky and Papert demonstrated limitations of artificial neural networks
  Progress continued, although at a slower pace
Mid 1980s: First Renaissance
  Important new developments caused a resurgence
Late 1980s - Mid 2010s: Continuing Developments
  New algorithms and applications
Mid 2010s - Today: Second Renaissance
  Smart, Smart, and Smart applications!
  Big data (in Internet and mobile networks) and cheap multicore processors (GPUs)

Korean BrainTech21 Program
(Brain Engineering / Neuroinformatics: 1998-2008)

[Diagram: sensor-to-inference pipelines for Vision (Sensor, Feature Extraction, Selective Attention, Recognition/Tracking) and Speech Recognition (Sensor, Feature Extraction, Pattern Recognition), feeding Inference]

Artificial Cognitive Systems (2009-2014)

Active Cognitive Development and Situation Awareness based on Cognitive Science and Information Technology

[Diagram: brain-like system with Auditory, Smell/Olfactory, and Touch inputs, Fusion, Selective Attention, Recognition/Understanding, Perception/Planning, Learning/Memory/Language, Cognition & Inference, and Individual/Group Behavior with Body Control]

Example: Auditory Processing

[Diagram: auditory processing pipeline - Feature Extraction, Sound Localization & Speech Enhancement (Separation), Classification/Recognition, and Prediction, with Audio/Video Top-Down Attention feedback]

Speech Features at Cochlea (J.H. Lee, et al., 2000)

ICA on speech signals (x = c1 f1 + c2 f2 + ... + cM fM)
Gabor-like features in both the time and frequency domains
[Figure: learned basis functions in the time domain (per frame) and in the frequency domain]

ICA Features from Natural Images

ICA-based Complex Features (T. Kim, et al., 2005)

Time onset/offset
Multifrequency tone (timbre)
Frequency modulation
[Figure: each training sample shown as a mel-frequency vs. time-frame patch]

Signal Separation (T. Kim, et al.)

Recording environment
  Recorded in a reverberant real conference room
  Sampling rate: 8 kHz (downsampled from 44.1 kHz)
  3 human speakers are speaking and 1 loudspeaker is playing hip-hop music simultaneously
Recorded microphone signals (.wav)
Separated output signals (.wav)
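The mixture model x = c1 f1 + ... + cM fM used above can be inverted blind, from the mixtures alone. Below is a minimal numpy sketch of FastICA with symmetric decorrelation; the two toy sources, the mixing matrix, and the iteration count are illustrative assumptions, not the conference-room recordings from the slide.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two toy sources standing in for the speakers: a sine and a square wave
t = np.linspace(0.0, 1.0, 2000)
S = np.vstack([np.sin(2 * np.pi * 5 * t),
               np.sign(np.sin(2 * np.pi * 3 * t))])
A = np.array([[1.0, 0.6],
              [0.5, 1.0]])          # assumed mixing matrix (unknown to the algorithm)
X = A @ S                           # observed "microphone" mixtures

# Whiten the mixtures: zero mean, identity covariance
Xc = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(Xc @ Xc.T / Xc.shape[1])
Xw = E @ np.diag(d ** -0.5) @ E.T @ Xc

# FastICA fixed-point iteration with a tanh non-linearity
W = rng.standard_normal((2, 2))
for _ in range(200):
    G = np.tanh(W @ Xw)
    W_new = (G @ Xw.T) / Xw.shape[1] - np.diag((1 - G**2).mean(axis=1)) @ W
    U, _, Vt = np.linalg.svd(W_new)
    W = U @ Vt                      # symmetric decorrelation: (W W^T)^(-1/2) W
Y = W @ Xw                          # recovered sources, up to permutation/sign/scale

C = np.abs(np.corrcoef(np.vstack([Y, S]))[:2, 2:])
print("recovery correlations:", C.max(axis=1))
```

Each recovered component should correlate strongly with exactly one of the original sources, reproducing the blind-separation behavior the slide demonstrates with real recordings.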

Top-Down Attention

[Figure: training patterns (32x32 pixels) and deformed test patterns]

Classification/Recognition

Speech Recognition
  [Figure: each training sample as a mel-frequency vs. time-frame patch]
Image Recognition

Prediction

Stock price prediction

Motor Control

Contents

Background
Course Overview
Biological Neural Networks
Artificial Neural Networks

Neural Network Models

We need to introduce
  Neuron models
  Network architectures
  Learning algorithms
  Mathematical models
of artificial neural networks

Applications to engineering problems
  Signal/image processing and recognition
  Language understanding and knowledge development

Top of the World

[Diagram: interdisciplinary scope joining IT (EECS), Cognitive Science / Neuroscience, and Bionic Life through Neural Networks, Biologically-Oriented/Inspired Electronic Systems, Neuroinformatic DB, BioMedical Imaging, and the Digital Brain (Neural Networks); related course BiS554]

Research Modality / Research Scope

[Diagram: levels of organization from Molecules, Gene, Protein, Membrane, Neuron, Circuits, and Systems up to Behavior, crossed with research modalities - Data Measurement (Neuroscience Data, Measurement Technology), Mathematical Modeling (Mathematical Model, Analysis Software, Neural Networks), and Applications/Implementation (Brain-like Functional Systems, Neuromorphic Chips)]

Contents

Background
Course Overview
Biological Neural Networks
Artificial Neural Networks

Research Tools

Mathematics
  Optimization
  Linear algebra
  Differential equations
  Statistics / Information theory
Computer Programming
Hardware Devices
  Silicon-based VLSI
  MEMS

Brain vs. Computer

Human Brain
  about 10^10 to 10^11 neurons
  about 10^3 to 10^4 synapses for each neuron
  about 10^14 synapses in total
  about 1.5 kg (2% of body weight)
  about 1-10 msec time scale
While computers just execute programmed instructions, the human brain interacts with and learns from its environment.

Cortical Parameters

Neuron and Synapses

Different Types of Ion Channels

Ion Channels and Concentrations

Action Potential

Contents

Background
Course Overview
Biological Neural Networks
Artificial Neural Networks

Neural Network Models

Neuron model
  Integrate-and-fire
  Saturating non-linearity
  Rectified Linear Unit
Network architecture
  Feedforward: Layered, Convolutive, Max-Pooling
  Recurrent
Learning algorithm
  Find parameters
  Unsupervised learning
  Supervised learning
  Reinforcement learning

How to Conduct Research?

Set a hypothesis, make a model, and evaluate the model.
Hypothesis: best educated guess
  Knowledge of neuroscience
Model: network architecture, mathematical equations
  Knowledge of mathematics (linear algebra, statistics, differential equations, etc.)
Evaluation
  Experiment and/or simulation
  Performance measure and decision criteria

Action Potential

Example: Neuron Model

Hypothesis:
  Interaction between membrane potential and ion concentration with voltage-dependent ion channels
Model: Hodgkin-Huxley model
Evaluation:
  Voltage clamp experiment
  Find parameters
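The hypothesis-model-evaluation loop above can be exercised in simulation. The following is a minimal forward-Euler sketch of the Hodgkin-Huxley model with the standard squid-axon parameters from the literature; the injected current, step size, and spike-counting rule are illustrative assumptions.

```python
import numpy as np

# Standard squid-axon Hodgkin-Huxley parameters (mV, ms, uA/cm^2, mS/cm^2)
C_m, g_Na, g_K, g_L = 1.0, 120.0, 36.0, 0.3
E_Na, E_K, E_L = 50.0, -77.0, -54.4

def rates(V):
    """Voltage-dependent channel opening/closing rates (1/ms)."""
    a_m = 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
    b_m = 4.0 * np.exp(-(V + 65.0) / 18.0)
    a_h = 0.07 * np.exp(-(V + 65.0) / 20.0)
    b_h = 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
    a_n = 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
    b_n = 0.125 * np.exp(-(V + 65.0) / 80.0)
    return a_m, b_m, a_h, b_h, a_n, b_n

dt, T_end, I_inj = 0.01, 50.0, 10.0      # assumed step, duration, injected current
V, m, h, n = -65.0, 0.05, 0.6, 0.32      # resting initial conditions
spikes, above = 0, False
for _ in range(int(T_end / dt)):
    a_m, b_m, a_h, b_h, a_n, b_n = rates(V)
    m += dt * (a_m * (1.0 - m) - b_m * m)
    h += dt * (a_h * (1.0 - h) - b_h * h)
    n += dt * (a_n * (1.0 - n) - b_n * n)
    I_ion = (g_Na * m**3 * h * (V - E_Na)
             + g_K * n**4 * (V - E_K)
             + g_L * (V - E_L))
    V += dt * (I_inj - I_ion) / C_m      # C dV/dt = I - I_ion
    if V > 0.0 and not above:
        spikes += 1                      # upward crossing of 0 mV = one spike
    above = V > 0.0
print(f"{spikes} action potentials in {T_end:.0f} ms")
```

With a sustained supra-threshold current the model fires repetitively, which is the behavior Hodgkin and Huxley fitted against voltage-clamp data.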

Basic BioPhysics

Fick's law (diffusion):
  J_diff = -D dn/dx
Ohm's law (drift):
  J_drift = -mu Z n dV/dx
Einstein relationship:
  D = (kT/q) mu
Equilibrium:
  J = J_diff + J_drift = -mu (kT/q) dn/dx - mu Z n dV/dx = 0

Hodgkin-Huxley Model

Nernst Potential

At equilibrium, J = J_diff + J_drift = 0 gives dV = -(kT/qZ) dn/n, so

  E = V_i - V_o = -(kT/qZ) Int_{n_o}^{n_i} dn/n = (kT/qZ) ln(n_o / n_i)

Ion Parameters
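The Nernst formula above is easy to evaluate numerically. A small sketch for a potassium-like ion; the concentrations and temperature are assumed textbook values, not taken from the slides.

```python
import math

def nernst(n_out, n_in, Z=1, T=310.0):
    """Nernst equilibrium potential E = (kT/qZ) ln(n_out / n_in), in volts."""
    k, q = 1.380649e-23, 1.602176634e-19   # Boltzmann constant, elementary charge
    return (k * T) / (q * Z) * math.log(n_out / n_in)

# Assumed mammalian K+ concentrations: 5 mM outside, 140 mM inside, body temperature
E_K = nernst(5.0, 140.0)
print(f"E_K = {E_K * 1e3:.1f} mV")
```

The strongly negative result (near -89 mV for these values) matches the usual potassium equilibrium potential.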

Resting Potential (1)

Goldman equation for multiple ion species. For one type of ion, assume a constant field across a membrane of thickness d, dV/dx = -V_m/d, and define the permeability P = D/d = (mu/d)(kT/q). The flux is

  J = J_diff + J_drift = -mu (kT/q) dn/dx - mu Z n dV/dx

Solving this first-order equation for n(x) with boundary concentrations n_i (inside) and n_o (outside) gives the Goldman flux

  J = (qZ V_m / kT) P (n_i exp(ZqV_m/kT) - n_o) / (exp(ZqV_m/kT) - 1)

For J = 0 this reduces to the Nernst potential V_m = (kT/qZ) ln(n_o / n_i).

Resting Potential (2)

For potassium (Z = +1) and chloride (Z = -1) ions:

  J_K  = (qV_m/kT) P_K ([K]_i exp(qV_m/kT) - [K]_o) / (exp(qV_m/kT) - 1)
  J_Cl = -(qV_m/kT) P_Cl ([Cl]_o exp(qV_m/kT) - [Cl]_i) / (exp(qV_m/kT) - 1)

Setting the net current to zero, J_K - J_Cl = 0, gives the Goldman equation for potassium and chloride:

  V_m = (kT/q) ln( (P_K [K]_o + P_Cl [Cl]_i) / (P_K [K]_i + P_Cl [Cl]_o) )

Goldman equation for potassium, sodium, and chloride ions?
Do we need any other ion transfer mechanism?
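The Goldman equation extends directly to sodium as a third ion, addressing the question above. A sketch with assumed squid-axon-like permeability ratios and concentrations (illustrative values only):

```python
import math

def goldman(P_K, K_o, K_i, P_Na, Na_o, Na_i, P_Cl, Cl_o, Cl_i, T=279.45):
    """GHK resting potential for K+, Na+, Cl- in volts.
    The chloride (anion) concentrations enter with inside/outside swapped."""
    k, q = 1.380649e-23, 1.602176634e-19   # Boltzmann constant, elementary charge
    num = P_K * K_o + P_Na * Na_o + P_Cl * Cl_i
    den = P_K * K_i + P_Na * Na_i + P_Cl * Cl_o
    return (k * T) / q * math.log(num / den)

# Assumed squid-axon-like values: relative permeabilities P_K : P_Na : P_Cl
# of 1 : 0.04 : 0.45, concentrations in mM, T about 6.3 C
Vm = goldman(1.0, 20.0, 400.0, 0.04, 440.0, 50.0, 0.45, 560.0, 52.0)
print(f"Vm = {Vm * 1e3:.1f} mV")
```

The result sits near the observed resting potential, but since passive fluxes alone would run the gradients down, an active transfer mechanism (the Na/K pump) is still needed, which is the point of the closing question.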

Hodgkin-Huxley Experiment

H-H Voltage Clamp Experiment

[Figure: voltage-clamp setup and measured membrane currents]

HH Model: Refractory Period (1)

HH Model: Refractory Period (2)

Integrate-and-Fire Neuron

Leaky integrator:
  tau_m du(t)/dt = -u(t) + R I(t)
Membrane potential with a constant current, before spikes:
  u(t) = RI [1 - (1 - u(0)/RI) exp(-t/tau_m)]
Synaptic input current:
  I(t) = sum_j w_j sum_i alpha(t - t_i^(j))
Firing threshold:
  u(t_i^(j)) = theta
Reset voltage:
  lim_{t -> t_i^(j)+} u(t) = u_rest
Refractory period

Response of IF Neurons
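The integrate-and-fire response above can be simulated in a few lines. A minimal sketch with assumed parameter values (time constant, resistance, threshold, and input current are all illustrative):

```python
# Assumed parameters: tau_m du/dt = -u + R I(t); spike at u >= theta, reset to u_rest
tau_m, R = 10e-3, 1e7          # membrane time constant (s), membrane resistance (ohm)
theta, u_rest = 15e-3, 0.0     # firing threshold and reset voltage (V)
dt, T = 1e-4, 0.2              # Euler step and simulated duration (s)
I = 2e-9                       # constant input current (A), so RI = 20 mV > theta

u, spike_times = u_rest, []
for step in range(int(T / dt)):
    u += dt / tau_m * (-u + R * I)   # Euler step of the leaky integrator
    if u >= theta:                   # threshold crossing: emit spike, reset
        spike_times.append(step * dt)
        u = u_rest
print(f"{len(spike_times)} spikes in {T * 1e3:.0f} ms")
```

Because RI exceeds the threshold, the neuron fires periodically; with these values the interspike interval is close to tau_m ln(RI / (RI - theta)), about 13.9 ms.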

I-O Transfer Function

Time for the membrane potential to reach the threshold theta from reset:

  t_i = tau_m ln( (RI - u_rest) / (RI - theta) )

With an absolute refractory period t_ref, successive spike times satisfy

  t_{i+1} = t_i + t_ref + tau_m ln( (RI - u_rest) / (RI - theta) )

so the steady firing rate is

  nu = 1 / ( t_ref + tau_m ln( (RI - u_rest) / (RI - theta) ) )

Noise Models of IF Neurons

Noise sources
  Stochastic threshold
  Random reset
  Noisy integration (noisy input)
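The closed-form firing rate above can be checked numerically. A small sketch; the parameter defaults are the same illustrative values as in a typical LIF example, not values from the slides.

```python
import math

def lif_rate(I, R=1e7, tau_m=10e-3, theta=15e-3, u_rest=0.0, t_ref=2e-3):
    """Steady LIF firing rate in Hz:
    nu = 1 / (t_ref + tau_m ln((RI - u_rest)/(RI - theta))).
    Returns 0 when RI never reaches the threshold."""
    if R * I <= theta:
        return 0.0
    return 1.0 / (t_ref + tau_m * math.log((R * I - u_rest) / (R * I - theta)))

for I in (1e-9, 2e-9, 4e-9):
    print(f"I = {I * 1e9:.0f} nA -> {lif_rate(I):.1f} Hz")
```

The rate is zero below threshold and grows sublinearly with current, which is the characteristic I-O transfer curve of the IF neuron.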

Neural Coding

Population Neuron Model

  v_k = sum_j w_kj x_j + b_k
  y_k = phi(v_k)

Activation Function

Deterministic neuron (threshold):
  y_k = 1 if v_k >= 0, 0 if v_k < 0
Saturating non-linearity (sigmoid):
  phi(v_k) = 1 / (1 + exp(-a v_k))
Stochastic neuron:
  x = +1 with probability P(v), -1 (or 0) with probability 1 - P(v)
  P(v) = 1 / (1 + exp(-v/T))

Network Architecture

Fully-connected
Feed-forward
Layered Feed-forward
Convolutive
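The population model and activation functions above fit in a few lines of numpy. In this sketch the weights, biases, inputs, and temperature are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(v, a=1.0):
    """Saturating non-linearity phi(v) = 1 / (1 + exp(-a v))."""
    return 1.0 / (1.0 + np.exp(-a * v))

def threshold(v):
    """Deterministic neuron: y = 1 if v >= 0, else 0."""
    return (v >= 0).astype(int)

def stochastic(v, T=1.0):
    """Stochastic neuron: fires 1 with probability P(v) = 1/(1 + exp(-v/T))."""
    return (rng.random(np.shape(v)) < sigmoid(v / T)).astype(int)

def neuron_layer(x, W, b, phi=sigmoid):
    """Population model: v_k = sum_j w_kj x_j + b_k, y_k = phi(v_k)."""
    return phi(W @ x + b)

x = np.array([0.5, -1.0, 2.0])       # 3 inputs
W = rng.standard_normal((4, 3))      # 4 neurons, assumed random weights
b = np.zeros(4)
y = neuron_layer(x, W, b)
print("sigmoid outputs:  ", y)       # four values in (0, 1)
print("threshold outputs:", neuron_layer(x, W, b, threshold))
```

Swapping `phi` switches the same layer between the deterministic, saturating, and stochastic neuron models from the slide.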

Learning Algorithms

While computers just execute programmed instructions, the human brain learns from its environment.
Supervised Learning
  Learn from teachers
Unsupervised Learning
  Learn from the environment without a teacher
  Active (self-searching for what needs to be learned)
  Needs objective(s)
Reinforcement Learning
  Learn from a critique

Summary

The genetic code was cracked in the mid-20th century. The neural code will be cracked in the mid-21st century.
  Intelligence to Machine! Freedom to Mankind!
The neural code may be learned from data
  Unsupervised learning
  Supervised learning
  Reinforcement learning
Need to define
  Neuron model
  Network architecture
  Learning algorithm
