
An Introduction to Neural Networks

Purpose and Concept of a Neural Network
Originally an attempt to exactly model the brain on a computer.
Has since been enhanced with mathematics.
The next step in artificial intelligence.
Will enable computers to learn like humans and adapt to novel situations.
Connection between Brains and Neural Networks
The brain has many interconnected neurons.
A neural network has many nodes, each with a set bias, that are connected to other nodes by weights.
As you learn, the brain makes new connections between neurons.
As a neural network learns, it alters the weighted connections between its nodes.
Future Implications and Development

Artificial intelligence
Smart robots that can thrive in novel situations
Very smart phones
Economic models using past data
Disease models
Training and Testing Data

Training data is data for which the correct answer is known.


Neural Networks are fed data, produce a result, and then correct their results
based on the known correct answer.
Need lots of training data to create an accurate Neural Network.
After training, you test the usefulness of the Network by feeding in testing
data, but not giving the computer the answer, and then measure accuracy.
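A minimal sketch of that train/test procedure, using a made-up toy dataset and a trivial stand-in for a trained network:

```python
import random

# Toy dataset: (feature, label) pairs where the label is 1 if feature > 0.5.
data = [(x / 100, 1 if x > 50 else 0) for x in range(100)]

random.seed(0)
random.shuffle(data)

# Hold out 20% as testing data; the model never sees these answers.
split = int(0.8 * len(data))
training_data, testing_data = data[:split], data[split:]

# A trivial "trained model" standing in for a network's prediction.
def predict(x):
    return 1 if x > 0.5 else 0

# Accuracy is measured only on the held-out testing data.
correct = sum(predict(x) == y for x, y in testing_data)
accuracy = correct / len(testing_data)
print(accuracy)
```

The key point is that the testing data plays no part in training, so the measured accuracy reflects how the model handles data it has never seen.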
Mapping Classes into Feature Space

Each image has many dimensions, in my research each picture has 4096
dimensions.
A number in each dimension gives the image a coordinate in the feature space.
Since all objects of a certain class are similar but different in small ways, one
class takes up a specific part of feature space.
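The 4096 dimensions mentioned above correspond to a 64 x 64 grayscale image (64 * 64 = 4096); flattening its pixels gives one coordinate per dimension. The pixel values below are made up for illustration:

```python
# A 64x64 grayscale image represented as nested lists of pixel values.
image = [[(row * 64 + col) % 256 for col in range(64)] for row in range(64)]

# Flatten: each pixel value becomes one coordinate in feature space,
# so the whole image is a single point in a 4096-dimensional space.
feature_vector = [pixel for row in image for pixel in row]
print(len(feature_vector))
```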
How A Network Learns

Linear regression and line of best fit


All images are turned into numbers associated with the pixels.
These numbers form a coordinate that represents the image.
Create a function that fits all of the points of a certain class in feature space.
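The line-of-best-fit idea can be sketched with a closed-form least-squares fit of y = w*x + b (the data points here are made up):

```python
# Least-squares line of best fit, y = w*x + b.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]   # exactly y = 2x + 1

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# w = covariance(x, y) / variance(x); b shifts the line through the means.
w = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
    sum((x - mean_x) ** 2 for x in xs)
b = mean_y - w * mean_x
print(w, b)
```

A neural network generalizes this idea: instead of one weight and one bias for a line, it fits many weights and biases to a much higher-dimensional function.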
Convolutional Neural Networks

Used for the classification of images and other picture-related tasks.


Breaks images into pixel motifs, which make up the image.
The learned motifs are then used to classify new input images.
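The motif idea above can be sketched as a single convolution: a small filter slides across the image, and a high response means the motif was found at that location. The image and motif values here are illustrative:

```python
# A 3x3 "motif" (filter) slid across a 5x5 binary image.
image = [
    [0, 0, 0, 0, 0],
    [0, 1, 0, 0, 0],
    [1, 1, 1, 0, 0],
    [0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0],
]
motif = [
    [0, 1, 0],
    [1, 1, 1],
    [0, 0, 0],
]

def convolve(image, motif):
    # Slide the motif over every valid position and record the overlap score.
    size = len(image) - len(motif) + 1
    out = [[0] * size for _ in range(size)]
    for i in range(size):
        for j in range(size):
            out[i][j] = sum(
                image[i + di][j + dj] * motif[di][dj]
                for di in range(3) for dj in range(3)
            )
    return out

response = convolve(image, motif)
print(response)
```

The response peaks where the image patch matches the motif exactly; a real convolutional network learns many such motifs from data rather than having them hand-written.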
Convolutional Neural Network Example
Recurrent Neural Networks

Used in real time to predict future outcomes.


Factors time into the prediction of a future outcome.
Could be used for autocorrect or for speech prediction applications on
phones.
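A minimal sketch of how a recurrent cell factors time into its prediction: the hidden state mixes the current input with the previous hidden state, so earlier time steps influence later outputs. The weights here are fixed, illustrative values rather than learned ones:

```python
import math

# One step of a minimal recurrent cell.
w_input, w_hidden, bias = 0.5, 0.8, 0.0

def rnn_step(x, h_prev):
    # The new hidden state depends on both the input and the previous state.
    return math.tanh(w_input * x + w_hidden * h_prev + bias)

# Feed a short sequence; the final hidden state reflects the whole history.
h = 0.0
for x in [1.0, 0.0, 0.0]:
    h = rnn_step(x, h)
print(h)
```

Even though the last two inputs are zero, the final state is nonzero: the early input persists through the recurrent connection, which is what lets such networks model sequences like text for autocorrect.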
Shortcomings of Neural Networks

Necessity of training data


Overfitting
Inconsistent learning times
Little hardware has been created to support Neural Networks
Overfitting

Neural networks are created to learn and classify trends and objects.
With too much training, Neural Networks begin to learn to
classify only the training set, and not the general trend in the data.
Too much training therefore leads to decreased accuracy on new data.
Can be difficult to increase accuracy after a certain point.
Largest limitation of Neural Networks
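A toy sketch of the overfitting idea: a model that memorizes every training point scores perfectly on the training set but fails on new data, while a simple rule that captures the underlying trend generalizes. All of the data here is synthetic:

```python
# Labels follow a simple trend: the parity of x.
training = [(x, x % 2) for x in range(10)]
testing  = [(x, x % 2) for x in range(10, 20)]

# "Overfit" model: a lookup table of the training set.
memorized = dict(training)

def overfit_predict(x):
    return memorized.get(x, 0)   # falls back to 0 on unseen inputs

def trend_predict(x):
    return x % 2                 # the actual underlying trend

def accuracy(predict, data):
    return sum(predict(x) == y for x, y in data) / len(data)

print(accuracy(overfit_predict, training), accuracy(overfit_predict, testing))
print(accuracy(trend_predict, testing))
```

The memorizer is perfect on its training data and no better than chance on the testing data; that gap between training and testing accuracy is the signature of overfitting.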
Necessity of Training Data

Neural Networks need a fairly large training data set, which often is not
practical.
Danger of too little training data and too much training data.
We can't train a computer for everything that it is going to encounter in the
real world, and right now, learning is too slow to allow machines to learn in
real time.
In Conclusion

Neural Networks are a fairly new technology, and a large step in Artificial
Intelligence.
Neural Networks work somewhat like mathematically enhanced brains.
Still somewhat inconsistent in learning time and accuracy.
Hard to tell when training data is too little or too much.
Networks are being enhanced to introduce computers to an unpredictable and
changing reality.
The Structure of a Neural Network
Weights and Biases

In a linear function, use the example y = wx + b, where w = weight and b =
bias.
Weights make small changes to the shape of the function.
Biases shift the whole function.
If there is a consistent factor that is changing the weights, it is better to shift
the whole function by changing the bias than to keep having to change the
weights.
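The difference between the two can be shown directly with y = w*x + b: changing the weight reshapes the function (the change grows with x), while changing the bias shifts every point by the same amount:

```python
# y = w*x + b: the weight scales the function, the bias shifts it.
def f(x, w, b):
    return w * x + b

xs = [0, 1, 2]

base    = [f(x, 1.0, 0.0) for x in xs]   # y = x
reweigh = [f(x, 2.0, 0.0) for x in xs]   # steeper slope: change grows with x
shifted = [f(x, 1.0, 3.0) for x in xs]   # same slope, everything moved up by 3

print(base, reweigh, shifted)
```

This is why a consistent offset in the data is best absorbed by the bias: one bias change moves the whole function at once, instead of repeated weight adjustments.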
Weights and Biases Review

Weights and biases are the way that Neural Networks learn.
By comparing output results to the correct answers,
weights are changed to produce more accurate results.
Shifting biases accounts for consistent noise in the data
that is affecting results.
Weights

Connections between neurons, or in this case, nodes.


Like the tuners of a violin, they make small changes to improve the accuracy of
classification.
Calculating Error Margins

In order for the neural network to learn, it must be able to calculate
the error between the predicted result and what the real result
should have been.
After the error is calculated, this number is used to change the
weights of the neural network to create a more accurate result.
Cost functions are used to measure error, and then optimizers are used
to adjust the weights.
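A minimal sketch of that loop, using mean squared error as the cost function and plain gradient descent as the optimizer (the data and learning rate are illustrative):

```python
# Fit y = w*x to toy data by repeatedly descending the cost gradient.
xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]     # true relationship: y = 2x

def cost(w):
    # Mean squared error between predictions and known answers.
    return sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def gradient(w):
    # Derivative of the mean squared error with respect to w.
    return sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)

w = 0.0
learning_rate = 0.05
for _ in range(100):
    w -= learning_rate * gradient(w)   # the optimizer's weight update
print(w, cost(w))
```

Each update nudges the weight in the direction that reduces the cost, which is exactly the "calculate error, then fix the weights" cycle described above.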
Backpropagation
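Backpropagation applies the chain rule to pass the error gradient backward through the layers, so each weight learns how it contributed to the error. A minimal sketch for a two-weight chain (the input, target, and weights are made-up values):

```python
import math

# Tiny two-layer chain: h = tanh(w1 * x), out = w2 * h.
x, target = 1.0, 0.5
w1, w2 = 0.3, 0.7

# Forward pass
h = math.tanh(w1 * x)
out = w2 * h
error = (out - target) ** 2

# Backward pass (chain rule, layer by layer)
d_out = 2 * (out - target)           # d(error)/d(out)
grad_w2 = d_out * h                  # d(error)/d(w2)
d_h = d_out * w2                     # gradient flowing into the hidden layer
grad_w1 = d_h * (1 - h ** 2) * x     # tanh'(z) = 1 - tanh(z)**2

print(grad_w1, grad_w2)
```

An optimizer would then subtract a small multiple of each gradient from its weight, exactly as in the cost-function loop above.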
Thank You!
