Cost Function
We can measure the accuracy of our hypothesis function by using a cost function. This takes an average difference (actually a fancier version of an average) of all the results of the hypothesis with inputs from x's and the actual output y's.

$$J(\theta_0, \theta_1) = \frac{1}{2m}\sum_{i=1}^{m}\left(\hat{y}_i - y_i\right)^2 = \frac{1}{2m}\sum_{i=1}^{m}\left(h_\theta(x_i) - y_i\right)^2$$

To break it apart, it is $\frac{1}{2}\bar{x}$ where $\bar{x}$ is the mean of the squares of $h_\theta(x_i) - y_i$, or the difference between the predicted value and the actual value.
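As a concrete illustration (not part of the original notes), here is a minimal Python/NumPy sketch of this cost function; the function name `cost` and the example arrays are assumptions chosen for the example.

```python
import numpy as np

def cost(theta0, theta1, x, y):
    """Squared error cost J(theta0, theta1) for univariate linear regression.

    x and y are 1-D NumPy arrays of the same length m.
    """
    m = len(y)
    predictions = theta0 + theta1 * x        # h_theta(x_i) for every training example
    squared_errors = (predictions - y) ** 2  # (h_theta(x_i) - y_i)^2
    return squared_errors.sum() / (2 * m)    # mean of the squares, halved

# Example (hypothetical data): a perfect fit y = 2x gives zero cost
x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 4.0, 6.0])
print(cost(0.0, 2.0, x, y))  # 0.0
print(cost(0.0, 1.5, x, y))  # > 0, since this hypothesis misses the data
```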

This function is otherwise called the "Squared error function", or "Mean squared error". The mean is halved ($\frac{1}{2}$) as a convenience for the computation of gradient descent, as the derivative term of the square function will cancel out the $\frac{1}{2}$ term.
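To make that cancellation explicit, here is a worked step (not in the original notes): differentiating a single halved squared-error term with respect to a parameter $\theta_j$ brings down a factor of 2 that cancels the $\frac{1}{2}$.

$$\frac{\partial}{\partial \theta_j}\,\frac{1}{2}\bigl(h_\theta(x_i) - y_i\bigr)^2 = \frac{1}{2}\cdot 2\,\bigl(h_\theta(x_i) - y_i\bigr)\,\frac{\partial h_\theta(x_i)}{\partial \theta_j} = \bigl(h_\theta(x_i) - y_i\bigr)\,\frac{\partial h_\theta(x_i)}{\partial \theta_j}$$

The resulting gradient has no stray constant factor, which is the convenience the notes refer to.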

The following image summarizes what the cost function does:

[Figure: summary of what the cost function does.]
