Cost Function

We can measure the accuracy of our hypothesis function by using a cost function. This takes an average (actually a slightly modified average) of the squared differences between the hypothesis's predictions on the inputs x and the actual outputs y.

If m is the number of training examples, the cost function for Linear Regression in One Variable is given by:

$$J(\theta_0, \theta_1) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2$$

A lower value of J indicates a more accurate hypothesis.

This function is otherwise called the "squared error function", or mean squared error. The mean is halved (the 1/2 factor) as a convenience for the computation of gradient descent, since the derivative of the square term cancels the 1/2.
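The cost function above can be sketched directly in code. This is a minimal example, assuming a one-variable hypothesis h_θ(x) = θ₀ + θ₁x and a small toy dataset (the function name `compute_cost` and the data are illustrative, not from the course):

```python
def compute_cost(theta0, theta1, xs, ys):
    """Squared error cost J(theta0, theta1) for h(x) = theta0 + theta1 * x."""
    m = len(xs)
    squared_errors = sum((theta0 + theta1 * x - y) ** 2 for x, y in zip(xs, ys))
    return squared_errors / (2 * m)

# Toy data lying exactly on the line y = 2x, so the cost at (0, 2) is zero.
xs = [1, 2, 3]
ys = [2, 4, 6]
print(compute_cost(0, 2, xs, ys))  # → 0.0
print(compute_cost(0, 0, xs, ys))  # nonzero: the flat line h(x) = 0 misses every point
```

Because the data fall exactly on y = 2x, the cost vanishes at (θ₀, θ₁) = (0, 2) and grows as the parameters move away from that point.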

We can plot J on a graph, taking $\theta_0$ and $\theta_1$ on the x and z axes respectively, and $J(\theta_0, \theta_1)$ on the y axis.