Gradient Descent

The gradient descent update rule is given by:

repeat {

θ_j := θ_j − α ∂/∂θ_j J(θ)

} (simultaneously update for every j = 0, 1, 2, …, n)
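The gradient used in the next step follows from the usual squared-error cost J(θ) = (1/2m) ∑_{i=1}^{m} (h_θ(x^{(i)}) − y^{(i)})² with a linear hypothesis h_θ(x) = θᵀx; neither definition is restated in this section, so treat them as assumptions that are consistent with the formula below. A sketch of the chain-rule computation:

```latex
% Assumed definitions (not restated in this section):
%   J(\theta) = \frac{1}{2m}\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)^2
%   h_\theta(x) = \theta^{T} x, with x_0^{(i)} = 1
\begin{aligned}
\frac{\partial}{\partial \theta_j} J(\theta)
  &= \frac{1}{2m}\sum_{i=1}^{m} 2\,\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)\,
     \frac{\partial}{\partial \theta_j} h_\theta(x^{(i)}) \\
  &= \frac{1}{m}\sum_{i=1}^{m} \bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)\, x_j^{(i)}
\end{aligned}
```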

Substituting this partial derivative back into the update rule, we get:

repeat {

θ_j := θ_j − α (1/m) ∑_{i=1}^{m} (h_θ(x^{(i)}) − y^{(i)}) x_j^{(i)}

} (simultaneously update every j = 0, 1, 2, …, n)
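As a minimal, illustrative sketch (not code from these notes), the update can be written as a vectorized batch step. The names gradient_descent, X, y, alpha, and num_iters are hypothetical, and X is assumed to already include a leading column of ones so that θ_0 is handled like every other parameter.

```python
import numpy as np

def gradient_descent(X, y, theta, alpha, num_iters):
    """Batch gradient descent for a linear hypothesis h_theta(x) = X @ theta.

    X         : (m, n+1) design matrix, first column assumed to be all ones
    y         : (m,) target vector
    theta     : (n+1,) initial parameters
    alpha     : learning rate
    num_iters : number of update steps
    """
    m = len(y)
    for _ in range(num_iters):
        errors = X @ theta - y             # h_theta(x^(i)) - y^(i) for all i
        gradient = (X.T @ errors) / m      # (1/m) * sum_i error_i * x_j^(i), for each j
        theta = theta - alpha * gradient   # simultaneous update of every theta_j
    return theta

# Illustrative usage on synthetic data (assumed, not from the notes)
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    m = 100
    x = rng.uniform(0, 10, size=m)
    y = 3.0 + 2.0 * x + rng.normal(scale=0.5, size=m)
    X = np.column_stack([np.ones(m), x])   # prepend x_0 = 1
    theta = gradient_descent(X, y, np.zeros(2), alpha=0.01, num_iters=5000)
    print(theta)                           # approximately [3.0, 2.0]
```

Computing X.T @ errors produces the whole gradient vector in one step, which is what makes the update of every θ_j simultaneous rather than one parameter at a time.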
