# gundamMC's Workshop

## "Isn't it pretty to think so?"

# Tag / calculus

# Regularization in Machine Learning

Continuing the Coursera course mentioned in Gradient Descent and Partial Derivatives, a new idea, regularization, was introduced. In short, regularization shrinks theta (or the weights of a neural network) to make the hypothesis “simpler” and smoother, so it is less likely to overfit the training data.
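
To make the shrinking concrete, here is a minimal sketch of one gradient descent step with an L2 (ridge) penalty, the form used in Ng’s course. This is my own illustration, not code from the lectures; the function name, the learning rate `alpha`, and the penalty strength `lam` are placeholders.

```python
import numpy as np

def ridge_gradient_step(theta, X, y, alpha=0.01, lam=1.0):
    """One gradient descent step for linear regression with an L2 penalty.
    The (lam / m) * theta term is what shrinks theta toward zero."""
    m = len(y)
    grad = (X.T @ (X @ theta - y)) / m   # gradient of the squared-error cost
    reg = (lam / m) * theta              # L2 shrinkage term
    reg[0] = 0.0                         # by convention, don't regularize the bias theta_0
    return theta - alpha * (grad + reg)
```

With `lam = 0` this reduces to the plain update; raising `lam` pulls every weight a little closer to zero on each step, which is exactly the “simpler” function the paragraph above describes.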

…

# Gradient Descent and Partial Derivatives

Over the past few days I’ve been taking the machine learning course on Coursera by Stanford University. The professor, Andrew Ng, explained gradient descent in a relatively easy-to-understand way:

(For some reason the course’s lectures are also on YouTube…)
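
As a concrete reference for the idea, here is a minimal sketch of plain gradient descent on a mean-squared-error cost. It is my own illustration, not the course’s code; `alpha` and `iters` are assumed hyperparameters.

```python
import numpy as np

def gradient_descent(X, y, alpha=0.01, iters=1000):
    """Repeatedly step theta downhill along the partial derivatives
    of the mean-squared-error cost."""
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(iters):
        grad = (X.T @ (X @ theta - y)) / m  # partial derivatives of the cost
        theta -= alpha * grad               # move against the gradient
    return theta
```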

…