Unit 05, Video 02 - Gradient Descent
This video covers the gradient descent material from chapter 4 of our course textbook, "Hands-On Machine Learning". In this video we look at how we can use the cost function to determine the parameters that give the best fit of a line to a set of data. In the previous video we looked briefly at the normal equation, an exact solution that determines the parameters minimizing the cost. Gradient descent is an iterative optimization algorithm: given a cost function, such as the RMSE we have introduced, and a way to calculate its gradients, we can repeatedly take small steps down the gradient to find parameters that give the minimum cost. Once we have described the gradient descent learning algorithm, we discuss various flavors of gradient descent, including full-batch, mini-batch, and stochastic gradient descent.
https://bitbucket.org/dharter/ml-python-class
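The loop described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the notebook from the repository: it fits a line to synthetic data (the true slope and intercept here are made-up values), and it uses the MSE rather than the RMSE for the gradient, since the MSE gradient is simpler to write and both costs share the same minimizing parameters. The learning rate and epoch count are likewise arbitrary choices for the example.

```python
import numpy as np

# Synthetic data roughly following y = 4 + 3x (illustrative values only)
rng = np.random.default_rng(42)
X = 2 * rng.random((100, 1))
y = 4 + 3 * X + rng.standard_normal((100, 1))

X_b = np.c_[np.ones((100, 1)), X]  # prepend a column of 1s for the intercept
m = len(X_b)

eta = 0.1          # learning rate (hypothetical choice)
n_epochs = 1000

theta = rng.standard_normal((2, 1))  # random initialization of [intercept, slope]

for _ in range(n_epochs):
    # Gradient of the MSE cost with respect to theta, over the full batch
    gradients = (2 / m) * X_b.T @ (X_b @ theta - y)
    # Take a small step downhill
    theta -= eta * gradients

print(theta.ravel())  # fitted [intercept, slope], near the true [4, 3]
```

Full-batch, mini-batch, and stochastic gradient descent differ only in how many rows of `X_b` feed each gradient computation: all of them, a small random subset, or a single random example per step.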