Optimization algorithms
Mini-batch gradient descent
Understanding mini-batch gradient descent
Exponentially weighted averages
Understanding exponentially weighted averages
Bias correction in exponentially weighted averages
Gradient descent with Momentum
RMSprop (root mean square prop)
Adam optimization algorithm
Adam stands for Adaptive Moment Estimation.
The Adam optimization algorithm essentially combines Momentum and RMSprop into a single update rule.
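As a rough sketch of how the two pieces fit together (the function name and the toy objective below are illustrative, not from the lesson): m is the Momentum term, an exponentially weighted average of the gradients; v is the RMSprop term, an exponentially weighted average of the squared gradients; and both are bias-corrected before the parameter update. The defaults beta1 = 0.9, beta2 = 0.999, epsilon = 1e-8 are the commonly recommended values.

```python
import numpy as np

def adam_update(w, dw, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step for a single parameter array.

    m: exponentially weighted average of gradients (the Momentum term)
    v: exponentially weighted average of squared gradients (the RMSprop term)
    t: 1-based iteration count, used for bias correction
    """
    m = beta1 * m + (1 - beta1) * dw          # Momentum: first moment
    v = beta2 * v + (1 - beta2) * dw ** 2     # RMSprop: second moment
    m_hat = m / (1 - beta1 ** t)              # bias correction, matters early on
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# Toy usage: minimize f(w) = ||w||^2, whose gradient is 2w.
w = np.array([1.0, -2.0])
m = np.zeros_like(w)
v = np.zeros_like(w)
for t in range(1, 501):
    dw = 2 * w
    w, m, v = adam_update(w, dw, m, v, t)
print(w)  # close to [0, 0]
```

Setting beta1 = 0 recovers plain RMSprop (with bias correction), while dropping the division by sqrt(v_hat) recovers gradient descent with Momentum, which is one way to see that Adam is the combination of the two.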