Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization by DeepLearning.AI
Learned best practices for setting up train/dev/test sets and analyzing bias/variance when building deep learning applications; used standard neural network techniques such as initialization, L2 and dropout regularization, hyperparameter tuning, batch normalization, and gradient checking; implemented and applied a variety of optimization algorithms, such as mini-batch gradient descent, Momentum, RMSprop, and Adam, and checked them for convergence; and implemented a neural network in TensorFlow.
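A minimal illustrative sketch (not course code) of how several of these techniques combine in TensorFlow: a small Keras network with He initialization, L2 and dropout regularization, trained on mini-batches with the Adam optimizer. Layer sizes, hyperparameters, and the synthetic data are assumptions for demonstration only.

```python
import numpy as np
import tensorflow as tf

# Small fully connected network with L2 and dropout regularization.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(
        64, activation="relu",
        kernel_initializer="he_normal",                       # initialization
        kernel_regularizer=tf.keras.regularizers.l2(1e-4)),   # L2 regularization
    tf.keras.layers.Dropout(0.2),                              # dropout regularization
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Adam optimizer; learning rate is a tunable hyperparameter.
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Synthetic placeholder data; in practice these come from the train split,
# with separate dev/test splits held out for bias/variance analysis.
x_train = np.random.rand(256, 20).astype("float32")
y_train = np.random.randint(0, 10, size=(256,))

# batch_size controls the mini-batch size for mini-batch gradient descent.
model.fit(x_train, y_train, batch_size=32, epochs=2)
```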