All Posts

Structuring Machine Learning Projects, Week 1

Taking the Coursera Deep Learning Specialization, Structuring Machine Learning Projects course. Will post condensed notes every week as part of the review process. All material originates from the free Coursera course, taught by Andrew Ng. See deeplearning.ai for more details. Table of Contents: ML Strategy; Introduction to ML Strategy; Why ML Strategy; Orthogonalization; Setting Up Your Goal; Single Number Evaluation Metric; Satisficing and Optimizing Metric; Train/Dev/Test Distributions; Size of the Dev and Test Sets; When to Change Dev/Test Sets and Metrics; Comparing to Human-Level Performance; Why Human-level Performance?

Improving Deep Neural Networks, Week 3

Taking the Coursera Deep Learning Specialization, Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization course. Will post condensed notes every week as part of the review process. All material originates from the free Coursera course, taught by Andrew Ng. See deeplearning.ai for more details. Assumes you have knowledge of Improving Deep Neural Networks, Week 2. Table of Contents: Hyperparameter Tuning, Batch Normalization, and Programming Frameworks; Hyperparameter Tuning; Tuning Process; Using an appropriate scale to pick hyperparameters; Hyperparameters tuning in practice: Pandas vs Caviar; Batch Normalization; Normalizing activations in a network; Fitting Batch Normalization into a neural network; Why does Batch Normalization Work?

Improving Deep Neural Networks, Week 2

Taking the Coursera Deep Learning Specialization, Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization course. Will post condensed notes every week as part of the review process. All material originates from the free Coursera course, taught by Andrew Ng. See deeplearning.ai for more details. Assumes you have knowledge of Improving Deep Neural Networks, Week 1. Table of Contents: Optimization Algorithms; Mini-Batch Gradient Descent; Understanding Mini-batch Gradient Descent; Exponentially Weighted Averages; Understanding Exponentially Weighted Averages; Bias Correction in Exponentially Weighted Averages; Gradient Descent with Momentum; RMSprop; Adam Optimization Algorithm; Learning Rate Decay; The Problem of Local Optima. Optimization Algorithms / Mini-Batch Gradient Descent: Rather than computing each gradient descent step over the entire training set, split your examples into smaller groups (mini-batches) and take one update step per group.
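The splitting step described above can be sketched in NumPy. This is a minimal illustration rather than the course's assignment code; the function name `make_minibatches` and the column-wise (n_features, m) data layout are assumptions following the notation used in the lectures.

```python
import numpy as np

def make_minibatches(X, Y, batch_size=64, seed=0):
    """Shuffle the training set, then split it into mini-batches.

    X: (n_features, m) examples stacked column-wise (assumed layout).
    Y: (1, m) labels.
    Yields (X_batch, Y_batch) pairs; the last batch may be smaller.
    """
    rng = np.random.default_rng(seed)
    m = X.shape[1]
    perm = rng.permutation(m)                 # shuffle examples each pass
    X_shuf, Y_shuf = X[:, perm], Y[:, perm]
    for start in range(0, m, batch_size):     # consecutive slices of the shuffle
        yield (X_shuf[:, start:start + batch_size],
               Y_shuf[:, start:start + batch_size])
```

A training loop would then take one gradient step per yielded batch instead of one step per full pass over the data.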

Improving Deep Neural Networks, Week 1

Taking the Coursera Deep Learning Specialization, Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization course. Will post condensed notes every week as part of the review process. All material originates from the free Coursera course, taught by Andrew Ng. See deeplearning.ai for more details. Assumes you have knowledge of Neural Networks and Deep Learning. Table of Contents: Practical Aspects of Deep Learning; Setting Up Your Machine Learning Application; Train/Dev/Test Sets; Bias/Variance; Basic Recipe for Machine Learning; Regularizing your Neural Network; Regularization; Why regularization reduces overfitting?

Neural Networks and Deep Learning, Week 4

Taking the Coursera Deep Learning Specialization, Neural Networks and Deep Learning course. Will post condensed notes every week as part of the review process. All material originates from the free Coursera course, taught by Andrew Ng. See deeplearning.ai for more details. Assumes you have knowledge of Week 3. Table of Contents: Deep Neural Networks; Deep Neural Network; Deep L-layer neural network; Forward Propagation in a Deep Network; Getting your matrix dimensions right; Why deep representations?

Neural Networks and Deep Learning, Week 3

Taking the Coursera Deep Learning Specialization, Neural Networks and Deep Learning course. Will post condensed notes every week as part of the review process. All material originates from the free Coursera course, taught by Andrew Ng. See deeplearning.ai for more details. Assumes you have knowledge of Week 2. Table of Contents: Shallow Neural Networks; Shallow Neural Network; Neural Networks Overview; Neural Network Representation; Computing a Neural Network’s Output; Vectorizing Across Multiple Examples; Explanation for Vectorized Implementation; Activation Functions; Why do you need non-linear activation functions?

Neural Networks and Deep Learning, Week 2

Taking the Coursera Deep Learning Specialization, Neural Networks and Deep Learning course. Will post condensed notes every week as part of the review process. All material originates from the free Coursera course, taught by Andrew Ng. See deeplearning.ai for more details. Assumes you have knowledge of Week 1. Table of Contents: Neural Networks Basics; Logistic Regression as a Neural Network; Binary Classification; Logistic Regression; Logistic Regression Cost Function; Gradient Descent; Derivatives; More Derivatives Examples; Computation Graph; Derivatives with a Computation Graph; Logistic Regression Gradient Descent; Gradient Descent on m Examples; Python and Vectorization; Vectorization; More Vectorization Examples; Vectorizing Logistic Regression; Vectorizing Logistic Regression’s Gradient Output; Broadcasting in Python; Note on Python/NumPy Vectors. Neural Networks Basics / Logistic Regression as a Neural Network / Binary Classification: Binary classification is basically answering a yes or no question.
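Since this excerpt's table of contents covers vectorizing logistic regression, the yes/no answer can be sketched as a single vectorized forward pass over all m examples. The function name `predict`, the column-wise (n_features, m) layout, and the 0.5 decision threshold are illustrative assumptions, not the course's assignment interface.

```python
import numpy as np

def sigmoid(z):
    """Squash any real number into (0, 1), read as a probability."""
    return 1.0 / (1.0 + np.exp(-z))

def predict(w, b, X):
    """Vectorized logistic regression forward pass.

    w: (n_features, 1) weights, b: scalar bias,
    X: (n_features, m) examples stacked column-wise (assumed layout).
    Returns a (1, m) array of 0/1 yes-or-no answers.
    """
    A = sigmoid(w.T @ X + b)       # (1, m) probabilities in one matrix product
    return (A > 0.5).astype(int)   # threshold each probability at 0.5
```

The matrix product replaces an explicit Python loop over the m examples, which is the point of the "Vectorizing Logistic Regression" lectures.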

Neural Networks and Deep Learning, Week 1

Taking the Coursera Deep Learning Specialization, Neural Networks and Deep Learning course. Will post condensed notes every week as part of the review process. All material originates from the free Coursera course, taught by Andrew Ng. See deeplearning.ai for more details. Table of Contents: Introduction to Deep Learning; What is a Neural Network; Supervised Learning with Neural Networks; Why is Deep Learning Taking Off?; About this Course; Optional: Heroes of Deep Learning (Geoffrey Hinton). Introduction to Deep Learning: There are five courses in the Coursera Deep Learning Specialization.

Machine Learning, Week 11

Taking the Coursera Machine Learning course. Will post condensed notes every week as part of the review process. All material originates from the free Coursera course, taught by Andrew Ng. Assumes you have knowledge of Week 10. Table of Contents: Application Example: Photo OCR; Photo OCR; Problem Description and Pipeline; Sliding Windows; Getting Lots of Data and Artificial Data; Ceiling Analysis: What Part of the Pipeline to Work on Next. Lecture notes: Lecture18. Application Example: Photo OCR / Photo OCR / Problem Description and Pipeline: Photo OCR (Optical Character Recognition) is the task of recognizing the characters (words and digits) that appear in an image.
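The sliding-windows step listed in the table of contents can be sketched as a simple window enumerator that scans the image with a fixed-size patch. The function name, parameters, and stride choice here are illustrative assumptions, and the text-vs-no-text classifier applied to each patch is left abstract.

```python
def sliding_windows(img_w, img_h, win_w, win_h, stride):
    """Enumerate top-left corners of fixed-size windows across an image.

    In a Photo OCR pipeline, each (x, y, w, h) patch would be passed to a
    classifier that answers "does this patch contain text?" (not shown here).
    """
    for y in range(0, img_h - win_h + 1, stride):       # scan rows
        for x in range(0, img_w - win_w + 1, stride):   # scan columns
            yield (x, y, win_w, win_h)
```

Running the same scan at several window sizes (or on rescaled copies of the image) handles text at different scales, which is how the lecture describes the detection stage.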

Machine Learning, Week 10

Taking the Coursera Machine Learning course. Will post condensed notes every week as part of the review process. All material originates from the free Coursera course, taught by Andrew Ng. Assumes you have knowledge of Week 9. Table of Contents: Large Scale Machine Learning; Gradient Descent with Large Datasets; Learning With Large Datasets; Stochastic Gradient Descent; Mini-Batch Gradient Descent; Stochastic Gradient Descent Convergence; Advanced Topics; Online Learning; Map Reduce and Data Parallelism. Lecture notes: Lecture17. Large Scale Machine Learning / Gradient Descent with Large Datasets / Learning With Large Datasets: One of the best ways to get a high-performance machine learning system is to feed a lot of data into a low-bias learning algorithm (one flexible enough that it could overfit).
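The stochastic gradient descent item from this table of contents can be sketched for linear regression: instead of summing gradients over all m examples before each step (batch gradient descent), update the parameters after every single example. The function name, learning rate, and epoch count below are illustrative assumptions, not values from the lectures.

```python
import numpy as np

def sgd_linear_regression(X, y, lr=0.01, epochs=5, seed=0):
    """Stochastic gradient descent for linear regression.

    X: (m, n) examples row-wise, y: (m,) targets; hypothesis h(x) = theta @ x.
    One parameter update per example, so each step is cheap even for huge m.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(epochs):
        for i in rng.permutation(m):    # shuffle, then visit examples one by one
            err = X[i] @ theta - y[i]   # prediction error on a single example
            theta -= lr * err * X[i]    # gradient of (1/2) * err**2 w.r.t. theta
    return theta
```

Because each update touches only one example, the algorithm can make progress long before it has seen the whole dataset, which is the appeal for large-scale learning.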