Alexander Wong
https://alexanderwong.com/
Recent content on Alexander Wong
Hugo (gohugo.io)
en-us
© 2017 Alexander Wong
Mon, 02 Oct 2017 00:00:00 +0000

Projects
https://alexanderwong.com/projects/
Mon, 02 Oct 2017 00:00:00 +0000
https://alexanderwong.com/projects/
Udia Tasks (2017): Task and goal tracking application. JavaScript (ReactJS, Redux, Redux-Saga), Python (Django, Django REST Framework, Django Allauth). Demo, GitHub (server), GitHub (client)
Atlas (2016): Hackathon project for AngularAttack 2016; visualize and interact with the World Data Bank API. TypeScript (AngularJS, NodeJS, C3.js). GitHub
Picknic (2016): Hackathon project for HackED 2016; a MEAN-stack application for finding good picnic spots. JavaScript (NodeJS, AngularJS, Express), MongoDB. GitHub
MapCore (2016): Hadoop application for storing and generating real-time traffic data. Java, Cloudera Hadoop. GitHub
SocialButterfly (2015): Real-time chat application with user authentication. JavaScript (Meteor.

About
https://alexanderwong.com/about/
Sat, 26 Aug 2017 00:00:00 +0000
https://alexanderwong.com/about/
My name is Alexander Wong. I occasionally write software. I am the CEO of Udia Software Incorporated.
When I’m not working on Udia, I spend my time running or hiking. In the winter, I ski. I enjoy playing my guitar and listening to folk music. Some of my long term goals include living out of a van to travel around the continent and building an earthship sustainable home.
I believe that the pursuit of meaning in life is noble, and that this pursuit is an automatable task.

Chatbot
https://alexanderwong.com/chatbot/
Fri, 06 Oct 2017 00:00:00 +0000
https://alexanderwong.com/chatbot/
You can ask me a few things through my personal chatbot! Some commands include:
Tell me about yourself. What are your hobbies? What are some of your strengths?

Structuring Machine Learning Projects, Week 1
https://alexanderwong.com/post/structuringmachinelearningprojectsweek1/
Mon, 01 Jan 2018 12:21:37 -0600
https://alexanderwong.com/post/structuringmachinelearningprojectsweek1/
Taking the Coursera Deep Learning Specialization, Structuring Machine Learning Projects course. Will post condensed notes every week as part of the review process. All material originates from the free Coursera course, taught by Andrew Ng. See deeplearning.ai for more details.
Table of Contents ML Strategy Introduction to ML Strategy Why ML Strategy Orthogonalization Setting Up Your Goal Single Number Evaluation Metric Satisficing and Optimizing Metric Train/Dev/Test Distributions Size of the Dev and Test Sets When to Change Dev/Test Sets and Metrics Comparing to Human-Level Performance Why Human-level Performance?

Improving Deep Neural Networks, Week 3
https://alexanderwong.com/post/improvingdeepneuralnetworksweek3/
Wed, 20 Dec 2017 10:21:37 -0600
https://alexanderwong.com/post/improvingdeepneuralnetworksweek3/
Taking the Coursera Deep Learning Specialization, Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization course. Will post condensed notes every week as part of the review process. All material originates from the free Coursera course, taught by Andrew Ng. See deeplearning.ai for more details.
Assumes you have knowledge of Improving Deep Neural Networks, Week 2.
Table of Contents Hyperparameter Tuning, Batch Normalization, and Programming Frameworks Hyperparameter Tuning Tuning Process Using an appropriate scale to pick hyperparameters Hyperparameters tuning in practice: Pandas vs Caviar Batch Normalization Normalizing activations in a network Fitting Batch Normalization into a neural network Why does Batch Normalization Work?

Improving Deep Neural Networks, Week 2
https://alexanderwong.com/post/improvingdeepneuralnetworksweek2/
Sun, 17 Dec 2017 15:21:37 -0600
https://alexanderwong.com/post/improvingdeepneuralnetworksweek2/
Taking the Coursera Deep Learning Specialization, Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization course. Will post condensed notes every week as part of the review process. All material originates from the free Coursera course, taught by Andrew Ng. See deeplearning.ai for more details.
Assumes you have knowledge of Improving Deep Neural Networks, Week 1.
Table of Contents Optimization Algorithms Mini-Batch Gradient Descent Understanding Mini-batch Gradient Descent Exponentially Weighted Averages Understanding Exponentially Weighted Averages Bias Correction in Exponentially Weighted Averages Gradient Descent with Momentum RMSprop Adam Optimization Algorithm Learning Rate Decay The Problem of Local Optima Optimization Algorithms Mini-Batch Gradient Descent Rather than training on your entire training set during each step of gradient descent, break your examples out into groups.
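The grouping step described in this excerpt can be sketched in a few lines of NumPy; the function name and parameters below are illustrative, not from the course assignments (which use Octave):

```python
import numpy as np

def minibatches(X, y, batch_size=64, seed=0):
    """Shuffle the training set, then yield it in fixed-size groups.

    X: (m, n) feature matrix, y: (m,) labels.
    """
    rng = np.random.default_rng(seed)
    m = X.shape[0]
    order = rng.permutation(m)       # shuffle so each epoch sees a new split
    X, y = X[order], y[order]
    for start in range(0, m, batch_size):
        yield X[start:start + batch_size], y[start:start + batch_size]

# Each gradient-descent step then uses one mini-batch instead of all m examples.
X = np.arange(20, dtype=float).reshape(10, 2)
y = np.arange(10)
batches = list(minibatches(X, y, batch_size=4))
# 10 examples with batch size 4 -> groups of 4, 4, and 2
```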

Improving Deep Neural Networks, Week 1
https://alexanderwong.com/post/improvingdeepneuralnetworksweek1/
Fri, 08 Dec 2017 15:21:37 -0600
https://alexanderwong.com/post/improvingdeepneuralnetworksweek1/
Taking the Coursera Deep Learning Specialization, Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization course. Will post condensed notes every week as part of the review process. All material originates from the free Coursera course, taught by Andrew Ng. See deeplearning.ai for more details.
Assumes you have knowledge of Neural Networks and Deep Learning.
Table of Contents Practical Aspects of Deep Learning Setting Up Your Machine Learning Application Train/Dev/Test Sets Bias/Variance Basic Recipe for Machine Learning Regularizing your Neural Network Regularization Why regularization reduces overfitting?

Neural Networks and Deep Learning, Week 4
https://alexanderwong.com/post/neuralnetworksanddeeplearningweek4/
Sat, 02 Dec 2017 15:21:37 -0600
https://alexanderwong.com/post/neuralnetworksanddeeplearningweek4/
Taking the Coursera Deep Learning Specialization, Neural Networks and Deep Learning course. Will post condensed notes every week as part of the review process. All material originates from the free Coursera course, taught by Andrew Ng. See deeplearning.ai for more details.
Assumes you have knowledge of Week 3.
Table of Contents Deep Neural Networks Deep Neural Network Deep L-layer neural network Forward Propagation in a Deep Network Getting your matrix dimensions right Why deep representations?

Neural Networks and Deep Learning, Week 3
https://alexanderwong.com/post/neuralnetworksanddeeplearningweek3/
Wed, 22 Nov 2017 15:21:37 -0600
https://alexanderwong.com/post/neuralnetworksanddeeplearningweek3/
Taking the Coursera Deep Learning Specialization, Neural Networks and Deep Learning course. Will post condensed notes every week as part of the review process. All material originates from the free Coursera course, taught by Andrew Ng. See deeplearning.ai for more details.
Assumes you have knowledge of Week 2.
Table of Contents Shallow Neural Networks Shallow Neural Network Neural Networks Overview Neural Network Representation Computing a Neural Network’s Output Vectorizing Across Multiple Examples Explanation for Vectorized Implementation Activation Functions Why do you need non-linear activation functions?

Neural Networks and Deep Learning, Week 2
https://alexanderwong.com/post/neuralnetworksanddeeplearningweek2/
Sat, 18 Nov 2017 15:21:37 -0600
https://alexanderwong.com/post/neuralnetworksanddeeplearningweek2/
Taking the Coursera Deep Learning Specialization, Neural Networks and Deep Learning course. Will post condensed notes every week as part of the review process. All material originates from the free Coursera course, taught by Andrew Ng. See deeplearning.ai for more details.
Assumes you have knowledge of Week 1.
Table of Contents Neural Networks Basics Logistic Regression as a Neural Network Binary Classification Logistic Regression Logistic Regression Cost Function Gradient Descent Derivatives More Derivatives Examples Computation Graph Derivatives with a Computation Graph Logistic Regression Gradient Descent Gradient Descent on m Examples Python and Vectorization Vectorization More Vectorization Examples Vectorizing Logistic Regression Vectorizing Logistic Regression’s Gradient Output Broadcasting in Python Note on Python/NumPy Vectors Neural Networks Basics Logistic Regression as a Neural Network Binary Classification Binary classification is basically answering a yes or no question.
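The vectorized logistic regression pass listed in this excerpt can be sketched as follows, using the column-per-example convention the course adopts; `predict` and its argument names are my own illustration, not the assignment code:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict(w, b, X):
    """Vectorized logistic regression forward pass.

    X is (n_features, m) with one example per column; w is (n_features, 1),
    b is a scalar. No explicit loop over the m examples is needed.
    """
    A = sigmoid(w.T @ X + b)        # (1, m) predicted probabilities
    return (A > 0.5).astype(int)    # the yes/no answer per example

w = np.array([[1.0], [-1.0]])
X = np.array([[3.0, 0.0],
              [0.0, 3.0]])          # two examples as columns
preds = predict(w, 0.0, X)
# first example scores sigmoid(3) > 0.5 -> 1; second scores sigmoid(-3) -> 0
```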

Neural Networks and Deep Learning, Week 1
https://alexanderwong.com/post/neuralnetworksanddeeplearningweek1/
Sat, 11 Nov 2017 15:21:37 -0600
https://alexanderwong.com/post/neuralnetworksanddeeplearningweek1/
Taking the Coursera Deep Learning Specialization, Neural Networks and Deep Learning course. Will post condensed notes every week as part of the review process. All material originates from the free Coursera course, taught by Andrew Ng. See deeplearning.ai for more details.
Table of Contents Introduction to Deep Learning What is a Neural Network Supervised Learning with Neural Networks Why is Deep Learning Taking Off? About this Course Optional: Heroes of Deep Learning (Geoffrey Hinton) Introduction to Deep Learning There are five courses in the Coursera Deep Learning Specialization.

Machine Learning, Week 11
https://alexanderwong.com/post/courseramachinelearningweek11/
Fri, 03 Nov 2017 11:21:37 -0600
https://alexanderwong.com/post/courseramachinelearningweek11/
Taking the Coursera Machine Learning course. Will post condensed notes every week as part of the review process. All material originates from the free Coursera course, taught by Andrew Ng.
Assumes you have knowledge of Week 10.
Table of Contents Application Example: Photo OCR Photo OCR Problem Description and Pipeline Sliding Windows Getting Lots of Data and Artificial Data Ceiling Analysis: What Part of the Pipeline to Work on Next Lecture notes: Lecture 18 Application Example: Photo OCR Photo OCR Problem Description and Pipeline Photo OCR (Optical Character Recognition) is the task of trying to recognize objects and characters (words and digits) in a given image.

Machine Learning, Week 10
https://alexanderwong.com/post/courseramachinelearningweek10/
Sun, 29 Oct 2017 19:21:37 -0600
https://alexanderwong.com/post/courseramachinelearningweek10/
Taking the Coursera Machine Learning course. Will post condensed notes every week as part of the review process. All material originates from the free Coursera course, taught by Andrew Ng.
Assumes you have knowledge of Week 9.
Table of Contents Large Scale Machine Learning Gradient Descent with Large Datasets Learning With Large Datasets Stochastic Gradient Descent Mini-Batch Gradient Descent Stochastic Gradient Descent Convergence Advanced Topics Online Learning Map Reduce and Data Parallelism Lecture notes: Lecture 17 Large Scale Machine Learning Gradient Descent with Large Datasets Learning With Large Datasets One of the best ways to get a high-performance machine learning system is to supply a lot of data to a low-bias learning algorithm (one that would otherwise overfit).
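The stochastic variant this excerpt names updates the parameters after every single example instead of after a full pass over the data. A minimal NumPy sketch for linear regression, with illustrative function and learning-rate choices of my own:

```python
import numpy as np

def sgd_linear_regression(X, y, alpha=0.1, epochs=200, seed=0):
    """Stochastic gradient descent for linear regression.

    X: (m, n) design matrix (first column of ones for the intercept),
    y: (m,) targets. One parameter update per training example.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(epochs):
        for i in rng.permutation(m):           # visit examples in random order
            err = X[i] @ theta - y[i]          # residual for one example
            theta -= alpha * err * X[i]        # gradient of that example's cost
    return theta

# Noiseless data y = 2 * x, so SGD should recover theta close to [0, 2]
X = np.column_stack([np.ones(50), np.linspace(0.0, 1.0, 50)])
y = 2.0 * X[:, 1]
theta = sgd_linear_regression(X, y)
```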

Machine Learning, Week 9
https://alexanderwong.com/post/courseramachinelearningweek9/
Sun, 22 Oct 2017 14:21:37 -0600
https://alexanderwong.com/post/courseramachinelearningweek9/
Taking the Coursera Machine Learning course. Will post condensed notes every week as part of the review process. All material originates from the free Coursera course, taught by Andrew Ng.
Assumes you have knowledge of Week 8.
Table of Contents Anomaly Detection Density Estimation Problem Motivation Gaussian Distribution Algorithm Building an Anomaly Detection System Developing and Evaluating an Anomaly Detection System Anomaly Detection vs. Supervised Learning Choosing What Features to Use Multivariate Gaussian Distribution Algorithm Recommender Systems Predicting Movie Ratings Problem Formulation Content Based Recommendations Collaborative Filtering Collaborative Filtering Algorithm Low Rank Matrix Factorization Vectorization: Low Rank Matrix Factorization Implementational Detail: Mean Normalization Lecture notes: Lecture 15, Lecture 16 Anomaly Detection Density Estimation Problem Motivation Imagine being a manufacturer of aircraft engines.

Machine Learning, Week 8
https://alexanderwong.com/post/courseramachinelearningweek8/
Sat, 14 Oct 2017 09:21:37 -0600
https://alexanderwong.com/post/courseramachinelearningweek8/
Taking the Coursera Machine Learning course. Will post condensed notes every week as part of the review process. All material originates from the free Coursera course, taught by Andrew Ng.
Assumes you have knowledge of Week 7.
Table of Contents Unsupervised Learning Clustering Introduction K-Means Algorithm Optimization Objective Random Initialization Choosing the Number of Clusters Dimensionality Reduction Motivation Data Compression Visualization Principal Component Analysis Principal Component Analysis Problem Formulation Principal Component Analysis Algorithm Applying PCA Reconstruction from Compressed Representation Choosing the Number of Principal Components Advice for Applying PCA Lecture notes: Lecture 13, Lecture 14 Unsupervised Learning Clustering Introduction Unsupervised learning is the class of problems where, given a set of data with no labels, the goal is to find structure in the dataset.
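The K-Means loop named in this table of contents alternates two steps: assign each point to its nearest centroid, then move each centroid to the mean of its assigned points. An illustrative NumPy version (not the course's Octave code):

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Plain K-Means clustering on an (m, n) data matrix X."""
    rng = np.random.default_rng(seed)
    # Random initialization: pick k distinct training points as centroids
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Distance from every point to every centroid: (m, k)
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)              # cluster assignment step
        for j in range(k):                     # move-centroid step
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids, labels

# Two well-separated blobs: centroids should land at (0,0) and (10,10)
X = np.vstack([np.zeros((5, 2)), np.full((5, 2), 10.0)])
centroids, labels = kmeans(X, k=2)
```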

Four States of Being
https://alexanderwong.com/post/fourstatesofbeing/
Thu, 05 Oct 2017 21:12:55 -0600
https://alexanderwong.com/post/fourstatesofbeing/
There are only four states of being, or identity. Awareness, I (self), Dream (other), Universe (all).
Awareness This is the state you are born into, the state of being when you are first conscious of external stimuli. As an entity with awareness, the only requirement is that one can acknowledge receiving some form of flow, or energy.
Examples: A baby crying. An insect navigating around.
I (Self-Awareness) This is the state in which you begin to recognize yourself.

Machine Learning, Week 7
https://alexanderwong.com/post/courseramachinelearningweek7/
Wed, 04 Oct 2017 15:38:02 -0600
https://alexanderwong.com/post/courseramachinelearningweek7/
Taking the Coursera Machine Learning course. Will post condensed notes every week as part of the review process. All material originates from the free Coursera course, taught by Andrew Ng.
Assumes you have knowledge of Week 6.
Table of Contents Support Vector Machines Large Margin Classification Optimization Objective Large Margin Intuition Kernels Support Vector Machines (in Practice) Lecture notes: Lecture 12 Support Vector Machines Large Margin Classification Optimization Objective We are simplifying the logistic regression cost function by converting the sigmoid function into two straight lines, as shown here:
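The two straight lines referred to here are the SVM's piecewise-linear surrogates for the sigmoid-based costs, usually written cost1(z) for the y = 1 case and cost0(z) for the y = 0 case. A sketch with a unit slope, which is illustrative since the lecture only fixes the piecewise-linear shape:

```python
import numpy as np

def cost1(z):
    """Surrogate for the y = 1 logistic cost: zero once z >= 1,
    a straight line below that."""
    return np.maximum(0.0, 1.0 - z)

def cost0(z):
    """Surrogate for the y = 0 logistic cost: zero once z <= -1,
    a straight line above that."""
    return np.maximum(0.0, 1.0 + z)

# Confident correct predictions cost nothing; others are penalized linearly.
vals = [cost1(2.0), cost1(0.0), cost0(-2.0), cost0(0.0)]
```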

Machine Learning, Week 6
https://alexanderwong.com/post/courseramachinelearningweek6/
Wed, 20 Sep 2017 15:38:02 -0600
https://alexanderwong.com/post/courseramachinelearningweek6/
Taking the Coursera Machine Learning course. Will post condensed notes every week as part of the review process. All material originates from the free Coursera course, taught by Andrew Ng.
Assumes you have knowledge of Week 5.
Table of Contents Advice for Applying Machine Learning Evaluating a Learning Algorithm Evaluating a Hypothesis Model Selection and Train/Validation/Test Sets Diagnosing Bias versus Variance Regularization and Bias/Variance Learning Curves Deciding What to Do Next Machine Learning System Design Building a Spam Classifier Prioritizing What to Work On Error Analysis Machine Learning Practical Tips How to Handle Skewed Data When to Utilize Large Data Sets Lecture notes: Lecture 10, Lecture 11 Advice for Applying Machine Learning Evaluating a Learning Algorithm Evaluating a Hypothesis Once we have done some troubleshooting for errors in our predictions by:

Machine Learning, Week 5
https://alexanderwong.com/post/courseramachinelearningweek5/
Mon, 18 Sep 2017 15:38:02 -0600
https://alexanderwong.com/post/courseramachinelearningweek5/
Taking the Coursera Machine Learning course. Will post condensed notes every week as part of the review process. All material originates from the free Coursera course, taught by Andrew Ng.
Assumes you have knowledge of Week 4.
Table of Contents Neural Networks: Learning Cost Function and Backpropagation Cost Function Backpropagation Algorithm Backpropagation Intuition Backpropagation in Practice Implementation Note: Unrolling Parameters Gradient Checking Random Initialization Putting it Together Application of Neural Networks Autonomous Driving Lecture notes: Lecture 9 Neural Networks: Learning Cost Function and Backpropagation Cost Function Let’s define a few variables that we will need to use.

Machine Learning, Week 4
https://alexanderwong.com/post/courseramachinelearningweek4/
Tue, 12 Sep 2017 12:47:44 -0600
https://alexanderwong.com/post/courseramachinelearningweek4/
Taking the Coursera Machine Learning course. Will post condensed notes every week as part of the review process. All material originates from the free Coursera course, taught by Andrew Ng.
Assumes you have knowledge of Week 3.
Table of Contents Neural Networks: Representation Motivations Non-linear Hypothesis Neurons and the Brain Neural Networks Model Representation I Model Representation II Applications Examples and Intuitions I Examples and Intuitions II Multiclass Classification Lecture notes: Lecture 8 Neural Networks: Representation Motivations Non-linear Hypothesis Neural networks are another class of learning algorithm, existing alongside linear regression and logistic regression.

Machine Learning, Week 3
https://alexanderwong.com/post/courseramachinelearningweek3/
Thu, 07 Sep 2017 00:04:44 -0600
https://alexanderwong.com/post/courseramachinelearningweek3/
Taking the Coursera Machine Learning course. Will post condensed notes every week as part of the review process. All material originates from the free Coursera course, taught by Andrew Ng.
Assumes you have knowledge of Week 2.
Table of Contents Logistic Regression Classification and Representation Classification Hypothesis Representation Decision Boundary Logistic Regression Model Cost Function Simplified Cost Function and Gradient Descent Advanced Optimization Multiclass Classification Multiclass Classification: One-vs-all Regularization Solving the Problem of Overfitting The Problem of Overfitting Cost Function Regularized Linear Regression Regularized Logistic Regression Lecture notes: Lecture 6, Lecture 7 Logistic Regression Classification and Representation Classification Recall that classification involves a hypothesis function that returns a discontinuous output (a common example was predicting whether a tumor is benign or malignant based on its size).

Machine Learning, Week 2
https://alexanderwong.com/post/courseramachinelearningweek2/
Thu, 31 Aug 2017 14:05:35 -0600
https://alexanderwong.com/post/courseramachinelearningweek2/
Taking the Coursera Machine Learning course. Will post condensed notes every week as part of the review process. All material originates from the free Coursera course, taught by Andrew Ng.
Assumes you have knowledge of Week 1.
Table of Contents Linear Regression with Multiple Variables Multivariate Linear Regression Multiple Features Gradient Descent for Multiple Variables Gradient Descent in Practice: Feature Scaling & Mean Normalization Gradient Descent in Practice: Learning Rate Features and Polynomial Regression Computing Parameters Analytically Normal Equation Normal Equation Non-invertibility Optional Octave/MATLAB Tutorial Octave Tutorial Basic Operations Moving Data Around Computing on Data Plotting Data Functions & Control Statements: for, while, if/elseif/else Vectorization Lecture notes: Lecture 4, Lecture 5 Linear Regression with Multiple Variables Multivariate Linear Regression Multiple Features Linear regression with multiple variables is known as Multivariate Linear Regression.
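The normal equation named in this excerpt computes the multivariate linear regression parameters in closed form, theta = (XᵀX)⁻¹Xᵀy. A one-line NumPy sketch of mine; using `pinv` also covers the non-invertibility case the same lecture discusses (redundant or excess features):

```python
import numpy as np

def normal_equation(X, y):
    """Closed-form least-squares fit: theta = pinv(X'X) X'y.

    X: (m, n) design matrix with a leading column of ones, y: (m,) targets.
    """
    return np.linalg.pinv(X.T @ X) @ X.T @ y

# Exact data y = 1 + 2*x1 + 3*x2, so theta should come out [1, 2, 3]
X = np.array([[1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [1.0, 1.0, 1.0]])
y = np.array([1.0, 3.0, 4.0, 6.0])
theta = normal_equation(X, y)
```

Unlike gradient descent, this needs no learning rate or feature scaling, but inverting an n-by-n matrix becomes slow when the number of features is large.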

Machine Learning, Week 1
https://alexanderwong.com/post/courseramachinelearningweek1/
Thu, 31 Aug 2017 10:25:51 -0600
https://alexanderwong.com/post/courseramachinelearningweek1/
Taking the Coursera Machine Learning course. Will post condensed notes every week as part of the review process. All material originates from the free Coursera course, taught by Andrew Ng.
Table of Contents Introduction Machine Learning What is Machine Learning Supervised Learning Unsupervised Learning Linear Regression with One Variable Model Representation Cost Function & Intuitions Gradient Descent Gradient Descent for Linear Regression Optional Linear Algebra Linear Algebra Review Matrices and Vectors Matrix Addition and Scalar Operations Matrix-Vector Multiplication Matrix-Matrix Multiplication Matrix Multiplication Properties Inverse and Transpose Lecture notes: Lecture 1, Lecture 2, Lecture 3 Introduction Machine Learning What is Machine Learning Arthur Samuel (1959): the field of study that gives computers the ability to learn without being explicitly programmed.

Hello World
https://alexanderwong.com/post/helloworld/
Sat, 12 Aug 2017 14:25:51 -0600
https://alexanderwong.com/post/helloworld/
Although as individuals we are always in transition, I believe it is necessary to have static markers representing our current state in reference to the universe. We are mortal: we were born into this world, and inevitably we will die. Consider this post to mark the beginning of a quarter-life transition. I figure that the best way to measure and document this is to have an open, publicly available record of my goals and my journey in pursuing meaning in life.