This course will get you started building your FIRST artificial neural network using deep learning techniques. Following my previous course on logistic regression, we take this basic building block and build full-on non-linear neural networks right out of the gate using Python and NumPy. All the materials for this course are FREE.
We extend the previous binary classification model to multiple classes using the softmax function, and we derive the all-important training method called "backpropagation" from first principles. I show you how to code backpropagation in NumPy, first "the slow way", and then "the fast way" using vectorized NumPy operations.
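To give a taste of the "slow way" vs. "fast way" comparison, here is a minimal sketch (my own illustration, not the course's actual code) of the softmax computed first with explicit Python loops and then with vectorized NumPy operations:

```python
import numpy as np

def softmax_slow(a):
    """Softmax 'the slow way': explicit Python loops over samples and classes."""
    out = np.zeros_like(a, dtype=float)
    for i in range(a.shape[0]):           # loop over samples
        row_max = max(a[i])               # subtract the max for numerical stability
        exps = [np.exp(x - row_max) for x in a[i]]
        total = sum(exps)
        for k in range(a.shape[1]):       # loop over classes
            out[i, k] = exps[k] / total
    return out

def softmax_fast(a):
    """Softmax 'the fast way': vectorized NumPy, no Python loops."""
    exps = np.exp(a - a.max(axis=1, keepdims=True))
    return exps / exps.sum(axis=1, keepdims=True)

# Hypothetical output-layer activations for 2 samples, 3 classes
activations = np.array([[1.0, 2.0, 3.0],
                        [0.5, 0.5, 0.5]])
print(softmax_fast(activations))  # each row sums to 1
```

Both versions produce identical probabilities; the vectorized one simply pushes the loops down into NumPy's compiled code.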
Next, we implement a neural network using Google's new TensorFlow library.
You should take this course if you are interested in starting your journey toward becoming a master at deep learning, or if you are interested in machine learning and data science in general. We go beyond basic models like logistic regression and linear regression, and I show you a model that automatically learns its own features.
This course provides you with many practical examples so that you can really see how deep learning can be used on anything. Throughout the course, we'll do a course project, which will show you how to predict user actions on a website given user data like whether or not that user is on a mobile device, the number of products they viewed, how long they stayed on your site, whether or not they are a returning visitor, and what time of day they visited.
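To make the prediction task concrete, here is a hedged sketch (my own illustration, not the course project's actual code) of how such user data might be encoded as a feature vector and pushed through a small feedforward network. The feature encoding, layer sizes, and the random untrained weights are all assumptions purely to show the mechanics; in the course the weights would be learned:

```python
import numpy as np

np.random.seed(0)

# Hypothetical encoding of the user data described above:
# [is_mobile, n_products_viewed, visit_duration_minutes, is_returning_visitor]
# plus a one-hot vector for time of day (4 buckets).
def make_features(is_mobile, n_products, duration, returning, time_bucket):
    time_onehot = np.zeros(4)
    time_onehot[time_bucket] = 1
    return np.concatenate(([is_mobile, n_products, duration, returning], time_onehot))

X = np.array([make_features(1, 3, 2.5, 0, 2),
              make_features(0, 7, 10.0, 1, 0)])

D, M, K = X.shape[1], 5, 4       # input dims, hidden units, user-action classes
W1, b1 = np.random.randn(D, M), np.zeros(M)   # untrained (random) weights
W2, b2 = np.random.randn(M, K), np.zeros(K)

Z = np.tanh(X.dot(W1) + b1)                   # hidden layer
A = Z.dot(W2) + b2                            # output activations
expA = np.exp(A - A.max(axis=1, keepdims=True))
P = expA / expA.sum(axis=1, keepdims=True)    # softmax: class probabilities
print(P)  # one row of probabilities per user; each row sums to 1
```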
Another project at the end of the course shows you how you can use deep learning for facial expression recognition. Imagine being able to predict someone's emotions just based on a picture!
After getting your feet wet with the fundamentals, I provide a brief overview of some of the newest developments in neural networks - slightly modified architectures and what they are used for.
If you already know about softmax and backpropagation, and you want to skip over the theory and speed things up using more advanced techniques along with GPU-optimization, check out my follow-up course on this topic, Data Science: Practical Deep Learning Concepts in Theano and TensorFlow.
I have other courses that cover more advanced topics, such as Convolutional Neural Networks, Restricted Boltzmann Machines, Autoencoders, and more! But you want to be very comfortable with the material in this course before moving on to more advanced subjects.
This course focuses on "how to build and understand", not just "how to use". Anyone can learn to use an API in 15 minutes after reading some documentation. It's not about "remembering facts", it's about "seeing for yourself" via experimentation. It will teach you how to visualize what's happening in the model internally. If you want more than just a superficial look at machine learning models, this course is for you.
All the code for this course can be downloaded from my GitHub: /lazyprogrammer/machine_learning_examples
In the directory: ann_class
Make sure you always "git pull" so you have the latest version!
HARD PREREQUISITES / KNOWLEDGE YOU ARE ASSUMED TO HAVE:
TIPS (for getting through the course):
USEFUL COURSE ORDERING:
COURSE CURRICULUM:

- Introduction and Outline (03:45) — overview of the course and prerequisites.
- Where does this course fit into your deep learning studies? (04:57)
- Deep Learning Readiness Test (05:33)
- Neural Networks with No Math (04:20) — an almost purely qualitative description of neural networks.
- Introduction to the E-Commerce Course Project (08:52)

Classifying more than 2 things at a time (14 lectures, 01:19:58):
- Prediction: Section Introduction and Outline (05:39)
- From Logistic Regression to Neural Networks (05:12)
- Interpreting the Weights of a Neural Network (08:05)
- Softmax (02:54) — what's the function we use to classify more than 2 things?
- Sigmoid vs. Softmax (01:30)
- Feedforward in Slow-Mo (part 1) (19:42)
- Feedforward in Slow-Mo (part 2) (10:55)
- Where to get the code for this course (01:30)
- Softmax in Code (03:39) — how do we code the softmax in Python?
- Building an entire feedforward neural network in Python (06:23) — we extend softmax and code the entire calculation from input to output.
- E-Commerce Course Project: Pre-Processing the Data (05:24)
- E-Commerce Course Project: Making Predictions (03:55)
- Prediction Quizzes (03:25)
- Prediction: Section Summary (01:45)

Training a neural network (12 lectures, 01:23:46):
- Training: Section Introduction and Outline (02:49)
- What do all these symbols and letters mean? (09:45)
- What does it mean to "train" a neural network? (06:15)
- Backpropagation Intro (11:53) — derivation of backpropagation from first principles: defining the objective function, taking the log, and differentiating the log with respect to the weights in each layer.
- Backpropagation - what does the weight update depend on? (04:47) — a further look into backpropagation.
- Backpropagation - recursiveness (04:37) — backpropagation for deeper networks, exposing the structure, and how to code it more efficiently.
- Backpropagation in code (17:07) — how to code backpropagation in Python using NumPy operations vs. slow for loops.
- The WRONG Way to Learn Backpropagation (03:52)
- E-Commerce Course Project: Training Logistic Regression with Softmax (08:11)
- E-Commerce Course Project: Training a Neural Network (06:19)
- Training Quiz (05:30)
- Training: Section Summary (02:41)

Practical Machine Learning (9 lectures, 42:27):
- Practical Issues: Section Introduction and Outline (01:43)
- Donut and XOR Review (01:06) — what are the donut and XOR problems again?
- Donut and XOR Revisited (04:21) — we look again at the XOR and donut problems from logistic regression; the features are now learned automatically.
- Neural Networks for Regression (11:38)
- Common nonlinearities and their derivatives (01:26) — sigmoid, tanh, and relu, along with their derivatives.
- Practical Considerations for Choosing Activation Functions (07:45)
- Hyperparameters and Cross-Validation (04:10) — tips on choosing the learning rate, regularization penalty, number of hidden units, and number of hidden layers.
- Manually Choosing Learning Rate and Regularization Penalty (04:08)
- Practical Issues: Section Summary (06:10)

TensorFlow, exercises, practice, and what to learn next (6 lectures, 41:35):
- TensorFlow plug-and-play example (07:31) — a look at Google's new TensorFlow library.
- Visualizing what a neural network has learned using TensorFlow Playground (11:35)
- Where to go from here (03:41) — what did you learn? What didn't you learn? Where can you learn more?
- You know more than you think you know (04:52)
- How to get good at deep learning + exercises (05:07)
- Deep neural networks in just 3 lines of code with Sci-Kit Learn (08:49)

Project: Facial Expression Recognition (8 lectures, 01:02:12):
- Facial Expression Recognition Project Introduction (04:51)
- Facial Expression Recognition Problem Description (12:21)
- The class imbalance problem (06:01)
- Utilities walkthrough (05:45)
- Facial Expression Recognition in Code (Binary / Sigmoid) (12:13)
- Facial Expression Recognition in Code (Logistic Regression Softmax) (08:57)
- Facial Expression Recognition in Code (ANN Softmax) (10:44)
- Facial Expression Recognition Project Summary (01:20)

Backpropagation Supplementary Lectures (5 lectures, 30:30):
- Backpropagation Supplementary Lectures Introduction (01:03)
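To give a flavor of what the backpropagation and XOR lectures build up to, here is a minimal sketch (my own illustration, not the course's ann_class code) that trains a one-hidden-layer network on the XOR problem with plain NumPy gradient descent; the hidden-layer size and learning rate are arbitrary assumptions:

```python
import numpy as np

np.random.seed(1)

# The XOR problem: not linearly separable, so logistic regression fails,
# but a single hidden layer of learned features can solve it.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([0, 1, 1, 0], dtype=float)  # XOR targets

M = 8     # hidden units (hypothetical choice)
lr = 0.2  # learning rate (hypothetical choice)
W1, b1 = np.random.randn(2, M), np.zeros(M)
W2, b2 = np.random.randn(M), 0.0

losses = []
for epoch in range(4000):
    # forward pass
    Z = np.tanh(X.dot(W1) + b1)               # hidden layer (tanh)
    Y = 1 / (1 + np.exp(-(Z.dot(W2) + b2)))   # sigmoid output

    # binary cross-entropy loss
    losses.append(-np.sum(T * np.log(Y) + (1 - T) * np.log(1 - Y)))

    # backward pass: gradients via the chain rule
    dY = Y - T                            # output delta
    gW2 = Z.T.dot(dY)
    gb2 = dY.sum()
    dZ = np.outer(dY, W2) * (1 - Z**2)    # tanh' = 1 - tanh^2
    gW1 = X.T.dot(dZ)
    gb1 = dZ.sum(axis=0)

    # vanilla gradient descent updates
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

print("loss went from %.3f to %.3f" % (losses[0], losses[-1]))
print("predictions:", np.round(Y))
```

The course derives each of these gradient expressions from first principles rather than asking you to take them on faith.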