Journey to the Bottom of the Valley

This short writeup is from Udacity's Deep Learning Nanodegree. Here I'll give you a little refresher on gradient descent so we can start training our network with MiniFlow. Remember that our goal is to make the network's output as close as possible to the target values by minimizing the cost. You can envision the cost as a hill or mountain, and we want to get to the bottom. Imagine your model parameters as a ball sitting on that hill. Intuitively, we want to push the ball downhill, and that makes sense, ...
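The "ball rolling downhill" intuition can be sketched in a few lines. This is a minimal, assumed example (not MiniFlow itself): we minimize the one-dimensional cost f(x) = x², stepping opposite the gradient at each iteration.

```python
def gradient_descent(x, learning_rate=0.1, steps=50):
    """Minimize f(x) = x**2 by repeated gradient steps."""
    for _ in range(steps):
        grad = 2 * x              # derivative of f(x) = x**2
        x -= learning_rate * grad # step downhill, opposite the gradient
    return x

x_min = gradient_descent(x=5.0)   # x moves toward 0, the bottom of the valley
```

The learning rate controls the step size: too small and the ball crawls, too large and it overshoots the valley floor.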

Read More


This post is based on the linear regression live coding session from Siraj Raval (Udacity).

What is Linear Regression?

Linear regression is a widely used method for modeling the relationship between a predictor variable and an outcome variable; in simpler words, it fits a straight line that describes how two sets of values relate. In this example, Siraj shows how to do linear regression using gradient descent, an iterative optimization algorithm widely used in machine learning. In simpler words, we use it to make our linear regression model fit the data as closely as possible. I won't get into the details of the code; if you are interested ...
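As a hedged sketch of the idea (the names `fit_line`, `m`, `b`, and the sample data are illustrative, not taken from the session's code): we fit the slope and intercept of a line by descending the gradient of the mean squared error.

```python
def fit_line(xs, ys, learning_rate=0.01, steps=5000):
    """Fit y = m*x + b by gradient descent on mean squared error."""
    m, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # partial derivatives of MSE with respect to m and b
        grad_m = (-2 / n) * sum(x * (y - (m * x + b)) for x, y in zip(xs, ys))
        grad_b = (-2 / n) * sum(y - (m * x + b) for x, y in zip(xs, ys))
        m -= learning_rate * grad_m
        b -= learning_rate * grad_b
    return m, b

# Illustrative data drawn from the line y = 2x + 1
m, b = fit_line([1, 2, 3, 4], [3, 5, 7, 9])
```

Each iteration nudges the slope and intercept in the direction that reduces the squared error, so the fitted values approach the underlying m = 2, b = 1.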

Read More


OK, so here it is: the full program. I'm so excited it's hard to describe.

Program Structure

The Deep Learning Nanodegree Foundation program is divided into five parts covering various topics in deep learning. The first two parts are available immediately upon starting the program; the remaining parts are released every two weeks.

Introduction

The first part is an introduction to the program, along with a couple of lessons covering the tools you'll be using. You'll also get a chance to apply some deep learning models to do cool things, like transferring the style of one artwork to another image. We'll start ...

Read More
