Deep Learning Udacity Nanodegree Review

I recently graduated from the Deep Learning Nanodegree at Udacity and wanted to share my thoughts:

  • Motivation
  • Who is this for?
  • Course structure
  • Conclusion


I have been a self-taught mobile developer for more than 6 years; it has always been a hobby for me.

I am also a Computer Science engineer. In my last year at university I spent a long time in the hospital and couldn't properly focus on Artificial Intelligence (AI from now on). In order to finish my degree on time and not repeat my final year, I had to study quickly on my own and focus only on passing the exams, without properly learning AI.

2020 has been a year of ups and downs. I have been working from home more than ever, and I thought it would be a good opportunity to refresh my AI knowledge, as I am saving more than an hour of commuting time every day.

My end goal is to become an AI expert. Going back to university for a master's degree whilst holding a full-time job is a no-go for me at the moment, so I started researching online learning platforms. I came across courses in many places, but most of them were rather academic, and I am a person who likes to learn by practising with real examples.

I found that Udacity stood out amongst the platforms: it offers a great community, real-world examples, private coaching, career services, every project reviewed by a real person, and full flexibility whilst you hold a full-time job. I thought: this is what I need! I found two Nanodegrees aligned with my career goals.

I then realised that I didn't meet the requirements to start either of them. Luckily, Udacity also offers the Deep Learning Nanodegree, which teaches the foundations for the more complex AI Nanodegrees.

Who is this for?

The Nanodegree is created for anyone with programming experience who is interested in machine learning, AI and/or deep learning. In order to graduate successfully, you need intermediate knowledge of Python, specifically NumPy and Pandas. Some calculus and linear algebra are also required; I recommend brushing up on derivatives and matrix operations.
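If you want a quick self-check on those prerequisites, this NumPy snippet (my own toy example, not course material) covers both a matrix product and a numerical derivative:

```python
import numpy as np

# Matrix multiplication: (2x3) @ (3x2) -> (2x2)
A = np.array([[1., 2., 3.],
              [4., 5., 6.]])
B = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])
C = A @ B  # [[4., 5.], [10., 11.]]

# The derivative of f(x) = x**2 at x = 3 is 2*x = 6;
# a central finite difference should agree closely.
f = lambda x: x ** 2
h = 1e-5
numeric = (f(3 + h) - f(3 - h)) / (2 * h)
print(C)
print(round(numeric, 4))  # 6.0
```

If either of these steps feels unfamiliar, a quick refresher before enrolling will pay off.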

Course Structure

What software will you use and/or learn?

  • Jupyter notebooks.
  • Anaconda.
  • OpenCV.
  • PyTorch.
  • NumPy.
  • Pandas and SciPy.
  • AWS; the final project is to deploy your own PyTorch model using SageMaker.

How long will it take?

The Nanodegree's estimated duration is 4 months at 12 hours per week. I enrolled with 4 months of access, so I had to finish within that time or pay for an extra month. In my case, having a full-time job made studying very difficult; I spent many weekends working solely on the Nanodegree, and I even took a whole week off just to focus on the more difficult parts (RNNs were hard, really hard, more on that later).

What I wish I had known before starting

Even though the Nanodegree is a beginner-friendly program, I recommend buying the amazing book Grokking Deep Learning (by Andrew W. Trask) and going through chapters one to six before even starting the Nanodegree. Udacity will advise you to buy it once you are enrolled and will offer a discount. If you are like me, a developer who had forgotten how to take derivatives or do matrix calculations and who needs step-by-step explanations, this book is your friend; it will be a good companion for the whole course.

Tip: I found it cheaper on Amazon.

Part 1: Introduction to Deep Learning

You will be introduced to deep learning by applying style transfer to images, and you will also get some experience with Anaconda and Jupyter notebooks.

Tip: I installed Jupyter and Anaconda as recommended, but I found it easier to use Google Colab; it even offers free GPUs and TPUs when you need raw power for complex problems.

Part 2: Neural Networks

Here is where the party really starts. This module is divided into 5 lessons, and by the end of it you will build your first neural network from scratch!

The project consists of predicting the number of bikeshare users on a given day.

  • Introduction to Neural Networks: You will learn the foundations of deep learning and neural networks, mostly the multi-layer perceptron (MLP).
  • Implementing Gradient Descent: In this lesson you will implement gradient descent and backpropagation in Python using NumPy and matrix multiplications.
  • Training Neural Networks: Here you will learn many techniques to improve training performance.
  • GPU Workspace Demo: A simple demonstration of the GPU workspace used at Udacity.
  • Sentiment Analysis: The author of Grokking Deep Learning will teach you to create a neural network for sentiment analysis.

This was one of the hardest parts of the whole Nanodegree; I spent a lot of time wrapping my head around backpropagation.

Tip: Do not progress if you don't fully understand backpropagation; you must know it like 2+2=4. Further modules assume you understand it, believe me, you must know it! Udacity offers plenty of resources to learn it, but you can also ask the community or get 1-to-1 help.
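To make the idea concrete, here is a minimal NumPy sketch of what the gradient-descent lesson builds up to: a tiny two-layer network learning XOR. The layer sizes, seed and learning rate are my own choices, not the course's.

```python
import numpy as np

# A tiny two-layer network trained on XOR with plain NumPy, to make
# the forward pass, backpropagation and gradient-descent update
# concrete. Layer sizes, seed and learning rate are toy choices.
rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def forward(X):
    h = sigmoid(X @ W1 + b1)    # hidden activations
    out = sigmoid(h @ W2 + b2)  # network prediction
    return h, out

loss_before = np.mean((forward(X)[1] - y) ** 2)

lr = 0.5
for _ in range(2000):
    h, out = forward(X)
    # Backpropagation: the chain rule, one layer at a time.
    d_out = (out - y) * out * (1 - out)  # error at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)   # error pushed back to the hidden layer
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

loss_after = np.mean((forward(X)[1] - y) ** 2)
print(loss_before, loss_after)  # the loss should shrink noticeably
```

Once each line of the backward pass makes sense to you, you are ready for the later modules.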

Part 3: Convolutional Neural Networks (CNNs)

I have to admit this was my favourite module; it was the easiest to understand and I flew through it! The project consists of building and training a CNN to identify and estimate dog breeds, pretty fun!

  • Convolutional Neural Networks: You will learn the basics of CNNs and how to improve performance in image classification.
  • GPU Workspace Demo: Another Udacity workspace demo.
  • Cloud Computing: How to use AWS for training neural networks on GPUs.
  • Transfer Learning: You will learn how to apply pre-trained networks to a new problem using transfer learning.
  • Weight Initialization: In this lesson, you'll learn how to find good initial weights for a neural network.
  • Autoencoders: A pretty fun lesson where you learn about autoencoders using PyTorch, image de-noising and dimensionality reduction.
  • Style Transfer: You will learn how to use a pre-trained network to extract content and style features from an image.
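As a taste of what the CNN lessons cover, here is a minimal "valid" 2-D convolution in plain NumPy (CNN layers actually compute cross-correlation, as sketched here); the image and kernel are toy examples of my own:

```python
import numpy as np

# A minimal "valid" 2-D convolution (cross-correlation, as CNN
# layers actually compute it), showing what a conv layer does
# before pooling and activation. Purely illustrative.
def conv2d(image, kernel):
    ih, iw = image.shape
    kh, kw = kernel.shape
    oh, ow = ih - kh + 1, iw - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Each output pixel is the sum of an image patch
            # weighted element-wise by the kernel.
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.arange(16, dtype=float).reshape(4, 4)
edge_kernel = np.array([[1., -1.],
                        [1., -1.]])  # responds to horizontal changes
feature_map = conv2d(image, edge_kernel)
print(feature_map.shape)  # (3, 3)
```

In the course you never hand-roll this loop (PyTorch's conv layers do it efficiently), but seeing it once makes the lessons much easier to follow.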

There is also an optional project for improving your GitHub profile: a member of the Udacity team will review it, point out things to improve, and help you start building a portfolio!

Tip: If you feel stuck with the project, feel free to check my solution, but remember: perseverance is the key!

Part 4: Recurrent Neural Networks (RNNs)

This was the hardest part of the whole Nanodegree for me. I really had a hard time understanding it, and there are still some areas of how LSTM networks work internally that I don't fully comprehend.

Fortunately, the project (Generate TV Scripts) doesn't require you to fully understand the internals of LSTM networks; PyTorch has everything implemented for you!

  • Recurrent Neural Networks: Here you will learn how memory can be incorporated into deep learning models using RNNs.
  • Long Short-Term Memory Networks (LSTMs): You will learn how LSTMs help with preserving long-term memory.
  • Implementation of RNN & LSTM: Here you will learn how to represent memory in code. Then you will define and train an RNN in PyTorch with sequential data.
  • Hyperparameters: In order to succeed with neural network training, hyperparameter tuning is a must-know area. In this lesson you will learn the intuitions to succeed.
  • Embeddings & Word2Vec: You will learn about embeddings (instead of using one-hot encoding) with Word2Vec.
  • Sentiment Prediction RNN: A small project for sentiment analysis prediction using RNNs.
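For the curious, the standard LSTM gate equations covered in the lessons (forget, input, candidate and output gates) can be sketched as a single forward step in NumPy; the shapes and random weights here are my own toy choices:

```python
import numpy as np

# One forward step of an LSTM cell in plain NumPy, following the
# standard gate equations. Shapes and weights are toy choices;
# in the course, PyTorch's nn.LSTM handles all of this for you.
rng = np.random.default_rng(0)
n_in, n_hid = 3, 5

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# One weight matrix per gate, acting on [h_prev, x] concatenated.
W = {g: rng.normal(0, 0.1, (n_hid + n_in, n_hid)) for g in "fico"}
b = {g: np.zeros(n_hid) for g in "fico"}

def lstm_step(x, h_prev, c_prev):
    z = np.concatenate([h_prev, x])
    f = sigmoid(z @ W["f"] + b["f"])        # forget gate
    i = sigmoid(z @ W["i"] + b["i"])        # input gate
    c_tilde = np.tanh(z @ W["c"] + b["c"])  # candidate cell state
    o = sigmoid(z @ W["o"] + b["o"])        # output gate
    c = f * c_prev + i * c_tilde            # new cell state (the "memory")
    h = o * np.tanh(c)                      # new hidden state
    return h, c

h, c = lstm_step(rng.normal(size=n_in), np.zeros(n_hid), np.zeros(n_hid))
print(h.shape, c.shape)  # (5,) (5,)
```

The key intuition: the cell state c is an additive memory lane, so gradients flow through time far better than in a vanilla RNN.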

Tip: Practice, practice and practice. Get 1-to-1 mentoring if needed and use the community for any doubts you have. Feel free to check my project if needed.

Part 5: Generative Adversarial Networks (GANs)

This was also one of my favourite parts of the Nanodegree. Every section was an amazing experience, but for some reason CNNs and GANs were easier to get my head around. Also, this section is full of fun projects!

  • Generative Adversarial Networks: The inventor of GANs, Ian Goodfellow, will introduce you to these models. You will also implement a project with GANs on the MNIST dataset to generate hand-written digits!
  • Deep Convolutional GANs: You will implement a Deep Convolutional GAN using CNNs on complex colour images of house numbers. I wonder if Google uses this in Google Maps?
  • Pix2Pix & CycleGAN: Here you will get an introduction to CycleGANs and Pix2Pix, and learn how to transfer style from one image to another; one example is making a horse look like a zebra.
  • Implementing a CycleGAN: A fun project to translate images from summer to winter and vice versa.
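The adversarial game underneath all of these projects boils down to two binary cross-entropy losses: the discriminator wants to output 1 on real samples and 0 on fakes, while the generator wants the discriminator to output 1 on its fakes. A tiny numeric sketch (the discriminator scores below are stand-in values, not from a real model):

```python
import numpy as np

# The GAN objective in miniature. D's outputs here are invented
# probabilities, just to show how each player's loss is computed.
def bce(pred, target):
    # binary cross-entropy, the loss both players use
    return -np.mean(target * np.log(pred) + (1 - target) * np.log(1 - pred))

d_real = np.array([0.9, 0.8])   # D's scores on real images (should -> 1)
d_fake = np.array([0.2, 0.1])   # D's scores on generated images (should -> 0)

# Discriminator loss: real samples labelled 1, fakes labelled 0.
d_loss = bce(d_real, np.ones(2)) + bce(d_fake, np.zeros(2))

# Generator loss (non-saturating form): it wants D to call its fakes real.
g_loss = bce(d_fake, np.ones(2))

print(round(d_loss, 3), round(g_loss, 3))
```

Training alternates between the two updates; everything else in the module (DCGANs, CycleGANs) builds on this tug-of-war.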

At the end of this section there is an optional project where you improve your LinkedIn profile; a member of the Udacity team will give you tips to improve it.

Tip: If you feel stuck with the project, feel free to check my solution and visit my LinkedIn profile for more tips.

Part 6: Deploying a Model

I felt this was the longest part of the journey, not because of the content, but because of working with Amazon services. Given the limited time I had for studying and the need to be careful with Amazon SageMaker, I had to start and shut down my notebook instance every time I was able to study, which took some time.

At the end of this module you will implement a web app where you can type a positive or negative movie review. The review is sent to your model (you will have to build this!) and it tells you whether the review is negative or positive, how cool is that!
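The real project trains an RNN and serves it behind a SageMaker endpoint, but the request/response idea can be sketched very crudely: text goes in, a sentiment label comes out. The vocabulary and weights below are entirely invented for illustration, not the trained model:

```python
import math

# A crude stand-in for the deployed sentiment model. The real
# project uses an RNN behind a SageMaker endpoint; this bag-of-words
# score with made-up weights just shows the inference shape.
weights = {"great": 1.5, "fun": 1.0, "boring": -1.2, "awful": -2.0}

def predict_sentiment(review: str) -> str:
    score = sum(weights.get(w.strip(".,!?").lower(), 0.0)
                for w in review.split())
    prob = 1 / (1 + math.exp(-score))  # squash the score into a probability
    return "positive" if prob >= 0.5 else "negative"

print(predict_sentiment("What a great, fun movie!"))  # positive
print(predict_sentiment("Boring and awful."))         # negative
```

The web app's job is just to ship the review string to the endpoint and render the label that comes back.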

  • Introduction to Deployment: A lot of cloud-computing terminology; you may skip it if you already know the field. You will also learn how deployment works with machine learning.
  • Building a Model using SageMaker: You will use SageMaker to build a model using the XGBoost algorithm to predict house prices in Boston, using both the low-level and high-level approaches in SageMaker.
  • Deploying and Using a Model: A fun project where you deploy a model using SageMaker and communicate with it from a web app.
  • Hyperparameter Tuning: Here you will use SageMaker's automatic hyperparameter tuning tools to find the best model for the Boston housing prices and sentiment analysis projects.
  • Updating a Model: You will learn how to automatically update your model without shutting down the currently deployed one.

Tip: You may check my solution in case you need some help with the project.


It was a very interesting and sometimes very demanding experience (I have a full-time job and family). It also got me back into studying something new; as a developer, it is hard to get involved in other tech areas.

I strongly recommend this Nanodegree to anyone interested in AI. I totally agree with Andrew Ng that "AI is the new electricity". It is a growing market that will be there in the long term, and at some point, no matter what you do, AI will be there!

One last thing: before applying, make sure you have enough time to commit to it. If you don't have any Python, maths or basic AI knowledge, I would recommend refreshing your Python and maths first. Once the clock starts, you have 4 months to finish; if you require more time, you will have to pay extra to complete it. It won't be easy, and sometimes you will need more than the recommended 12 hours per week. Overall, it is a great experience and you will have so much fun with all the projects and the supportive community!

What’s next for me?

Being part of the Udacity community has been great and I will continue learning. I am currently enrolled in:

  • AI For Healthcare Nanodegree!
  • TensorFlow for Mobile Devices
  • I have been given the opportunity to participate in the Bertelsmann Technology Scholarship program for the AI Product Manager track!

I am looking forward to the next four months!
