Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization

  • Rating: 4.9
Approx. 22 hours to complete

Course Summary

This course teaches you how to build deep neural networks and improve your deep learning skills. You will learn about Adam, Dropout, BatchNorm, Xavier/He initialization, and more; the broader Deep Learning Specialization that this course belongs to also covers convolutional networks, recurrent networks, and LSTMs.
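
The summary above mentions Xavier/He initialization among the techniques covered. As a quick illustration only, and not code taken from the course assignments, here is a minimal NumPy sketch of He initialization for ReLU layers; the layer sizes are hypothetical.

    import numpy as np

    # Shared RNG so the example is reproducible.
    rng = np.random.default_rng(0)

    def he_init(fan_in, fan_out):
        # He initialization: draw weights from N(0, 2/fan_in) so that the
        # variance of ReLU activations stays roughly constant across layers.
        return rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_out, fan_in))

    # Hypothetical layer sizes (784 -> 128 -> 64 -> 10), not from the course.
    layer_dims = [784, 128, 64, 10]
    weights = [he_init(layer_dims[l], layer_dims[l + 1]) for l in range(len(layer_dims) - 1)]
    biases = [np.zeros((n, 1)) for n in layer_dims[1:]]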

Key Learning Points

  • Gain a deep understanding of deep neural networks and their applications
  • Learn how to build various types of neural networks from scratch
  • Improve your deep learning skills and stay up-to-date with the latest techniques

Job Positions & Salaries that people who have taken this course might have

    • USA: $112,000
    • India: ₹1,200,000
    • Spain: €42,000

    • USA: $117,000
    • India: ₹1,000,000
    • Spain: €36,000

    • USA: $138,000
    • India: ₹2,400,000
    • Spain: €56,000

Learning Outcomes

  • Build and train deep neural networks from scratch
  • Understand the inner workings of neural networks and how to optimize them
  • Apply deep learning techniques to real-world problems

Prerequisites or good-to-have knowledge before taking this course

  • Basic knowledge of Python programming
  • Familiarity with calculus and linear algebra

Course Difficulty Level

Intermediate

Course Format

  • Online self-paced course
  • Video lectures with quizzes and programming assignments

Similar Courses

  • Machine Learning
  • Neural Networks and Deep Learning
  • Applied Data Science with Python

Description

In the second course of the Deep Learning Specialization, you will open the deep learning black box to understand the processes that drive performance and generate good results systematically.

Outline

  • Practical Aspects of Deep Learning
  • Train / Dev / Test sets
  • Bias / Variance
  • Basic Recipe for Machine Learning
  • Regularization
  • Why Regularization Reduces Overfitting?
  • Dropout Regularization (see the sketch after this outline)
  • Understanding Dropout
  • Other Regularization Methods
  • Normalizing Inputs
  • Vanishing / Exploding Gradients
  • Weight Initialization for Deep Networks
  • Numerical Approximation of Gradients
  • Gradient Checking
  • Gradient Checking Implementation Notes
  • Yoshua Bengio Interview
  • Connect with your Mentors and Fellow Learners on Discourse!
  • Clarification about Upcoming Regularization Video
  • Clarification about Upcoming Understanding Dropout Video
  • Lectures in PDF
  • How to Refresh your Workspace
  • Practical aspects of Deep Learning
  • Optimization Algorithms
  • Mini-batch Gradient Descent
  • Understanding Mini-batch Gradient Descent
  • Exponentially Weighted Averages
  • Understanding Exponentially Weighted Averages
  • Bias Correction in Exponentially Weighted Averages
  • Gradient Descent with Momentum
  • RMSprop
  • Adam Optimization Algorithm (see the sketch after this outline)
  • Learning Rate Decay
  • The Problem of Local Optima
  • Yuanqing Lin Interview
  • Clarification about Upcoming Adam Optimization Video
  • Clarification about Learning Rate Decay Video
  • Lectures in PDF
  • Optimization Algorithms
  • Hyperparameter Tuning, Batch Normalization and Programming Frameworks
  • Tuning Process
  • Using an Appropriate Scale to pick Hyperparameters
  • Hyperparameters Tuning in Practice: Pandas vs. Caviar
  • Normalizing Activations in a Network
  • Fitting Batch Norm into a Neural Network
  • Why does Batch Norm work?
  • Batch Norm at Test Time
  • Softmax Regression
  • Training a Softmax Classifier
  • Deep Learning Frameworks
  • TensorFlow
  • Clarification about Upcoming Normalizing Activations in a Network Video
  • Clarifications about Upcoming Softmax Video
  • (Optional) Learn about Gradient Tape and More
  • Lectures in PDF
  • References
  • Acknowledgments
  • Hyperparameter tuning, Batch Normalization, Programming Frameworks
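
Outline items such as Dropout Regularization, RMSprop, and Adam Optimization Algorithm refer to standard techniques. The sketch below is a minimal NumPy illustration of inverted dropout and a single Adam update step; variable names, shapes, and hyperparameter values are hypothetical, and it is not code from the course assignments.

    import numpy as np

    rng = np.random.default_rng(1)

    def dropout_forward(a, keep_prob=0.8):
        # Inverted dropout: zero out units with probability (1 - keep_prob),
        # then divide by keep_prob so the expected activation is unchanged.
        mask = (rng.random(a.shape) < keep_prob).astype(a.dtype)
        return a * mask / keep_prob, mask

    def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
        # One Adam update: momentum-style (m) and RMSprop-style (v) exponentially
        # weighted averages of the gradient, with bias correction at step t.
        m = beta1 * m + (1 - beta1) * grad
        v = beta2 * v + (1 - beta2) * grad ** 2
        m_hat = m / (1 - beta1 ** t)
        v_hat = v / (1 - beta2 ** t)
        w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
        return w, m, v

    # Tiny usage example with made-up shapes.
    a = rng.normal(size=(4, 3))
    a_drop, _ = dropout_forward(a, keep_prob=0.8)
    w, grad = rng.normal(size=(4, 3)), rng.normal(size=(4, 3))
    m, v = np.zeros_like(w), np.zeros_like(w)
    w, m, v = adam_step(w, grad, m, v, t=1)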

Summary of User Reviews

This course on deep neural networks is highly rated among users for its comprehensive coverage of the subject matter and its practical exercises. Many users especially appreciated the clear explanations and real-world examples provided throughout the course.

Key Aspect Users Liked About This Course

Clear explanations and real-world examples

Pros from User Reviews

  • Comprehensive coverage of deep neural networks
  • Practical exercises to reinforce learning
  • Clear explanations and real-world examples
  • Engaging and knowledgeable instructors
  • Great for beginners and intermediate learners

Cons from User Reviews

  • Some users found the course content too basic
  • Lack of advanced topics
  • No hands-on projects
  • Some technical issues with the platform
  • Limited interaction with instructors

Course Details

  • Language: English
  • Availability: Available now
  • Instructors: Andrew Ng (Top Instructor), Kian Katanforoosh (Top Instructor), Younes Bensouda Mourri (Top Instructor)
  • Provider: DeepLearning.AI
  • Platform: Coursera