Backpropagation And Gradient Descent In Neural Networks | Neural Network Tutorial | Simplilearn
- Published Jun 4, 2019
- 🔥Artificial Intelligence Engineer Program (Discount Coupon: YTBE15): www.simplilearn.com/artificia...
🔥Professional Certificate Program In AI And Machine Learning: www.simplilearn.com/pgp-ai-ma...
This video on backpropagation and gradient descent will cover the basics of how backpropagation and gradient descent play a role in training neural networks, using an example of recognizing handwritten digits with a neural network. After predicting the results, you will see how to train the network using backpropagation to obtain results with high accuracy. Backpropagation is the process of updating the parameters of a network to reduce the error in prediction. You will also understand how to calculate the loss function to measure the error in the model. Finally, you will see, with the help of a graph, how to find the minimum of a function using gradient descent.
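As a companion to the description above, here is a minimal sketch of gradient descent finding the minimum of a simple function. The function, learning rate, and step count are my own illustrative choices, not taken from the video:

```python
# Minimize f(w) = (w - 3)**2, whose derivative is f'(w) = 2 * (w - 3).
def gradient_descent(lr=0.1, steps=100):
    w = 0.0                  # arbitrary starting point
    for _ in range(steps):
        grad = 2 * (w - 3)   # slope of the loss at the current w
        w -= lr * grad       # step opposite the slope
    return w

print(gradient_descent())    # converges toward the minimum at w = 3
```

The same loop drives neural-network training; the only difference is that the gradient of the loss is computed for every weight via backpropagation.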
🔥Free Deep Learning Course: www.simplilearn.com/introduct...
To learn more about Deep Learning, subscribe to our UA-cam channel: ua-cam.com/users/Simplile...
To access the slides, click here: www.slideshare.net/Simplilear...
Watch more videos on Deep Learning: • What is Deep Learning?...
#BackpropagationAndGradientDescent #BackpropagationInNeuralNetworks #Backpropagation #BackpropagationAlgorithm #BackpropagationExample #DeepLearningTutorial #DataScience #SimplilearnDeepLearning #DeepLearningCourse
Simplilearn’s Deep Learning course will transform you into an expert in deep learning techniques using TensorFlow, the open-source software library designed to conduct machine learning & deep neural network research. With our deep learning course, you'll master deep learning and TensorFlow concepts, learn to implement algorithms, build artificial neural networks and traverse layers of data abstraction to understand the power of data, preparing you for your new role as a deep learning scientist.
With Simplilearn’s Deep Learning course, you will prepare for a career as a Deep Learning engineer as you master concepts and techniques including supervised and unsupervised learning, mathematical and heuristic aspects, and hands-on modeling to develop algorithms. Those who complete the course will be able to:
1. Understand the concepts of TensorFlow, its main functions, operations and the execution pipeline
2. Implement deep learning algorithms, understand neural networks and traverse the layers of data abstraction which will empower you to understand data like never before
3. Master and comprehend advanced topics such as convolutional neural networks, recurrent neural networks, training deep networks and high-level interfaces
4. Build deep learning models in TensorFlow and interpret the results
5. Understand the language and fundamental concepts of artificial neural networks
6. Troubleshoot and improve deep learning models
7. Build your own deep learning project
8. Differentiate between machine learning, deep learning and artificial intelligence
We recommend this deep learning online course particularly for the following professionals:
1. Software engineers
2. Data scientists
3. Data analysts
4. Statisticians with an interest in deep learning
Learn more at: www.simplilearn.com/deep-lear...
🔥🔥 Interested in Attending Live Classes? Call Us: IN - 18002127688 / US - +18445327688
For more information about Simplilearn’s courses, visit:
- Facebook: / simplilearn
- LinkedIn: / simp. .
- Website: www.simplilearn.com
Get the Android app: bit.ly/1WlVo4u
Get the iOS app: apple.co/1HIO5J0
Do you have any questions on this topic? Please share your feedback in the comment section below and we'll have our experts answer it for you. Thanks for watching the video. Cheers!
I think you mixed up "positive slope" with "negative slope". "A negative slope means that two variables are negatively related; that is, when x increases, y decreases, and when x decreases, y increases"
After wasting several hours trying to understand these two concepts, this video finally explained them exceptionally well. Thank you.
I love these short segments on Deep Learning please keep them coming!
We are glad you found our video helpful, Drue. Like and share our video with your peers, and don't forget to subscribe to our channel so you don't miss video updates. We will be coming up with more such videos. Cheers!
This video is awesome, but there are some mistakes: at 7:53 it should be a negative slope, and at 8:21 the weight needs to be increased instead of reduced.
Thank you so much for bringing this to our attention. We reported this right away to the relevant department.
Great video! But I do have some questions.
1) At 1:20 Why don't some of the probabilities add up to 1?
2) At 6:11 Isn't the slope negative? From what I understand, the slope indicates how a line changes at a point. At 6:11 the line is going down, so the slope would be negative. At 6:17 the line is going up, so the slope would be positive. If I remember correctly from Google's ML crash course, we move in the opposite direction of the slope (multiply by -1).
I got the same questions, and I think you are right about 2), but I don't know why 1) happened either.
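For the slope question in this thread, here is a tiny sketch of the sign convention the commenter describes (my own example, not from the video): with the update `w -= lr * slope`, a negative slope pushes the weight up and a positive slope pushes it down.

```python
# One gradient-descent step: move opposite the slope of the loss.
def step(w, slope, lr=0.5):
    return w - lr * slope

print(step(2.0, -4.0))  # negative slope -> w increases (2.0 -> 4.0)
print(step(2.0, +4.0))  # positive slope -> w decreases (2.0 -> 0.0)
```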
I'm confused; I think you mixed up positive and negative slopes here. A positive slope should mean a decrease in the weight, and vice versa.
According to the example mentioned in the video, a positive slope means an increase in the weight, and vice versa. It is correct though.
@rainyy22 You are correct
Agreed.
Nice! I wish you had explained exactly how the weights change in backpropagation.
Hi, thanks for the feedback. We shall share your concerns with the relevant department.
yes exactly, I watched the whole video to understand that only, however, it is not there
Amazing lesson, very clear and helpful explanation.
WooHoo! We are so happy you love our videos. Please do keep checking back in. We put up new videos every week on all your favorite topics. Whenever you have the time, you must also check out our blog page @simplilearn.com and tell us what you think. Have a good day!
Really good video!
However, I keep feeling like some of the most essential things are being left out, namely what actually happens when the weights are being "updated". Let's say we have the three losses like in your example:
1. 0.49
2. 0.25
3. 0.04
Now when I update the weights using these losses, what is actually happening?
Am I going through all the weights and computing w*l? And in that case, which of the three losses?
Or is it being subtracted or added, or what in the world is going on???
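For what it's worth, the textbook answer to the question above (the video doesn't spell it out): the update multiplies the learning rate by the *gradient* of the loss with respect to each weight, obtained via the chain rule, not by the loss value itself. A hedged sketch for a single linear neuron with squared-error loss, my own example rather than the video's network:

```python
# One weight, one training example: prediction w*x, loss L = (w*x - y)**2.
def update_weight(w, x, y, lr=0.1):
    pred = w * x
    grad = 2 * (pred - y) * x   # dL/dw by the chain rule
    return w - lr * grad        # subtract the gradient, scaled by lr

w = 0.5
for _ in range(50):
    w = update_weight(w, x=2.0, y=4.0)   # target: w*2 == 4, so w -> 2
print(round(w, 3))
```

So the loss values 0.49, 0.25, 0.04 are not multiplied into the weights directly; each one's derivative with respect to each weight determines the size and sign of that weight's update.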
So a point on the graph is the result of one training example? If we only have one example, we wouldn't know a gradient, because you need two points for that, right? So we can't train the net with one image?
How do you know if it is a positive slope or a negative slope?
Nice explanation
Keep watching
Why is the sound so bad on this video?
We are sorry about that Stephanie, we will share the feedback with the relevant department
Great.
We are glad you found our video helpful. Like and share our video with your peers, and don't forget to subscribe to our channel so you don't miss video updates. We will be coming up with more such videos. Cheers!
Thanks a lot. Really nice video.
Most welcome!
This video is ML for beginners; it does not actually explain the math behind doing gradient descent on a neural network, just the concept.
Thanks for watching our video and sharing your thoughts. Do subscribe to our channel and stay tuned for more. Cheers!