Gradient descent simple explanation|gradient descent machine learning|gradient descent algorithm
- Published 5 Oct 2024
#gradientdescent #unfolddatascience
Hello All,
My name is Aman and I am a data scientist. In this video I explain gradient descent piece by piece. My intention is to make gradient descent extremely simple to understand. Gradient descent, being a very important algorithm for machine learning and deep learning, is a must-know topic for every data scientist. The questions below are answered in this video:
1. What is gradient descent?
2. How does gradient descent work?
3. What is the gradient descent algorithm?
4. What is gradient descent in machine learning?
5. What is gradient descent in deep learning?
6. How does the gradient descent algorithm work?
About Unfold Data science: This channel helps people understand the basics of data science through simple examples in an easy way. Anybody without prior knowledge of computer programming, statistics, machine learning, or artificial intelligence can get a high-level understanding of data science through this channel. The videos uploaded are not very technical in nature and hence can be easily grasped by viewers from different backgrounds as well.
Join Facebook group :
www.facebook.c...
Follow on medium : / amanrai77
Follow on quora: www.quora.com/...
Follow on twitter : @unfoldds
Get connected on LinkedIn : / aman-kumar-b4881440
Follow on Instagram : unfolddatascience
Watch Introduction to Data Science full playlist here : • Data Science In 15 Min...
Watch python for data science playlist here:
• Python Basics For Data...
Watch statistics and mathematics playlist here :
• Measures of Central Te...
Watch End to End Implementation of a simple machine learning model in Python here:
• How Does Machine Learn...
Learn Ensemble Model, Bagging and Boosting here:
• Introduction to Ensemb...
Access all my codes here:
drive.google.c...
Have question for me? Ask me here : docs.google.co...
My Music: www.bensound.c...
My question is: when we calculate the partial derivative with respect to 'c' and 'm', we should consider one as constant. For example,
to calculate the partial derivative of the cost function J with respect to c (∂J/∂c), we should consider 'm' as constant. So the above calculation should be like this: -2[2 - (c+m)] + (-2)[4 - (c+3m)] => -2[2 - (c)] + (-2)[4 - (c)] => -2[2] - 2[4] => -4 - 8 => -12.
Please confirm.
Yep, when we calculate w.r.t. c, m is constant, and vice versa.
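For anyone following this thread, the partial derivatives can be checked numerically. A minimal sketch, assuming the cost discussed here is J(c, m) = [2 - (c + m)]² + [4 - (c + 3m)]², evaluated at the starting point c = m = 0 (the values are illustrative):

```python
# Finite-difference check of the partial derivatives of
# J(c, m) = (2 - (c + m))**2 + (4 - (c + 3*m))**2 at c = m = 0
def J(c, m):
    return (2 - (c + m))**2 + (4 - (c + 3*m))**2

h = 1e-6
c, m = 0.0, 0.0
dJ_dc = (J(c + h, m) - J(c - h, m)) / (2 * h)  # m held constant while c varies
dJ_dm = (J(c, m + h) - J(c, m - h)) / (2 * h)  # c held constant while m varies

# Analytic chain-rule results at (0, 0):
# dJ/dc = -2*(2 - (c+m)) - 2*(4 - (c+3m)) = -4 - 8  = -12
# dJ/dm = -2*(2 - (c+m)) - 6*(4 - (c+3m)) = -4 - 24 = -28
print(round(dJ_dc, 3), round(dJ_dm, 3))  # -12.0 -28.0
```

Note that "considering m as constant" means holding m fixed at its current value while c varies, not deleting it from the expression; at the starting point c = m = 0 both readings happen to give -12 here.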
Hi Anjani... why is it -2[2-(c+m)] as the derivative of [2-(c+m·1)]^2? Don't you think it should be 2[2-(c+m)] from the derivative rule?
Why -2? I still didn't get it... it should be 2[2-(c)] + 2[4-(c)], right?
Could you please elaborate on your derivative method, @Anjani Kumar? I guess the value -4 in the video is correct.
Why -2?
This is the first time I'm learning about Gradient Descent, and I understood how the algorithm works. This video is amazing. Thank you so much.
You're very welcome Pankaj.
Hi Aman sir, I am a PhD scholar (almost completed) from IIT Madras, and for the last few months, just out of interest, I have been exploring DS, ML and DL (though I am not from a CS background) and landed on some of your videos on UDS, which really increased my curiosity to learn more about it. Though I have explored a lot of online videos and many other sources on ML, including Coursera etc., I can say that your explanations are extremely good for understanding the subject conceptually. I just cannot stop myself from appreciating your great effort; you are doing a really great job for the aspirants of ML/DS. Thanks a lot for all that you are doing.
Your comments are my motivation Brij. Thanks a lot.
Thanks for your feedback
Congratulations
Went through lots of articles but didn't understand the core. But your video made it clear within 15 minutes :) Just awesome keep up the good work :)
Thanks Aparna, your comments are very valuable for me.
This is one of the best explanation videos about Gradient Descent; I like your detailed explanation. Looking forward to more videos on various optimizers.
Thank you
Thanks a lot Bala. Yes will create on those topics as well.
I like how you simply talk and explain. Especially, your pace is right for everyone to understand such a tough concept like gradient descent.
THE best explanation so far: short, concise and precise. The algorithm minimizes the loss function by finding the optimal parameters.
The info you have given at around 7:00 was very insightful. It shows why the gradient always points in the direction of steepest ascent. Thank you.
I was confused about the raw gradient concept. Your video helped me understand it.
thanks for such an informative video
Great great lecture for gradient descent I have ever seen....thank u so much for sharing ur knowledge sir ❤️
So nice of you Mani. your comments mean a lot.
Now I understood everything... Thank you so much sir ❤️❤️❤️❤️❤️.
You're just amazing! Anyone can understand gradient descent by watching this video. Thanks!
Sir, my suggestion is to explain all topics as per the JNTUH B.Tech syllabus. You will get so many views for sure 👍 Your explanation is extraordinary.
Thank you for the motivation :), I will check with the university 🤣
New value = old value - learning rate * slope
this made me understand the whole concept within seconds.
Thank you Sir!
Welcome Aaron.
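The one-line update rule quoted in the comment above is the whole algorithm. A minimal sketch, using a toy function f(x) = (x - 3)² of my own choosing (not from the video), whose slope is f'(x) = 2(x - 3):

```python
# Gradient descent on f(x) = (x - 3)**2, which has its minimum at x = 3
def slope(x):
    return 2 * (x - 3)  # derivative of f

x = 0.0              # old value (starting guess)
learning_rate = 0.1
for _ in range(100):
    x = x - learning_rate * slope(x)  # new value = old value - learning rate * slope

print(round(x, 4))  # converges close to 3.0
```

Each iteration moves x a small step against the slope, so the updates shrink as the minimum gets closer.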
Your YouTube channel has been extremely helpful in preparing for job interviews. I just landed a data science job at McKinsey. Thank you!
Best of luck Charlotte.
This is the best explanation that I have seen on Gradient descent, i mean, I already had a great idea of what it was but you took me to a different level. Thank you!
Glad it was helpful Chris :)
couldn't agree more..😂😂
I was recently impressed by deeplizard explaining these concepts... now this content makes me feel safer than I could have imagined 😀😀
You are the most wonderful teacher I have ever seen.
Wow! Awesome explanation. I was not getting it after studying it for 3-4 days, but after watching your video it seems pretty easy to me.
Thanks for watching.
Simply wow. After a month, today I finally understood Gradient Descent. Thank you so much for the video 😊
Welcome Himanshi. Means a lot to me.
Brother, you taught this with such care, I really enjoyed it!!!
Thanks Manu.
I have gone through lots of explanations and did not understand. But through this video, I got the confidence to continue my learning. Super, sir, thank you.
Keep watching
This finally helped me to understand Gradient Descent. Thank You.
Thanks a lot.
I don't know what this world would be without Indian YouTubers. Thank you very much; at least I got something.
Glad you found it helpful
Damn... I don't know how far your knowledge extends, but I am finding it very easy to understand with fun real-world examples. I am your new subscriber. 🎉❤️
Wow, thanks Aaron.
Your teaching method is masterful! None of the books I read go to such depths. Thanks!!
You're very welcome!
Well explained. Tomorrow I have my project.
Thanks Gayatri.
great ... simple and best. Thank you.
One of the best explanations of gradient descent. Thank you so much. Very informative.
Thanks Surajit.
Very beneficial video, thank you so much. Love from Pakistan.
Thanks Usman. Stay safe. Tc
The first time I understand this concept, thank you so much!!!
Glad to hear I was able to help
Thank you for a clear and detailed explanation of this topic
Glad it was helpful!
Excellent!!! I liked the video very much!!! You taught and asked questions exactly as I was hoping, and I was looking for such an explanation. Go ahead, sir.
You are welcome Evan. Your comments are my motivation.
Great explanation
Awesome ... very simple explanation
Thank you. Happy learning. Tc
You are a real master -- Liked, Subscribed, Shared!
Thanks a lot Said.
Same 🔥
Great work sir, I finally understood it from your video. Thanks a lot.
You are most welcome
You are just underrated, Aman. After Krish and Joshua, you are the one students need to follow in the Data Science community... seriously.
Thanks Mohini for motivating me.
Awesome video. This is the best explanation. Please make more videos.
Thanks Madhurima.
This is just amazing.... simple explanation of a complex thing.
Thanks Ahmad for watching.
Thank you for explaining it in simple language. Wonderful.
You are welcome Subhendu.
Thank you sir... I request you to upload such simplified videos on machine learning.
Thanks Atul.
Very nice intuitive explanation, the best on YouTube!! Guys, don't mind the calculations; that's not the point of the tutorial!!
Glad you liked it Ayush.
Thank you for the detailed explanation with simple example :)
Welcome Jagadish.
Amazing video! How did you work out the slope value to be -4?
Thank you so much Aman. I don't know why I didn't come across your videos till now. Keep up the good work. :)
So nice of you Akash. Pls share in Data Science groups if possible.
Awesome video...keep it up bro...loved it..concepts getting clear
Thanks Debasish.
Thank you. Well done
Awesome explanation.
Thanks Ajay, hope you are doing well and staying safe.
super
Thanks Yatin.
I like your explanation; it is very smart.
Thanks a lot.
Brother, I really enjoyed it.
🫡🫡
Best explanation in the short time
Amazing explanation !
Loved your explanation brother..
Very well explained.
Glad it was helpful!
Neat simple explanation
Glad you liked it Malathi.
Really a great lecture . Cleared many doubts.
Thanks Sheikh.
Hi, simply explained, thanks
Welcome :)
very good, thank you very much
Thank you too!
Your channel should have more reach... I like your detailed explanation.
Thanks Shiv. Happy Learning. Tc
Good job
Thanks.
good explanation
Keep watching Anshika :)
Amazing. Thank you soo much for this video . You included everything and its very well explained.
You're very welcome Goundo. Please share the link within data science groups. Thank you.
Amazing.
Thanks a lot :)
Nice Explanation!
Liked Your Simplicity...
thank you so much sir
Welcome Shahriar.
Sir, please post videos on deep learning. You do a great job, sir. Amazing videos, sir.
Thanks for your positive feedback. Please share with others as well who could benefit from such content.
Very clear and wonderfully explained!
Very good video. Clear explanation
Glad you liked it
Thank you, Thank you, Thank you, Thank you, Thank you soooo much!
Thanks a lot..Awesome explanation
💥💥👌👌👌
Thanks Farhan :)
thanks sir ! Very helpful video.
Welcome Arun
thank you so much for making this video you are amazing
Thanks Mandar.
Hi Aman, thanks for sharing the video; it's very clear. I have a small doubt. I followed up to where we calculate the value of 'C'. After that, if we repeat the process, changing 'C' continuously, I observe only the value of 'C' changing, which is what the partial differentiation is for. My doubt is: how do we know where to stop the process, or how do we know which value of 'C' is optimal and will minimize the cost function?
Hi Mohammed, that is a very nice question. "When to stop" is decided by multiple things. First, the programmer sets a number of iterations, for example: stop after 1000 iterations. Second, stop when the gradient approaches zero. There are more ways to tell the machine when to stop iterating further. Pasting some links for your reference:
math.stackexchange.com/questions/235922/what-stopping-criteria-to-use-in-projected-gradient-descent
stats.stackexchange.com/questions/33136/how-to-define-the-termination-condition-for-gradient-descent
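The two stopping rules mentioned in that reply (an iteration cap and a near-zero gradient) can be sketched together. A minimal example; the toy cost (c - 5)², the tolerance, and the iteration cap are my own illustrative choices, not from the video:

```python
# Gradient descent with two stopping criteria:
# 1) a maximum number of iterations, 2) gradient magnitude near zero
def grad(c):
    return 2 * (c - 5)   # gradient of the toy cost (c - 5)**2

c = 0.0
learning_rate = 0.1
max_iters = 1000         # stopping rule 1: iteration cap
tolerance = 1e-8         # stopping rule 2: |gradient| threshold

for i in range(max_iters):
    g = grad(c)
    if abs(g) < tolerance:           # gradient close to zero: (near) the minimum
        break
    c = c - learning_rate * g        # standard update rule

print(round(c, 6))  # close to 5.0
```

In practice the loop usually exits on whichever criterion triggers first; the cap guarantees termination even when the tolerance is never reached.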
Kindly explain the other optimizers also. I'm having difficulty with Adam, RMSprop, Adadelta, etc.
I will definitely do that in upcoming video. Happy Learning. Tc
Sir, a video on gradient checking please. By the way, amazing explanation; please keep it up.
Sure.
Thanks man. Keep making such deep explanation videos related to ML. ♥️
My pleasure Yogender. Your comments are my motivation.
Absolutely amazed by your simplified teaching Sir ! Great going
Thanks Anu.
At 11:29 you are saying [y - (mx + c)], the whole thing squared, but I think the real cost function is [(mx + c) - y]^2; that is the correct one.
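Worth noting for the comment above: once the difference is squared, the order doesn't matter, since (y - (mx + c))² = ((mx + c) - y)². A quick numeric check with made-up values:

```python
# (a - b)**2 == (b - a)**2, so both orderings give the same squared error
y, m, x, c = 4.0, 1.0, 3.0, 0.5   # illustrative values
pred = m * x + c
print((y - pred)**2 == (pred - y)**2)  # True
```

So both forms of the cost function lead to the same gradients and the same gradient descent updates.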
very well explained thanks🙄🙄🙏🙏
Brilliant. Thank you
Welcome.
Awesome explanation Sir!! Great work!!
Thank you. You make me understand this.
Beautifully explained. Keep up the good work
Thank you James, I will
I like the way you explain sir.
Thanks a lot Rahul :)
Correct me if I am wrong: it seems you missed dividing by n in the cost function. Check the video at the 9:25 mark. Overall, one of the best videos I have ever come across.
Thanks for letting me know Parth. Will Check.
Great explanation. Thank you for this.
I don't know if I have bitten off more than I can chew by deciding to learn machine learning; this gradient descent is giving me a hard time. I am learning it on Coursera with the same issue. I will keep reading; hopefully I will understand it one day.
You can do it!
keep it up sir, good explanation
Keep watching. Thanks a lot.
On the derivation step w.r.t. C & M, you took -2; why is it -2?
Because as per the derivative formula, x^2 becomes 2x (no negative sign there).
There is a slight mistake in the calculation of the partial derivative. I pinned a comment from a user, and that should be taken as the reference. Thanks for your feedback :)
When computing the partial derivative, where did you get the negative 2 from? The exponent 2 is positive; how is it negative when differentiating?
We have discussed this in the pinned comment.
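For the recurring "why -2" question in these comments: the minus comes from the chain rule. Differentiating [2 - (c + m)]² with respect to c gives 2[2 - (c + m)] times the derivative of the inner term 2 - (c + m), which is -1, hence -2[2 - (c + m)]. A quick numeric check of that single squared term, at illustrative values of c and m:

```python
# Chain rule check: d/dc of (2 - (c + m))**2 is -2*(2 - (c + m)), not +2*(...)
def term(c, m):
    return (2 - (c + m))**2

c, m, h = 1.0, 0.5, 1e-6
numeric = (term(c + h, m) - term(c - h, m)) / (2 * h)
analytic = -2 * (2 - (c + m))   # outer factor 2*(...) times inner derivative -1
print(round(numeric, 4), round(analytic, 4))  # -1.0 -1.0
```

Dropping the inner -1 is exactly the mistake that produces the +2[...] guess in the questions above.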
The Best Video. 😀
Thanks Bhavya.
Well explained sir. Thank you
You are welcome
Really helpful!!!!!!
Thanks Krishna.
Very nice explanation.
Thanks
Thank you brother!
Welcome Mihir
thank you so much !!!! you are a great teacher
You're very welcome! pls share with friends
Your explanation is awesome.
I already know Gradient Descent, but I am still going to watch the whole video for some new insights on GD.
Thanks for watching. Stay Safe. tc
Good explanation, straight to the point 😄
Thank you