Gradient descent simple explanation|gradient descent machine learning|gradient descent algorithm

  • Published 5 Oct 2024
  • Gradient descent simple explanation|gradient descent machine learning|gradient descent algorithm
    #gradientdescent #unfolddatascience
    Hello All,
    My name is Aman and I am a data scientist. In this video, I explain gradient descent piece by piece; my intention is to make gradient descent extremely simple to understand. Gradient descent is a very important algorithm for machine learning and deep learning, and a must-know topic for every data scientist. The following questions are answered in this video:
    1. What is gradient descent?
    2. How does gradient descent work?
    3. What is the gradient descent algorithm?
    4. What is gradient descent in machine learning?
    5. What is gradient descent in deep learning?
    6. How does the gradient descent algorithm work?
    About Unfold Data science: This channel helps people understand the basics of data science through simple examples, in an easy way. Anybody without prior knowledge of computer programming, statistics, machine learning, or artificial intelligence can get a high-level understanding of data science through this channel. The videos uploaded are not very technical in nature, so they can be easily grasped by viewers from different backgrounds as well.
    Join Facebook group :
    www.facebook.c...
    Follow on medium : / amanrai77
    Follow on quora: www.quora.com/...
    Follow on twitter : @unfoldds
    Get connected on LinkedIn : / aman-kumar-b4881440
    Follow on Instagram : unfolddatascience
    Watch Introduction to Data Science full playlist here : • Data Science In 15 Min...
    Watch python for data science playlist here:
    • Python Basics For Data...
    Watch statistics and mathematics playlist here :
    • Measures of Central Te...
    Watch End to End Implementation of a simple machine learning model in Python here:
    • How Does Machine Learn...
    Learn Ensemble Model, Bagging and Boosting here:
    • Introduction to Ensemb...
    Access all my codes here:
    drive.google.c...
    Have a question for me? Ask me here : docs.google.co...
    My Music: www.bensound.c...

COMMENTS • 482

  • @DS_AIML
    @DS_AIML 4 years ago +34

    My question is: when we calculate the partial derivative with respect to 'c' and 'm', we should treat one as a constant. For example,
    to calculate the partial derivative of the cost function J with respect to c, ∂J/∂c, we should treat 'm' as a constant. So the above calculation should be like this: -2[2 - (c+m)] + (-2)[4-(c+3m)] => -2[2-(c)] + (-2)[4-(c)] => -2[2] - 2[4] => -4 - 8 => -12.
    Please confirm

    • @diobrando1253
      @diobrando1253 4 years ago +1

      Yep, when we calculate w.r.t. c, m is constant, and vice versa.

    • @brajeshanand
      @brajeshanand 3 years ago +10

      Hi Anjani... why is it -2[2-(c+m)] as the derivative of [2-(c+m·1)]^2? Don't you think it should be 2[2-(c+m)] from the differentiation rule?

    • @soumyapatil4991
      @soumyapatil4991 3 years ago +16

      Why -2? I still didn't get it... it should be 2[2-(c)] + 2[4-(c)], right?

    • @manisharvinds895
      @manisharvinds895 3 years ago +1

      Could you please elaborate on your derivative method, @Anjani Kumar? I guess the value -4 in the video is correct.

    • @sajjadabdulmalik4265
      @sajjadabdulmalik4265 3 years ago +1

      Why -2?
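
    A quick way to settle the partial derivatives debated in this thread is to compute them symbolically. The sketch below is not from the video; it assumes the two-point cost implied by the comment above, J(c, m) = [2 - (c + m·1)]^2 + [4 - (c + m·3)]^2, i.e. the data points (1, 2) and (3, 4). Note that sympy holds the other symbol constant without setting it to zero.

    ```python
    # Hypothetical verification sketch (not from the video): symbolic partial
    # derivatives of the two-point cost discussed in this thread.
    import sympy as sp

    c, m = sp.symbols('c m')
    J = (2 - (c + m*1))**2 + (4 - (c + m*3))**2

    dJ_dc = sp.diff(J, c)  # chain rule brings in a factor of -1, hence the -2[...] terms
    dJ_dm = sp.diff(J, m)  # here m varies and c is treated as a constant

    print(sp.expand(dJ_dc))  # 4*c + 8*m - 12, i.e. -2*[2-(c+m)] + (-2)*[4-(c+3*m)]
    print(sp.expand(dJ_dm))  # 8*c + 20*m - 28
    ```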

  • @pankajgoikar4158
    @pankajgoikar4158 11 months ago +5

    This is the first time I'm learning about Gradient Descent, and I understood how the algorithm works. This video is amazing. Thank you so much.

  • @brijkishortiwari2077
    @brijkishortiwari2077 4 years ago +73

    Hi Aman sir, I am a PhD scholar (almost completed) from IIT Madras, and for the last few months, just out of interest, I have been exploring DS, ML and DL (though I am not from a CS background) and landed on some of your videos on UDS, which really increased my curiosity to learn more. Though I have explored a lot of online videos and many other sources on ML, including Coursera etc., I can say that your explanations are extremely good for understanding the subject conceptually. I cannot stop myself from appreciating your great effort; you are doing a really great job for ML/DS aspirants. Thanks a lot for all that you are doing.

  • @aparnasingh4096
    @aparnasingh4096 3 years ago +26

    Went through lots of articles but didn't understand the core. But your video made it clear within 15 minutes :) Just awesome keep up the good work :)

    • @UnfoldDataScience
      @UnfoldDataScience 3 years ago +2

      Thanks Aparna, your comments are very valuable for me.

  • @143balug
    @143balug 4 years ago +11

    This is one of the best explanation videos about Gradient Descent; I like your detailed explanation. Looking forward to more videos on various optimizers.
    Thank you

    • @UnfoldDataScience
      @UnfoldDataScience 4 years ago

      Thanks a lot, Bala. Yes, I will create videos on those topics as well.

  • @thaitran7272
    @thaitran7272 4 months ago +1

    I like how you simply talk and explain. Especially, your pace is right for everyone to understand such a tough concept like gradient descent.

  • @fatriantobong
    @fatriantobong 2 months ago

    THE best explanation so far: short, concise and precise. The algorithm: minimizing the loss function by finding the optimal parameters.

  • @melihulugyldz9861
    @melihulugyldz9861 1 year ago +3

    The info you gave at around 7:00 was very insightful. It shows why the gradient always points in the direction of steepest ascent. Thank you.

  • @VaradGheware-g2d
    @VaradGheware-g2d 7 days ago

    I was confused about the raw concept of the gradient. Your video helped me understand it.
    Thanks for such an informative video.

  • @ManiKandan-lx9zg
    @ManiKandan-lx9zg 3 years ago +2

    The greatest lecture on gradient descent I have ever seen... thank you so much for sharing your knowledge, sir ❤️

  • @T1s_kashyap
    @T1s_kashyap 7 months ago +2

    How come I understood everything... Thank you so much, sir ❤️❤️❤️❤️❤️.

  • @paonsgraphics9750
    @paonsgraphics9750 2 years ago +2

    You're just amazing! Anyone can understand gradient descent by watching this video. Thanks!

  • @rdcreations4u209
    @rdcreations4u209 3 years ago +5

    Sir, my suggestion is to explain all topics as per the JNTUH B.Tech syllabus. You will get so many views for sure 👍 Your explanation is extraordinary.

    • @UnfoldDataScience
      @UnfoldDataScience 3 years ago +3

      Thank you for the motivation :) I will check with the university 🤣

  • @aaronlopes5256
    @aaronlopes5256 3 years ago +7

    New value = old value - learning rate * slope
    This made me understand the whole concept within seconds.
    Thank you, Sir!
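
    That one-line rule is the whole algorithm. A minimal Python sketch of it on a toy one-parameter function (the function, starting point and learning rate below are illustrative choices, not values from the video):

    ```python
    # Sketch of "new value = old value - learning rate * slope" on a toy function.

    def slope(x):
        """Derivative of f(x) = (x - 3)**2, whose minimum is at x = 3."""
        return 2 * (x - 3)

    x = 0.0              # old value: an arbitrary starting guess
    learning_rate = 0.1

    for _ in range(50):
        x = x - learning_rate * slope(x)   # new value = old value - learning rate * slope

    print(round(x, 4))   # converges towards 3.0, the minimiser of (x - 3)**2
    ```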

  • @charlottenouwen8553
    @charlottenouwen8553 1 year ago

    Your YouTube channel has been extremely helpful in preparing for job interviews. I just landed a data science job at McKinsey. Thank you!

  • @chriseyebagha9454
    @chriseyebagha9454 3 years ago +4

    This is the best explanation I have seen on gradient descent. I mean, I already had a good idea of what it was, but you took me to a different level. Thank you!

    • @UnfoldDataScience
      @UnfoldDataScience 3 years ago

      Glad it was helpful Chris :)

    • @AlgoTribes
      @AlgoTribes 3 years ago

      couldn't agree more..😂😂

    • @AlgoTribes
      @AlgoTribes 3 years ago

      I was recently impressed by deeplizard explaining these concepts... now this content makes me feel more confident than I could have imagined 😀😀

  • @madial382
    @madial382 11 days ago

    You are the most wonderful teacher I have ever seen.

  • @ansh6848
    @ansh6848 3 years ago +1

    Wow! Awesome explanation. I was not getting it after studying it for 3-4 days, but after watching your video it seems pretty easy to me.

  • @himanshirawat1806
    @himanshirawat1806 2 years ago

    Simply wow. After a month, today I finally understood Gradient Descent. Thank you so much for the video 😊

  • @raj345to
    @raj345to 3 years ago +1

    Brother, you taught this with so much love, I thoroughly enjoyed it!!!

  • @DURGASREECHOWDHARYKOMMINI
    @DURGASREECHOWDHARYKOMMINI 9 months ago

    I have gone through lots of explanations and did not understand. But through this video, I got the confidence to continue my learning. Super, sir, thank you.

  • @gayathrigirishnair7405
    @gayathrigirishnair7405 2 years ago +1

    This finally helped me to understand Gradient Descent. Thank You.

  • @bahkeith7357
    @bahkeith7357 2 years ago

    I don't know what this world would be without Indian YouTubers. Thank you very much; at least I got something.

  • @0SIGMA
    @0SIGMA 3 years ago +2

    Damn... I don't know how far your knowledge extends, but I am finding it very easy to understand with fun real-world examples. I am your new subscriber. 🎉❤️

  • @elcheapo9444
    @elcheapo9444 1 year ago +1

    Your teaching method is masterful! None of the books I read go to such depths. Thanks!!

  • @GayatriDuwarah
    @GayatriDuwarah 3 years ago +1

    Well explained. Tomorrow I have my project.

  • @suresh9031
    @suresh9031 5 months ago +1

    great ... simple and best. Thank you.

  • @surajitlahiri
    @surajitlahiri 8 months ago

    One of the best explanations of gradient descent. Thank you so much. Very informative.

  • @UsmanKhan-tc4sk
    @UsmanKhan-tc4sk 3 years ago +1

    Very beneficial video, thank you so much. Love from Pakistan.

  • @omnnnooy3267
    @omnnnooy3267 1 month ago

    The first time I've understood this concept, thank you so much!!!

  • @drbobmathsonlinemathstuiti5226
    @drbobmathsonlinemathstuiti5226 3 years ago +1

    Thank you for a clear and detailed explanation of this topic

  • @evanshareef2397
    @evanshareef2397 3 years ago +1

    Excellent!!! I liked the video very much!!! You taught and asked exactly the questions I was looking for! And I was looking for such an explanation. Keep going, sir.

    • @UnfoldDataScience
      @UnfoldDataScience 3 years ago +1

      You are welcome Evan. Your comments are my motivation.

  • @farcim
    @farcim 1 year ago +2

    Great explanation

  • @sayansen11
    @sayansen11 4 years ago +1

    Awesome ... very simple explanation

  • @SaidElnaffar
    @SaidElnaffar 3 years ago +3

    You are a real master -- Liked, Subscribed, Shared!

  • @yoyomovieclips8813
    @yoyomovieclips8813 4 years ago +1

    Great work, sir. I finally understood it from your video. Thanks a lot.

  • @mohinimarathe8769
    @mohinimarathe8769 3 years ago

    You are just underrated, Aman. After Krish and Joshua, you are the one students in the Data Science community need to follow... seriously.

  • @buragohainmadhurima
    @buragohainmadhurima 3 years ago +1

    Awesome video. This is the best explanation. Please make more videos.

  • @AhmadSultan-yz6ek
    @AhmadSultan-yz6ek 3 years ago

    This is just amazing.... simple explanation of a complex thing.

  • @hitechsubho1
    @hitechsubho1 3 years ago

    Thank you for explaining it in simple language. Wonderful.

  • @Atuleric
    @Atuleric 3 years ago +1

    Thank you, sir... I request you to upload more such simplified videos on machine learning.

  • @ayush9psycho
    @ayush9psycho 3 years ago

    Very nice intuitive explanation, the best on YouTube!! Guys, don't mind the calculations, that's not the point of the tutorial!!

  • @jagadisheslavath4578
    @jagadisheslavath4578 3 years ago +1

    Thank you for the detailed explanation with simple example :)

  • @Elementiah
    @Elementiah 4 months ago +1

    Amazing video! How did you work out the slope value to be -4?

  • @akashmanojchoudhary3290
    @akashmanojchoudhary3290 3 years ago

    Thank you so much, Aman. I don't know why I didn't come across your videos till now. Keep up the good work. :)

    • @UnfoldDataScience
      @UnfoldDataScience 3 years ago

      So nice of you Akash. Pls share in Data Science groups if possible.

  • @cryptoin09
    @cryptoin09 3 years ago +1

    Awesome video...keep it up bro...loved it..concepts getting clear

  • @soundaravallidevarajan335
    @soundaravallidevarajan335 3 days ago

    Thank you. Well done

  • @ajaykushwaha4233
    @ajaykushwaha4233 3 years ago +1

    Awesome explanation.

  • @yatinarora9650
    @yatinarora9650 2 years ago +1

    super

  • @gnanaselvannallathambi9019
    @gnanaselvannallathambi9019 3 years ago

    I like your explanation; it is very smart.

  • @SHIVAMGUPTA-wb5mw
    @SHIVAMGUPTA-wb5mw 1 month ago

    Brother, I really enjoyed it.
    🫡🫡

  • @jayendramanikumar9211
    @jayendramanikumar9211 1 year ago

    Best explanation in the short time

  • @Shailendrakumar-cv4yv
    @Shailendrakumar-cv4yv 24 days ago

    Amazing explanation !

  • @mytalents260
    @mytalents260 3 months ago

    Loved your explanation brother..

  • @mayankdeep433
    @mayankdeep433 1 year ago

    Very well explained.

  • @malathip8986
    @malathip8986 4 years ago

    Neat simple explanation

  • @sheikhshah2593
    @sheikhshah2593 3 years ago +1

    Really a great lecture . Cleared many doubts.

  • @vcjayan8206
    @vcjayan8206 3 years ago +1

    Hi, simply explained, thanks

  • @anansipotpourri
    @anansipotpourri 2 years ago

    very good, thank you very much

  • @shivki23
    @shivki23 4 years ago +1

    Your channel should have more reach... I like your detailed explanation.

  • @edeabgetachew6054
    @edeabgetachew6054 2 years ago

    Good job

  • @anshikagupta3176
    @anshikagupta3176 3 years ago

    good explanation

  • @goundosidibe9964
    @goundosidibe9964 3 years ago

    Amazing. Thank you so much for this video. You included everything and it's very well explained.

    • @UnfoldDataScience
      @UnfoldDataScience 3 years ago

      You're very welcome Goundo. Please share the link within data science groups. Thank you.

  • @preranatiwary7690
    @preranatiwary7690 4 years ago +1

    Amazing.

  • @wathson_official
    @wathson_official 10 months ago

    Nice Explanation!
    Liked Your Simplicity...

  • @shahriaralom4547
    @shahriaralom4547 3 years ago +1

    thank you so much sir

  • @RAJI11000
    @RAJI11000 2 years ago

    Sir, please post videos on deep learning. You do a great job, sir. Amazing videos, sir.

    • @UnfoldDataScience
      @UnfoldDataScience 2 years ago

      Thanks for your positive feedback. Please share with others who could benefit from such content as well.

  • @sherlinchand2277
    @sherlinchand2277 2 years ago

    Very clear and wonderfully explained!

  • @raghuprasadkonandur-ekashalya
    @raghuprasadkonandur-ekashalya 4 years ago

    Very good video. Clear explanation

  • @prateekdahiya3991
    @prateekdahiya3991 1 year ago

    Thank you, Thank you, Thank you, Thank you, Thank you soooo much!

  • @FarhanAhmed-xq3zx
    @FarhanAhmed-xq3zx 3 years ago +1

    Thanks a lot..Awesome explanation
    💥💥👌👌👌

  • @arunmehta8234
    @arunmehta8234 3 years ago +1

    thanks sir ! Very helpful video.

  • @mandarmore.9635
    @mandarmore.9635 1 year ago +1

    thank you so much for making this video you are amazing

  • @mohammedalishaik4748
    @mohammedalishaik4748 4 years ago +3

    Hi Aman, thanks for sharing the video; it's very clear. I have a small doubt. I followed until we calculate the value of 'C'. After that, if we repeat the process by changing 'C' continuously, I observe that only the value of 'C' changes, and that's what the partial differentiation is meant for. My doubt is: how do we know where to stop the process, or how do we know which value of 'C' is the optimal one that will minimize the cost function?

    • @UnfoldDataScience
      @UnfoldDataScience 4 years ago +2

      Hi Mohammed, that is a very nice question. "When to stop" is decided by multiple things. First, the programmer gives a number of iterations, for example: stop after 1000 iterations. Second, stop when the gradient gets closest to zero. There are more ways to tell the machine "when to stop" iterating further. Pasting some links for your reference:
      math.stackexchange.com/questions/235922/what-stopping-criteria-to-use-in-projected-gradient-descent
      stats.stackexchange.com/questions/33136/how-to-define-the-termination-condition-for-gradient-descent
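
      A small sketch of those two stopping rules combined, on a toy one-parameter cost; the function, tolerance and learning rate here are illustrative assumptions, not from the video:

      ```python
      # Stop either after a fixed number of iterations or once the gradient is ~zero.

      def gradient(c):
          """Gradient of a toy cost J(c) = (c - 5)**2."""
          return 2 * (c - 5)

      c = 0.0
      learning_rate = 0.05
      max_iterations = 1000    # rule 1: iteration budget set by the programmer
      tolerance = 1e-6         # rule 2: treat the gradient as "close enough to zero"

      for iteration in range(max_iterations):
          g = gradient(c)
          if abs(g) < tolerance:           # gradient has effectively vanished
              break
          c = c - learning_rate * g

      print(iteration, round(c, 6))        # stops long before 1000 iterations, c is near 5
      ```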

  • @ShaidaMuhammad
    @ShaidaMuhammad 4 years ago +1

    Kindly explain the other optimizers also. I'm having difficulty with Adam, RMSprop, Adadelta, etc.

    • @UnfoldDataScience
      @UnfoldDataScience 4 years ago +1

      I will definitely do that in upcoming video. Happy Learning. Tc

  • @thisiswhy793
    @thisiswhy793 3 years ago +1

    Sir, a video on gradient checking please. By the way, amazing explanation, please keep it up.

  • @yogenderkushwaha5523
    @yogenderkushwaha5523 4 years ago +1

    Thanks man. Keep making such deep explanation videos related to ML. ♥️

    • @UnfoldDataScience
      @UnfoldDataScience 4 years ago +1

      My pleasure Yogender. Your comments are my motivation.

  • @anuradhabalasubramanian9845
    @anuradhabalasubramanian9845 2 years ago

    Absolutely amazed by your simplified teaching Sir ! Great going

  • @adsgcreation1581
    @adsgcreation1581 2 months ago

    At 11:29 you say y - (mx + c), the whole thing squared, but I think the real cost function is (mx+c)**2 - y; this is the correct one, I think.
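
    For what it's worth, if the intended form is [(mx + c) - y]^2, both versions give exactly the same cost, because squaring discards the sign of the residual:
    (y - (mx + c))^2 = ((mx + c) - y)^2, since (-a)^2 = a^2. Only the unsquared difference changes sign.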

  • @yashsinghal7471
    @yashsinghal7471 1 year ago

    very well explained thanks🙄🙄🙏🙏

  • @dr.himanimittal9880
    @dr.himanimittal9880 3 years ago +1

    Brilliant. Thank you

  • @sukhleenkaur7061
    @sukhleenkaur7061 6 months ago

    Awesome explanation Sir!! Great work!!

  • @samsricatjaidee405
    @samsricatjaidee405 5 months ago

    Thank you. You make me understand this.

  • @Mr_Sin99
    @Mr_Sin99 3 years ago

    Beautifully explained. Keep up the good work

  • @Rahulkumar-vt5jb
    @Rahulkumar-vt5jb 4 years ago +1

    I like the way you explain sir.

  • @parthshah6463
    @parthshah6463 3 years ago

    Correct me if I am wrong; it seems you missed dividing by n in the cost function. Check the video at the 9:25 mark. Overall, one of the best videos I've come across.
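
    For reference, the mean-squared-error cost is usually written with that 1/n factor: J(m, c) = (1/n) · Σᵢ (yᵢ - (m·xᵢ + c))². Dropping the constant 1/n rescales the gradient (and can be absorbed into the learning rate), but it does not move the location of the minimum.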

  • @zakikhurshid3843
    @zakikhurshid3843 1 year ago

    Great explanation. Thank you for this.

  • @chibuzorobiefuna4090
    @chibuzorobiefuna4090 2 years ago

    I don't know if I have bitten off more than I can chew by deciding to learn machine learning; this gradient descent is giving me a hard time. I am learning it on Coursera with the same issue; I will keep reading and hopefully I will understand it one day.

  • @csa5129
    @csa5129 4 years ago

    keep it up sir, good explanation

  • @krishnaprasad9378
    @krishnaprasad9378 3 years ago +1

    On the derivation step w.r.t. C & M, you took -2. Why is it -2?
    Because as per the derivative formula, x^2 becomes 2x (no negative sign there).

    • @UnfoldDataScience
      @UnfoldDataScience 3 years ago

      There is a slight mistake in the calculation of the partial derivative. I pinned a comment from a user, and that should be taken as the reference. Thanks for your feedback :)

  • @omeshamisuanigala4635
    @omeshamisuanigala4635 1 year ago +1

    When computing the partial derivative, where did you get the negative 2 from? The exponent 2 is positive, so how does it become negative when differentiating?
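
    The -2 comes from the chain rule, not from the exponent: d/dc [y - (mx + c)]² = 2·[y - (mx + c)] · d/dc[y - (mx + c)] = 2·[y - (mx + c)]·(-1) = -2·[y - (mx + c)]. The outer power rule contributes the 2; the inner derivative of -(mx + c) with respect to c contributes the minus sign.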

  • @bhavya2301
    @bhavya2301 3 years ago +1

    The Best Video. 😀

  • @Person-hb3dv
    @Person-hb3dv 2 years ago

    Well explained sir. Thank you

  • @krishnamore2281
    @krishnamore2281 3 years ago +1

    Really helpful!!!!!!

  • @srinivasrathodkethavath9545
    @srinivasrathodkethavath9545 2 years ago

    Very nice explanation.

  • @mihirnaik3383
    @mihirnaik3383 2 years ago +1

    Thank you, brother!

  • @bslnada9248
    @bslnada9248 1 year ago

    Thank you so much!!!! You are a great teacher.

  • @architasingh5659
    @architasingh5659 5 months ago

    Your explanation is awesome.

  • @ShaidaMuhammad
    @ShaidaMuhammad 4 years ago +1

    I already know Gradient Descent, but I'm still going to watch the whole video for some new insights into GD.

  • @BDKMITAnilkumar
    @BDKMITAnilkumar 2 years ago +1

    Good explanation, straight to the point 😄