Tutorial 26- Linear Regression Indepth Maths Intuition- Data Science

  • Published 25 Nov 2024

COMMENTS • 325

  • @nandinibalyapally3388
    @nandinibalyapally3388 4 years ago +94

    I never understood what gradient descent and a cost function are until I watched this video 🙏🙏

  • @mohitpatel7876
    @mohitpatel7876 4 years ago +33

    Best explanation of the cost function; we learned it as master's students and the course couldn't explain it as well.. simply brilliant

  • @navjotsingh8372
    @navjotsingh8372 2 years ago +3

    I have seen many teachers explain the same concept, but your explanations are next level. Best teacher.

  • @anuragmukherjee1878
    @anuragmukherjee1878 2 years ago +31

    For those who are confused:
    the derivative in the convergence formula is dJ/dm, i.e. the derivative of the cost J with respect to the slope m (see the sketch after this thread).

    • @tusharikajoshi8410
      @tusharikajoshi8410 1 year ago

      What's J in this? The Y values? I'm super confused about this d/dm of m, because it would just be 1. And I think m is just the total number of values. Shouldn't the slope be d/dx of y?

    • @mdmynuddin1888
      @mdmynuddin1888 1 year ago

      @@tusharikajoshi8410 it will be the cost or loss (J)

    • @mdmynuddin1888
      @mdmynuddin1888 1 year ago +2

      new(m) = m - d(loss or cost)/dm * alpha (learning rate).

    • @suhasiyer7317
      @suhasiyer7317 1 year ago

      Super helpful

    • @threads25
      @threads25 1 year ago

      I don't think so, because that's actually Newton's method.
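
To summarize this thread: J is the cost (loss) function and m is the slope of the line being fitted, so the derivative in the update rule is dJ/dm, not dm/dm (and plain gradient descent uses only this first derivative; Newton's method is the one that also uses the second). A minimal sketch in Python, assuming the MSE cost used in the video; the data and names here are illustrative:

```python
import numpy as np

# Toy data: y is roughly 2 * x
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 3.9, 6.2, 7.8])

def cost(m):
    """J(m) = (1/2N) * sum((y - m*x)^2): the cost as a function of the slope m."""
    n = len(x)
    return np.sum((y - m * x) ** 2) / (2 * n)

def dJ_dm(m):
    """Derivative of the cost w.r.t. the slope m (dJ/dm, not dm/dm)."""
    n = len(x)
    return -np.sum(x * (y - m * x)) / n

m = 0.0                    # initial guess for the slope
alpha = 0.05               # learning rate
m = m - alpha * dJ_dm(m)   # one gradient-descent update of the slope
```

Repeating that last line moves m toward the slope that minimizes J(m).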

  • @soumikdutta77
    @soumikdutta77 2 years ago +4

    Why am I not surprised by such a lucid and amazing explanation of the cost function, gradient descent, global minima, learning rate... maybe because watching you make complex things look easy and normal has become a habit of mine. Thank you SIR

  • @PritishMishra
    @PritishMishra 4 years ago +1

    I knew there would be an Indian who could make all this stuff easy!! Thanks Krish

  • @pjanjanam
    @pjanjanam 3 years ago +1

    A small comment at 17:35. I guess it is the derivative of J(m) with respect to m, in other words the rate of change of J(m) over a minute change in m. That gives us the slope at instantaneous points, especially for non-linear curves where the slope is not constant. At each point (m, J(m)), gradient descent travels in the opposite direction of the slope to find the global minimum, with a small learning rate. Please correct me if I am missing something.
    Thanks for a wonderful video on this concept @Krish, your videos are very helpful for understanding the math intuition behind the concepts. I am a super beneficiary of your videos, huge respect!!

  • @dhainik.suthar
    @dhainik.suthar 3 years ago +4

    This math is the same as in the Coursera machine learning course.
    Thank you sir for this great content...

  • @shubhamkohli2535
    @shubhamkohli2535 4 years ago

    Really awesome video, so much better than many famous online portals charging huge amounts of money to teach these things.

  • @mayureshgawai5951
    @mayureshgawai5951 3 years ago

    It's hard to find an easy explanation of gradient descent on YouTube. This video is the exception.

  • @ayurdubey4818
    @ayurdubey4818 2 years ago +9

    The video was really great, but I would like to point out that in the derivative you took for the convergence theorem, instead of (dm/dm) it should be the derivative of the cost function with respect to m. Also, a small suggestion: at the end it would have been helpful if you had mentioned what m was, the total number of points or the slope of the best-fit line. Apart from this the video helped me a lot; I hope you add a text note somewhere in this video to help others.

  • @padduchennamsetti6516
    @padduchennamsetti6516 3 months ago

    You just made the whole concept clear with this video. You are a great teacher.

  • @tarunsingh-yj9lz
    @tarunsingh-yj9lz 1 year ago

    Best video on YouTube for understanding the intuition and math (surface level) behind linear regression.
    Thank you for such great content

  • @akrsrivastava
    @akrsrivastava 4 years ago +29

    Hi Krish, thanks for the video. Some queries/clarifications required:
    1. We do not take the gradient of m w.r.t. m. That will always be 1. We take the gradient of J w.r.t. m.
    2. If we have already calculated the cost function J at multiple values of m, then why do we need gradient descent, since we already know the m where J is minimum?
    3. So we start with an m, calculate grad(J) at that point, update m with m' = m - grad(J) * learn_rate, and repeat till we reach some convergence criterion (see the sketch after this thread).
    Please let me know if my understanding is correct.

    • @slowhanduchiha
      @slowhanduchiha 4 years ago

      Yes this is correct

    • @vamsikrishna4107
      @vamsikrishna4107 4 years ago

      I think we have to train the model to reach that minimum-loss point while performing gradient descent in real-life problems.

    • @shreyasbs2861
      @shreyasbs2861 3 years ago

      How do you find the best Y intercept?
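
Regarding step 3 and the intercept question: in practice both the slope m and the intercept c are updated together, each with the partial derivative of J with respect to that parameter. A minimal sketch, again assuming the MSE cost from the video (the data, loop length and names are illustrative):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([3.1, 4.9, 7.2, 8.8])  # roughly y = 2x + 1

m, c = 0.0, 0.0                     # slope and intercept, both learned
alpha, n = 0.05, len(x)             # learning rate, number of points

for _ in range(1000):               # repeat until a convergence criterion
    err = y - (m * x + c)           # residuals under the current parameters
    grad_m = -np.sum(err * x) / n   # dJ/dm for J = (1/2n) * sum(err^2)
    grad_c = -np.sum(err) / n       # dJ/dc
    m -= alpha * grad_m             # simultaneous update of both parameters
    c -= alpha * grad_c

print(m, c)                         # approaches the best-fit slope and intercept
```

On query 2: scanning precomputed J values works when there is a single parameter, but with many coefficients the grid of candidate values grows exponentially, which is why the iterative update is used instead.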

  • @animeshkoley6478
    @animeshkoley6478 3 years ago +2

    Best explanation of linear regression 🙏🙏🙏. Simply wow 🔥🔥

  • @nanditagautam6310
    @nanditagautam6310 3 years ago +1

    This is the best stuff I ever came across on this topic!

  • @manikaransingh3234
    @manikaransingh3234 4 years ago +34

    I don't see a link on the top right corner for the implementation as you said in the end.

  • @V2traveller
    @V2traveller 4 years ago

    Every line you speak is so important for understanding this concept... thank you

  • @priyanshusharma2516
    @priyanshusharma2516 3 years ago +4

    Watched this video 3 times back to back. Now it's embedded in my mind forever. Thanks Krish, great explanation!!

  • @supervickeyy1521
    @supervickeyy1521 4 years ago

    I knew the concept of linear regression but didn't know the logic behind it, i.e. the way the regression line is chosen. Thanks for this!

  • @azizahmad1344
    @azizahmad1344 3 years ago +2

    Such a great explanation of gradient descent and convergence theorem.

  • @python_by_abhishek
    @python_by_abhishek 3 years ago +7

    Before watching this video I was struggling with the concepts, exactly like you were struggling in plotting the gradient descent curve. ☺️ Thanks for explaining this beautifully.

  • @RJ-dz6ie
    @RJ-dz6ie 4 years ago +4

    How can I not say that you are amazing!! I was struggling to understand the importance of gradient descent and you explained it to me in the simplest way possible. Thank you so much sir :)

  • @moulisiramdasu6753
    @moulisiramdasu6753 4 years ago

    Really, thank you Krish.
    You just cleared my doubts on the cost function and gradient descent. First I watched Andrew Ng's class but still had a few doubts; after seeing your video it's crystal clear.
    Thank You...

  • @varungupta2727
    @varungupta2727 5 years ago +43

    Similar to the Andrew Ng course from Coursera, kind of a revision for me 😊😊

    • @Gayathri-jo4ho
      @Gayathri-jo4ho 4 years ago

      Can you please suggest how I should begin in order to learn machine learning?

    • @Gayathri-jo4ho
      @Gayathri-jo4ho 4 years ago

      @@ArpitDhamija Do you have knowledge of machine learning? If so, please guide me; I have seen so many resources but wasn't able to pick one.

    • @shhivram929
      @shhivram929 4 years ago +2

      @@Gayathri-jo4ho This playlist itself is a fantastic place to start, or you can enroll in the course "Machine Learning A-Z" by Kirill Eremenko on Udemy. The course will give you an intuitive understanding of the ML algorithms. Then it's up to you to research and study the math behind each concept. Refs: KDnuggets, Medium, MachineLearningPlus and lots more.

    • @Gayathri-jo4ho
      @Gayathri-jo4ho 4 years ago

      @@shhivram929 thank you

    • @sarithajaligama9548
      @sarithajaligama9548 4 years ago

      Exactly. This is the equivalent of Andrew Ng's description.

  • @SaroashRahil
    @SaroashRahil 9 months ago

    The only video that made gradient descent so simple that even 2nd-grade students would understand.

  • @anuragbhatt6178
    @anuragbhatt6178 4 years ago +1

    The best I've come across on gradient descent and the convergence theorem.

  • @chimadivine7715
    @chimadivine7715 2 months ago

    Now I understand what GD means. Thanks always, Krish

  • @Karthik-s4y5f
    @Karthik-s4y5f 1 year ago

    Finally I understood gradient descent perfectly...

  • @annapurnaparida7655
    @annapurnaparida7655 3 years ago

    So beautifully explained... did not find this kind of clarity anywhere else... keep up the good work...

  • @Tales.of.Irshad
    @Tales.of.Irshad 4 years ago

    I feel so sad for him... because only aspiring data scientists are going to watch this video, so he will have fewer subscribers than what he is giving deserves... Really, hats off to you sir. I have taken 2 paid online classes, but I don't think they are better than you. Never.

  • @skviknesh
    @skviknesh 3 years ago +1

    Great! Fantastic! Fantabulous! Tasting the satisfaction of learning completely - only in your videos!!!!!

  • @kevinsusan3345
    @kevinsusan3345 4 years ago +2

    I had so much difficulty understanding gradient descent, but after this video
    it's perfectly clear.

  • @meetbardoliya6645
    @meetbardoliya6645 2 years ago

    The value of this video is just undefinable! Thanks a lot :)

  • @9902152322
    @9902152322 2 years ago

    God bless you too sir, explained very well. Basics help to grow a high-level understanding.

  • @arunsundar489
    @arunsundar489 4 years ago +6

    Please add the in-depth math intuition of other algorithms like logistic regression, random forest, support vector machines and ANNs. Many thanks for the clear explanation of linear regression.

  • @pradeepmallampalli6510
    @pradeepmallampalli6510 3 years ago

    Thank you so much Krish. Nowhere else could I find such a detailed explanation.
    You made my day!

  • @ahmedbouchou6893
    @ahmedbouchou6893 5 years ago +3

    Hi. Can you please do a video about the architecture of machine learning systems in the real world? How does it really work in real life? For example, how Hadoop (Pig, Hive), Spark, Flask, Cassandra and Tableau are all integrated to create a machine learning architecture. Like an e2e.

  • @w3r161
    @w3r161 8 months ago

    Thank you my friend, you are a great teacher!

  • @vishnuppriya5263
    @vishnuppriya5263 1 year ago

    Really great, sir. Thank you very much for this clear explanation.

  • @Neuraldata
    @Neuraldata 4 years ago +1

    We would also recommend your videos to our students!

  • @PankajMishra-ey3yh
    @PankajMishra-ey3yh 3 years ago +2

    I think in the Convergence theorem part, the derivative should be d(J(m))/d(m), as in a y-x graph, we take derivative of y wrt x. Here our Y is J(m) and X is m.

  • @nurali2525
    @nurali2525 3 years ago

    This guy was born to teach.

  • @ngarwailau2665
    @ngarwailau2665 2 years ago

    Your explanations are the clearest!!!

  • @AjayKumar-id7mb
    @AjayKumar-id7mb 3 years ago

    After watching this 3 times, everything is clear.
    Repetition is the key.

  • @dhruv1324
    @dhruv1324 1 year ago

    Never found a better explanation.

  • @FaizanKhan-fn6ew
    @FaizanKhan-fn6ew 5 years ago +1

    Thank you so much for all your efforts... Knowledge, pace of speech and the ability to make things easy are the nicest skills that you hold...

  • @shailesh1981able
    @shailesh1981able 2 years ago

    Awesome!! Cleared all doubts after seeing this video! Thanks a lot Mr. Krish for creating in-depth content on this subject!

  • @nidhimehta9278
    @nidhimehta9278 3 years ago

    Best video on the theory of linear regression! Thank you so much Krish!

  • @jaisamdariya4307
    @jaisamdariya4307 3 years ago

    I wish I could like this thousand times.

  • @123man123man1
    @123man123man1 11 months ago

    Thank you for sharing this insightful video about linear regression. While I found it informative, I'm uncertain about how it addresses the challenge of avoiding local minima. I'd greatly appreciate it if you could provide some insights on this aspect as well.
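
A note on this, as a general fact rather than something claimed in the video: for plain linear regression with the MSE cost, the cost surface is convex, so there are no local minima to avoid; gradient descent can only settle at the global minimum. In the single-parameter case:

```latex
J(m) = \frac{1}{2N}\sum_{i=1}^{N}\left(y_i - m x_i\right)^2
\qquad\Rightarrow\qquad
\frac{d^2 J}{dm^2} = \frac{1}{N}\sum_{i=1}^{N} x_i^2 \;\ge\; 0
```

A second derivative that is non-negative everywhere means the curve is a single bowl. Local minima become a genuine concern only for non-convex costs, such as those of neural networks.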

  • @pranitaumarji5224
    @pranitaumarji5224 5 years ago +4

    Thank you for this awesome explanation!

  • @cutecreature_san
    @cutecreature_san 3 years ago

    Your videos are clear and easy to understand.

  • @tezzbhandari3725
    @tezzbhandari3725 2 years ago

    The graph of the cost function is not gradient descent. Gradient descent is the use of the derivative of the cost function with respect to m to update m.

  • @rambaldotra2221
    @rambaldotra2221 3 years ago

    Thank you sir, you have explained everything about gradient descent in the easiest possible way!!

  • @kushshri05
    @kushshri05 5 years ago +3

    Please try to upload videos in this series every 2 days...

  • @arhaangarg1482
    @arhaangarg1482 3 years ago

    Couldn't understand when Andrew Ng was teaching, but you, bro!!!

  • @koushikkumar4938
    @koushikkumar4938 3 years ago +6

    Implementation part:
    Multiple linear Regression - ua-cam.com/video/5rvnlZWzox8/v-deo.html
    Simple linear Regression - ua-cam.com/video/E-xp-SjfOSY/v-deo.html

  • @karthiavenger4577
    @karthiavenger4577 4 years ago

    Yaar, you nailed it man. After watching so many videos I had some idea; by finishing your video now I'm completely clear 😍😍😍😍

  • @arrooow9019
    @arrooow9019 3 years ago

    Oh my gosh, this is the most awesome tutorial I have ever seen. God bless you sir 🤩🤩

  • @aayushsuman4592
    @aayushsuman4592 8 months ago

    Thank you so much, Krish!

  • @Cricketnews-ek5fy
    @Cricketnews-ek5fy 3 years ago

    At 22:50 sir said that when it reaches the global minimum the slope value will be 0, and that value of m will be used for the best-fit line. But aren't the slope and m the same thing? Please clear this doubt @Krish Naik sir.

  • @aritra8820
    @aritra8820 2 years ago

    When you are writing the convergence theorem, it should be m - d(J(m))/dm * alpha.

  • @shchiranth6626
    @shchiranth6626 3 years ago

    Great tutorial sir, got things pretty quickly with this video, ty!

  • @sarithajaligama9548
    @sarithajaligama9548 4 years ago

    This is the equivalent of Andrew Ng's description. But I never understood this concept until watching this video.

  • @RanjithKumar-jo7xf
    @RanjithKumar-jo7xf 2 years ago

    Nice explanation, I like this.

  • @avinashgote2770
    @avinashgote2770 1 year ago

    Good explanation, it cleared all my queries.

  • @auroshisray9140
    @auroshisray9140 3 years ago

    Thank you Krish bhaiya!

  • @shaiksuleman3191
    @shaiksuleman3191 3 years ago

    Sir, no words to explain, simply superb.

  • @nivitus9037
    @nivitus9037 5 years ago +2

    Great...

  • @sanjug7317
    @sanjug7317 3 years ago

    Very good and detailed explanation.

  • @SanjeevKumar-dr6qj
    @SanjeevKumar-dr6qj 1 year ago

    Great, sir. Love this video.

  • @mellowftw
    @mellowftw 3 years ago

    Thanks so much sir, you're doing good for the community.

  • @priyankachoubey4570
    @priyankachoubey4570 3 years ago

    As always Krish very well explained!!

  • @akshaychauhan5919
    @akshaychauhan5919 3 years ago

    It should be the derivative of J(m) w.r.t. m, which gives the slope of the J-vs-m curve.

  • @nikifoxy69
    @nikifoxy69 4 years ago

    Loved it. Thanks Krish.

  • @shhivram929
    @shhivram929 4 years ago +3

    Hi Krish, that was an awesome explanation of gradient descent with respect to finding the optimal slope.
    But in linear regression both the slope and the intercept are tweakable parameters; how do we achieve the optimal intercept value in linear regression?

  • @juozapasjurksa1400
    @juozapasjurksa1400 3 years ago

    Thank you! This video was so good!

  • @Dinesh-uh4gw
    @Dinesh-uh4gw 3 years ago

    Excellent Explanation

  • @ShiVa-jy5ly
    @ShiVa-jy5ly 4 years ago

    Thank you sir... I get to learn so much from you.

  • @SachinPatil-om1ms
    @SachinPatil-om1ms 3 years ago

    Finally... I got to know how it works 👍

  • @pradnyavk9673
    @pradnyavk9673 2 years ago

    Very well explained. Thank you.

  • @jamesrobisnon9165
    @jamesrobisnon9165 3 years ago +1

    Dear Krish: at 14:42 you mention that the curve is called gradient descent. I believe this is not true. Gradient descent is not the name of that curve; gradient descent is an optimization algorithm.

  • @shrikantlandage7305
    @shrikantlandage7305 4 years ago

    My god, that was clear as crystal... thanks Krish.

  • @abhisheks.2553
    @abhisheks.2553 4 years ago

    Sir, please elaborate on this topic more: please add the assumptions of linear regression
    and the conditions that need to be satisfied to apply linear regression.
    It's my humble request to you sir. I understand the topics you teach better than any others, so sir, it's my request to you.

  • @sagarparigi1884
    @sagarparigi1884 3 years ago

    This video is really helpful.

  • @TimeSense368
    @TimeSense368 4 years ago

    I generally don't comment, but you are like an angel for students like me who hate math but love programming.

  • @dsc40sundar18
    @dsc40sundar18 1 year ago +1

    Hi sir, great content and I'm a big fan of your work. Let me ask a doubt about the cost function: many books and blogs take the cost function as (1/N) Σ (y - ŷ)², but you used (1/2N) Σ (y - ŷ)², so I was a bit confused by that part. Thank you for the wonderful content, thank you so much sir.
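
On the 1/N versus 1/(2N) question: the extra 1/2 is purely a convenience. It cancels the 2 that the power rule produces when differentiating the square, and scaling a cost function by a positive constant never moves its minimum, so both versions give the same best-fit line:

```latex
J(m) = \frac{1}{2N}\sum_{i=1}^{N}\left(y_i - \hat{y}_i\right)^2
\quad\Rightarrow\quad
\frac{\partial J}{\partial m} = -\frac{1}{N}\sum_{i=1}^{N}\left(y_i - \hat{y}_i\right)x_i,
\qquad \hat{y}_i = m x_i
```

With the 1/N version the gradient is simply twice as large, which only rescales the effective learning rate.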

  • @vishwashah4109
    @vishwashah4109 3 years ago

    Best explanation. Thank you!

  • @rezafarrokhi9871
    @rezafarrokhi9871 3 years ago +5

    Thanks for all the well-prepared videos. I think you meant d(J(m))/d(m) at 17:45, is that correct?

  • @jaspreetsingh5334
    @jaspreetsingh5334 3 years ago

    Thanks Krish, you are helping a lot.

  • @tamellasivasubrahmanyam6683
    @tamellasivasubrahmanyam6683 4 years ago

    You are the ultimate; I got answers to so many questions. The video is good.

  • @karanjaiswal4682
    @karanjaiswal4682 3 years ago

    Also note that the learning rate should not be made very, very small, as gradient descent will then take a very long time to reach the global minimum.
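
To make this trade-off concrete, a tiny sketch (an assumed toy setup, not from the video) that counts how many gradient-descent steps a given learning rate needs before the gradient becomes negligible:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x                                   # perfect line, best slope is m = 2

def steps_to_converge(alpha, tol=1e-6, max_iter=1_000_000):
    """1-D gradient descent on the MSE cost; steps until |dJ/dm| < tol."""
    m, n = 0.0, len(x)
    for i in range(max_iter):
        grad = -np.sum(x * (y - m * x)) / n   # dJ/dm
        if abs(grad) < tol:
            return i
        m -= alpha * grad
    return max_iter

for alpha in (0.2, 0.01, 0.0001):
    print(alpha, steps_to_converge(alpha))    # smaller alpha => far more steps
```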

  • @debrupdey7948
    @debrupdey7948 1 year ago

    Great video sir, so lucid.

  • @AVyt28
    @AVyt28 4 years ago

    Great video, I understood the concept.

  • @yashodhansatellite1
    @yashodhansatellite1 5 years ago +2

    Hats off

  • @AnjaliSingh-oe1mo
    @AnjaliSingh-oe1mo 2 years ago

    Please always link the previous videos, to help go through the topics in sequence.

  • @mvcutube
    @mvcutube 3 years ago

    Nice tutorial. Thank you.

  • @wellwhatdoyakno6251
    @wellwhatdoyakno6251 2 years ago

    Lovely! Love it.

  • @akshaygupta6321
    @akshaygupta6321 5 years ago +22

    In a single sentence: "You're the best."