Multiple Linear Regression | Part 2 | Mathematical Formulation From Scratch

Share
Embed
  • Published 21 Jul 2024
  • Dive into the mathematical foundation of Multiple Linear Regression in this second part of our series. We'll guide you through the formulation from scratch, making it easy to grasp the concepts behind this powerful regression technique. Build a solid foundation for your regression modeling skills.
    ============================
    Do you want to learn from me?
    Check my affordable mentorship program at : learnwith.campusx.in/s/store
    ============================
    📱 Grow with us:
    CampusX's LinkedIn: / campusx-official
    CampusX on Instagram for daily tips: / campusx.official
    My LinkedIn: / nitish-singh-03412789
    Discord: / discord
    E-mail us at support@campusx.in
    ⌚Time Stamps⌚
    00:00 - Intro
    00:30 - Types of Linear Regression
    03:31 - Mathematical Formulation
    26:45 - Detouring for some time
    43:20 - Why Gradient Descent?
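    For readers who want to replay the result derived in the video, here is a minimal NumPy sketch of the closed-form (OLS) solution β = (XᵀX)⁻¹Xᵀy; the toy data and variable names are illustrative assumptions, not taken from the video:

        import numpy as np

        # Toy data: 4 samples, 2 features (illustrative values only)
        X = np.array([[1.0, 2.0], [2.0, 0.5], [3.0, 1.5], [4.0, 3.0]])
        y = np.array([3.0, 2.5, 5.0, 7.5])

        # Prepend a column of ones so beta_0 acts as the intercept
        X1 = np.hstack([np.ones((X.shape[0], 1)), X])

        # Normal equation: beta = (X^T X)^{-1} X^T y
        beta = np.linalg.inv(X1.T @ X1) @ X1.T @ y
        print(beta)       # [beta_0, beta_1, beta_2]
        print(X1 @ beta)  # fitted values y_hat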

COMMENTS • 78

  • @saptarshisanyal6738
    @saptarshisanyal6738 2 years ago +73

    I don't think the mathematical explanation given in this video exists anywhere else on YouTube. I found it in the book "Mathematics for Machine Learning" by Marc Peter Deisenroth. This is simply brilliant. Although the matrix differentiation part is absent, this is still extraordinary stuff.

    • @sidindian1982
      @sidindian1982 1 year ago +1

      Yes, well said... brilliantly explained :-)😍😍😍😇😇
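    Since several comments below ask about exactly this missing step, these are the two standard matrix-calculus identities the derivation relies on (denominator layout; a sketch of the math, not a transcript of the video):

        \frac{\partial}{\partial \beta}\bigl(a^{\top}\beta\bigr) = a,
        \qquad
        \frac{\partial}{\partial \beta}\bigl(\beta^{\top}A\beta\bigr) = (A + A^{\top})\beta
        \overset{A = A^{\top}}{=} 2A\beta

    Applied to E(\beta) = (y - X\beta)^{\top}(y - X\beta), using the fact that X^{\top}X is symmetric:

        \frac{\partial E}{\partial \beta} = -2X^{\top}y + 2X^{\top}X\beta = 0
        \;\Longrightarrow\; X^{\top}X\beta = X^{\top}y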

  • @kashifhabib290
    @kashifhabib290 11 months ago +11

    Sir, I can tell you that literally no one's videos compare to your teaching. I have seen videos from Coding Ninjas and other paid lectures, but nobody has gone into this depth; I can literally see machine learning unfold in front of my imagination. Thank you so much, sir 🙏

  • @Ganeshkakade454
    @Ganeshkakade454 1 year ago +2

    Hi sir, you are truly a gem of a person; sharing such great knowledge for free is a blessing for the new generation. God bless you, sir. You are our Guru from today.

  • @karthikmanthitta6362
    @karthikmanthitta6362 2 years ago +3

    Such a wonderful explanation, sir; really, thanks a lot ♥️♥️♥️ You were able to explain something that hundreds of videos couldn't explain to me.

  • @sudhanshumishra3677
    @sudhanshumishra3677 6 months ago

    Literally the greatest explanation I have ever seen on YouTube. Hats off, sir.

  • @ranirathore4176
    @ranirathore4176 1 year ago +4

    Most underrated channel on YouTube 🥲

  • @core4032
    @core4032 2 years ago +1

    A step-by-step series in great detail. Superb.

  • @TheAparajit
    @TheAparajit 10 months ago

    This was just brilliant. You are an incredible teacher. Thank you.

  • @akash.deblanq
    @akash.deblanq 2 years ago +2

    I jumped when I understood the eᵀe concept. Thank you so much!!!

  • @nikhildonthula1395
    @nikhildonthula1395 2 months ago

    Man, top-class stuff; I had been trying to find this mathematical derivation on YouTube for many days.

  • @gauravpundir97
    @gauravpundir97 1 year ago

    Thank you for making such fantastic videos!

  • @lvilligosalvs2708
    @lvilligosalvs2708 10 months ago

    You are a Gem, Sir. Keep it up. Thank you!

  • @user-qo1qe9wq4g
    @user-qo1qe9wq4g 4 months ago

    Thank you very much for the explanation, sir. I searched all of YouTube for this mathematical explanation!

  • @anshulsharma7080
    @anshulsharma7080 1 year ago

    How did you manage to study all of this beforehand, bhaiya...
    It's beyond super, wow...

  • @todaystrending1992
    @todaystrending1992 3 years ago +11

    Eagerly waiting for the next video 😁😉. Thank you so much for this, sir 🙏❤️

  • @krithwal1997
    @krithwal1997 2 years ago +2

    Makes complex things so easy. That is CampusX.

  • @nirjalkumarmahato330
    @nirjalkumarmahato330 1 year ago +1

    Boss, you are great ❤️ I had been struggling for a straight month 🙃

  • @sameergupta3067
    @sameergupta3067 1 year ago

    This ML series is getting me interested in the maths behind machine learning algorithms.

  • @usmanriaz6241
    @usmanriaz6241 7 months ago

    You are an amazing teacher. I have never seen such good explanations on YouTube. Love from Pakistan.

  • @sovansahoo27
    @sovansahoo27 10 months ago +1

    Superb content; easily my semester saviour at IIT Kanpur... Thanks, sir.

  • @manujkumarjoshi9342
    @manujkumarjoshi9342 10 months ago

    Beautiful. Luckily I knew this already, but awesome teaching skills.

  • @anant1803
    @anant1803 1 year ago

    Really amazing video, sir.

  • @ADESHKUMAR-yz2el
    @ADESHKUMAR-yz2el 11 months ago

    Love you, sir, with all respect.

  • @ArpanChandra-vv8cg
    @ArpanChandra-vv8cg 18 days ago

    GOD LEVEL TEACHING SKILL 💡💡

  • @balrajprajesh6473
    @balrajprajesh6473 2 years ago

    Thank you for this, sir!

  • @shubhankarsharma2221
    @shubhankarsharma2221 1 year ago +1

    yᵀ(Xβ) = (Xβ)ᵀy can be proved by taking Xβ = ŷ and writing both sides with their respective matrices:
    y = [y1 y2 ... yn]
    ŷ = [ŷ1 ŷ2 ... ŷn]
    Substituting into the equation shows both sides are the same.
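    In LaTeX form, the same fact follows in one line: y^{\top}X\beta is a 1×1 matrix (a scalar), and a scalar equals its own transpose, so

        y^{\top}(X\beta) = \bigl(y^{\top}X\beta\bigr)^{\top} = (X\beta)^{\top}y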

  • @ParthivShah
    @ParthivShah 4 months ago +1

    Thank You Sir.

  • @beethoven6185
    @beethoven6185 9 months ago

    Sir, you are the best teacher.

  • @kiran__jangra
    @kiran__jangra 1 month ago

    No doubt, sir, your teaching is fantastic; I am following your videos.
    Sir, I have one doubt: in the step where you go from
    XᵀXβᵀ = yᵀX
    to
    βᵀ = yᵀX(XᵀX)⁻¹
    shouldn't it be βᵀ = (XᵀX)⁻¹yᵀX? Basically, the inverse term must be pre-multiplied, because we pre-multiply by the inverse to cancel the term on the left-hand side, and matrix multiplication is not commutative, so we can't move it to the other side.
    Please clear my doubt.
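    A sketch of the algebra that resolves this doubt (not a transcript of the video): written consistently in row-vector form the equation is βᵀ(XᵀX) = yᵀX, so the inverse is post-multiplied on both sides, which is perfectly legal; transposing then recovers the usual column-vector result, because (XᵀX)⁻¹ is symmetric:

        \beta^{\top}(X^{\top}X) = y^{\top}X
        \;\Longrightarrow\;
        \beta^{\top} = y^{\top}X\,(X^{\top}X)^{-1}
        \;\Longrightarrow\;
        \beta = \bigl[(X^{\top}X)^{-1}\bigr]^{\top}X^{\top}y = (X^{\top}X)^{-1}X^{\top}y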

  • @sachin2725
    @sachin2725 1 year ago +1

    Hello sir, XGBoost is not included in the playlist; could you please make a video on XGBoost?

  • @user-oc6lw2rd1q
    @user-oc6lw2rd1q 9 months ago +2

    34:18 Shouldn't its differentiation be equal to Xᵀy (the transpose of X times y) instead of yᵀX (the transpose of y times X)?
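    Both answers are in fact transposes of one another, so the difference is a layout convention rather than an error:

        \frac{\partial}{\partial \beta}\bigl(y^{\top}X\beta\bigr) =
        X^{\top}y \;\;\text{(denominator layout, a column vector)}
        \quad\text{or}\quad
        y^{\top}X \;\;\text{(numerator layout, a row vector)}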

  • @ashishmhatre2846
    @ashishmhatre2846 2 years ago +3

    My master's professor can't explain things better than you! Thank you for making such awesome videos!

  • @rubalsingh4018
    @rubalsingh4018 7 months ago

    Thank you so much.

  • @priyadarshichatterjee7933
    @priyadarshichatterjee7933 6 months ago

    Sir... shouldn't we, after differentiation and reduction, be left with yᵀ = βᵀXᵀ, which transposed again gives y = Xβ, and therefore β = X⁻¹y?
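    A note on the algebra (an editor's sketch, not from the video): X is n×(p+1) and not square in general, so X⁻¹ does not exist; that is exactly why both sides are first multiplied by Xᵀ, producing the square matrix XᵀX, which is invertible whenever X has full column rank:

        X^{\top}X\beta = X^{\top}y \;\Longrightarrow\; \beta = (X^{\top}X)^{-1}X^{\top}y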

  • @maths_impact
    @maths_impact 1 year ago +1

    Wonderful, sir.

  • @ashishraut7526
    @ashishraut7526 1 year ago

    Boss, that was awesome.

  • @ronylpatil
    @ronylpatil 1 year ago

    Please make a detailed video on matrix differentiation.

  • @ritujawale10
    @ritujawale10 2 years ago

    Thank you sir 👍

  • @mr.deep.
    @mr.deep. 2 years ago +2

    Best explanation.

  • @rupeshramancreation5554
    @rupeshramancreation5554 6 months ago

    You said something really big today.

  • @messi0510
    @messi0510 1 year ago

    0:30–2:45: intuition behind MLR
    43:20–47:45: why gradient descent is more effective compared to OLS

  • @abuboimofo6605
    @abuboimofo6605 4 months ago

    A doubt, sir ji 🙏
    36:00
    When you differentiate the matrix expression Y = AᵀXA, your answer is 2XAᵀ, while the answer should be dY/dA = 2XA, not the transpose. Correct me if I am wrong, please.

  • @kidscreator2268
    @kidscreator2268 3 years ago +1

    Number one, sir!

  • @Sara-fp1zw
    @Sara-fp1zw 2 years ago +3

    36:00 Bhaiya, kindly upload a video on matrix differentiation.

  • @ayushbaranwal1094
    @ayushbaranwal1094 5 months ago

    Sir, actually I had a doubt: d/dA of AᵀXA is 2XA, but you have written 2XA transpose. Can you explain?

  • @abhishekkukreja6735
    @abhishekkukreja6735 2 years ago +1

    Hi Nitish sir, while deriving the error function we used differentiation to get the expression, but at the very beginning you said we don't use calculus for OLS, only for gradient descent; yet we used it in both. I got the concept, but how do closed-form and non-closed-form solutions differ if we're doing differentiation in both?
    Thanks for these videos.

    • @spynom3070
      @spynom3070 1 year ago +1

      He used calculus to show how the OLS equation is derived from scratch. In OLS the machine plugs the data into that final closed-form equation to get the best-fit line, but in gradient descent it uses calculus iteratively to reach the minimum.

    • @abhishekkukreja6735
      @abhishekkukreja6735 1 year ago

      @@spynom3070 Thanks for this.
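    For comparison, a minimal gradient-descent sketch for the same squared-error loss (the learning rate, iteration count, and function name are illustrative assumptions, not the video's code):

        import numpy as np

        def fit_gd(X, y, lr=0.01, n_iter=5000):
            """Minimize ||y - X b||^2 iteratively, with no matrix inversion."""
            b = np.zeros(X.shape[1])
            for _ in range(n_iter):
                grad = -2 * X.T @ (y - X @ b) / len(y)  # gradient of the mean squared error
                b -= lr * grad                          # step against the gradient
            return b

    Unlike the closed-form OLS solution, this never inverts XᵀX, which is why it scales better when the number of features is large.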

  • @rahulpathak8415
    @rahulpathak8415 1 month ago

    Sir, doesn't the loss function start with 1/(2m)?

  • @animatrix1631
    @animatrix1631 1 year ago

    Please upload the matrix differentiation video, sir @campusx

  • @BAMEADManiyar
    @BAMEADManiyar 8 months ago

    37:23 Sir, I have a doubt: if I multiply the LHS by (XᵀX)⁻¹, I will be left with βᵀ on the LHS, so I need to multiply the RHS by the same term too, right? But if I do so I get a different answer. Why is this, sir?

  • @AbdurRahman-lv9ec
    @AbdurRahman-lv9ec 1 year ago

    Awesome

  • @harshitmishra394
    @harshitmishra394 5 months ago

    @campusx Sir, it was ŷ = β₀ + β₁x₁ + β₂x₂ + ... + βₙxₙ, so how did it turn into different elements in the matrix?

  • @abdulmanan17529
    @abdulmanan17529 1 year ago

    A math guru as well as a machine learning guru.

  • @Jc12x06
    @Jc12x06 1 year ago +1

    @CampusX Can someone explain what happens if the inverse (XᵀX)⁻¹ in the last step doesn't exist, i.e. if the determinant is 0?

    • @yashwanthyash1382
      @yashwanthyash1382 1 year ago

      Very nice question, but I don't know the answer either.

    • @rounaksarkar3084
      @rounaksarkar3084 1 year ago

      The inverse of XᵀX fails to exist exactly when X does not have full column rank, i.e. when one feature column is a linear combination of the others (perfect multicollinearity), or when there are fewer samples than columns. In the single-feature case the argument is simple: X is n×1, Xᵀ is 1×n, and XᵀX is the 1×1 matrix Σxᵢ², a sum of squares that is positive, and hence invertible, whenever X is not the zero vector (and a null feature matrix is of no interest anyway). With multiple features, though, invertibility is not guaranteed, and in practice libraries fall back on the pseudoinverse or add regularization.
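    A minimal NumPy sketch of the singular case and the standard fallback (the duplicated column is an illustrative assumption; np.linalg.pinv is NumPy's Moore–Penrose pseudoinverse):

        import numpy as np

        # Second column is 2x the first, so X^T X is singular (det = 0)
        X = np.array([[1.0, 2.0], [2.0, 4.0], [3.0, 6.0]])
        y = np.array([1.0, 2.0, 3.0])

        # np.linalg.inv(X.T @ X) would raise LinAlgError here;
        # the pseudoinverse returns the minimum-norm least-squares solution
        beta = np.linalg.pinv(X.T @ X) @ X.T @ y
        print(beta)
        # Equivalent: np.linalg.lstsq(X, y, rcond=None)[0]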

  • @roktimjojo5573
    @roktimjojo5573 2 years ago

    When is the matrix differentiation video coming?

  • @mr.deep.
    @mr.deep. 2 years ago +5

    campusX > MIT

  • @Khan-ho5yd
    @Khan-ho5yd 8 months ago

    Hello sir, do you have notes?

  • @MAyyan-gb6hi
    @MAyyan-gb6hi 10 months ago

    ily!

  • @abdulmanan17529
    @abdulmanan17529 1 year ago

    🎉🎉🎉❤

  • @moizk8223
    @moizk8223 2 years ago +2

    The video is awesome. I have a doubt, though: at 37:35 you pre-multiply the inverse on the LHS but post-multiply it on the RHS. Isn't that wrong? Correct me if I am missing something.

    • @readbhagwatgeeta3810
      @readbhagwatgeeta3810 1 year ago +2

      The final value of beta comes out the same either way: transposing βᵀ = yᵀX(XᵀX)⁻¹ gives β = (XᵀX)⁻¹Xᵀy, because (XᵀX)⁻¹ is symmetric.

    • @moizk8223
      @moizk8223 1 year ago

      @@readbhagwatgeeta3810 Thanks!

  • @souviknaskar631
    @souviknaskar631 9 months ago

    (Aᵀ)⁻¹ = (A⁻¹)ᵀ
    Using this formula you can prove the last part:
    [(XᵀX)⁻¹]ᵀ = (XᵀX)⁻¹

  • @vanshshah6418
    @vanshshah6418 1 year ago

    (best best best best ......best)^best

  • @Star-xk5jp
    @Star-xk5jp 6 months ago

    Day 4
    Date: 12/1/24

  • @HirokiKudoGT
    @HirokiKudoGT 1 year ago +1

    Sir, at 36:26 I think you used d/dA(AᵀXA) = 2XAᵀ, but it's 2XA... so I'm a little confused about the final result 🫤. Only this one thing; everything else is great, sir... love your videos.

    • @rounaksarkar3084
      @rounaksarkar3084 1 year ago

      Actually, the derivative is (X + Xᵀ)A, which becomes 2XA when X = Xᵀ; the transposed form 2XAᵀ only appears under the other (numerator) layout convention.
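    A quick numerical check of that identity (a sketch assuming NumPy; the shapes and seed are arbitrary):

        import numpy as np

        rng = np.random.default_rng(1)
        X = rng.normal(size=(4, 4))   # deliberately not symmetric
        a = rng.normal(size=4)

        f = lambda a: a @ X @ a       # f(a) = a^T X a
        eps = 1e-6
        # Central finite differences, one coordinate at a time
        grad = np.array([(f(a + eps * np.eye(4)[i]) - f(a - eps * np.eye(4)[i])) / (2 * eps)
                         for i in range(4)])
        print(np.allclose(grad, (X + X.T) @ a, atol=1e-6))  # True: the derivative is (X + X^T) a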

  • @bruhat15
    @bruhat15 7 days ago

    13:09 The matrix multiplication looks wrong: when X is multiplied by β it should give a single-column matrix, while the matrix described earlier has m columns.

    • @bruhat15
      @bruhat15 5 days ago

      I guess ŷ (the matrix before decomposing) should equal the sum of those columns collapsed into one column, so ŷ will be of n×1 order; then it would be correct.
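    For reference, the shapes in the matrix form of the prediction (with n samples and p features plus an intercept):

        \hat{y}_{\,n\times 1} = X_{\,n\times(p+1)}\,\beta_{\,(p+1)\times 1},
        \qquad
        \hat{y}_i = \beta_0 + \beta_1 x_{i1} + \dots + \beta_p x_{ip}

    so Xβ is indeed a single column, as the comment above says.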