Matrix Form Simple Linear Regression

  • Published 29 Jun 2024
  • In this video I cover the matrix formulation of the simple linear regression model. I provide tips and tricks to simplify and emphasize various properties of the matrix formulation. Of particular note are the matrices X'X and X'Y, as well as (X'X)^-1.
    #SLR
    #Matrix
    #Econometrics
    0:00 Introduction and Design Matrix
    02:00 Beta Hat Formula
    02:42 The matrix X'X
    06:24 Inverse of X'X
    09:28 The matrix X'Y
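
    As a companion to the chapters above, a minimal NumPy sketch of the beta-hat formula the video covers; the data and variable names here are illustrative, not taken from the video:

    import numpy as np

    # toy data (made up for illustration)
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

    # design matrix: a column of ones (intercept) next to x
    X = np.column_stack([np.ones_like(x), x])

    # for SLR, X'X = [[n, sum(x)], [sum(x), sum(x^2)]] and X'y = [sum(y), sum(x*y)]
    # beta_hat = (X'X)^-1 X'y
    beta_hat = np.linalg.inv(X.T @ X) @ X.T @ y
    print(beta_hat)  # [intercept, slope]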

COMMENTS • 27

  • @waldoungerer9851
    @waldoungerer9851 4 months ago +3

    Yes, my Boer brother! You give us a good name. Well educated!

  • @imdadood5705
    @imdadood5705 3 years ago +5

    I am a data science student. I wanted to see how this was derived. This summed it up perfectly! Thanks!

  • @pratik.patil87
    @pratik.patil87 9 months ago +1

    I was looking for this breakdown for a long time. Thanks a lot, mate!

  • @viveksavita06
    @viveksavita06 2 years ago +3

    I have seen many videos, but this one explains things in a bit more detail, like the formulas used in linear regression. Great work!

  • @scorpio19771111
    @scorpio19771111 2 years ago +3

    Excellent explanation and demo! 👏🏻👏🏻👏🏻 Thank you so much.
    This video deserves to be much, much higher in YouTube search results!

  • @changeme454
    @changeme454 7 months ago

    Wow! I have been searching for this lesson! I just found it, and I understand your easy way of explaining. 🙏 I will keep following your channel.

  • @meghajessica
    @meghajessica 1 year ago

    One of the best educational videos!!!!! GOD BLESS YOU & YOUR BEAUTIFUL FAMILY, BROTHER!!!!!!!!! PLS KEEP UP THE GOOD WORK!

  • @robharwood3538
    @robharwood3538 3 years ago +5

    Nice summarization and explanation of what the matrix form of the simple linear regression model looks like, what it is made from, and how it can be constructed. Thanks! You have a good presentation/teaching style, IMO. I hope your channel grows to help more people. Thanks again!

  • @chariezwane3981
    @chariezwane3981 3 years ago +3

    Thank you so much for making this video. You just saved my Econometrics behind today.

    • @BoerCommander
      @BoerCommander  3 years ago +1

      Glad to help, Charie! It brings me joy to know that I am helping.

  • @myeshafarzanatahi9154
    @myeshafarzanatahi9154 6 months ago +1

    Thank you so much

  • @effortlessjapanese123
    @effortlessjapanese123 1 year ago

    Hey! You were in my recommendations again!

  • @mannur2248
    @mannur2248 3 years ago +2

    Keep up the neat work. Really good work.

  • @liamhoward2208
    @liamhoward2208 2 years ago +1

    Great and elegant explanation. Thank you!

  • @SNawaz-bk7py
    @SNawaz-bk7py 1 year ago

    Thanks, man. You are a lifesaver!

  • @AakashSingh-qu8hk
    @AakashSingh-qu8hk 1 year ago

    This helped a lot. Thank you so much!

  • @rachadlakis1
    @rachadlakis1 2 years ago +1

    Thanks!!!!

  • @gcumauma3319
    @gcumauma3319 2 years ago +2

    Thanks Boer

    • @BoerCommander
      @BoerCommander  2 years ago

      It is my pleasure to be of service, gcuma.

  • @effortlessjapanese123
    @effortlessjapanese123 1 year ago

    This is amazing! Where are you from?

  • @lofibeatz990
    @lofibeatz990 1 year ago

    I did not understand 8:52. How is the sum of xi squared minus n times xbar squared equal to the sum of (xi - xbar) squared? It should be xi^2 - xbar only. Let me explain with an example: suppose xi is -2 and the mean is 1. Then xi^2 - xbar would give you 3, while (xi - xbar)^2 would give you 9, that is (-2 - 1)^2. On expanding, (xi - xbar)^2 would be xi^2 - 2*xi*xbar + xbar^2, not xi^2 - xbar.
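
    For reference, the step at 8:52 relies on a standard identity, and writing out the full expansion resolves the confusion above:

    $$\sum_{i=1}^{n}(x_i-\bar{x})^2=\sum_{i=1}^{n}x_i^2-2\bar{x}\sum_{i=1}^{n}x_i+n\bar{x}^2=\sum_{i=1}^{n}x_i^2-2n\bar{x}^2+n\bar{x}^2=\sum_{i=1}^{n}x_i^2-n\bar{x}^2$$

    using $\sum_{i=1}^{n}x_i=n\bar{x}$. The expansion of $(x_i-\bar{x})^2$ in the comment is correct; the cross term $-2x_i\bar{x}$ only simplifies to $-2n\bar{x}^2$ after summing over all $i$, so the identity holds for the sums as wholes, not term by term.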

  • @janslesp
    @janslesp 1 year ago +1

    Something strange:
    the equation
    BETA_HAT = X.I @ Y
    in Python (NumPy) gives us the same solution as
    BETA_HAT = (X.T @ X).I @ X.T @ Y.
    Try it!

    • @BoerCommander
      @BoerCommander  1 year ago +1

      Hi janslesp, a good foray into the calculations, and good work on applying the thinking in Python. Have a go with a design matrix X that is not square and you will see that the equation does not work then.
      Your equation works when the matrix X is an invertible square matrix, but it will not work when X is not square.
      Try the example below (note the design matrix must be tall, with more rows than columns, so that x.T @ x is invertible).
      import numpy as np
      # 4 observations, 3 columns: x is not square, so it has no inverse
      x = np.array([[1, 2, 3],
                    [1, 5, 7],
                    [1, 4, 3],
                    [1, 3, 6]])
      y = np.array([2, 6, 7, 3])
      # beta_hat runs fine: x.T @ x is a square, invertible 3x3 matrix
      beta_hat = np.linalg.inv(x.T @ x) @ x.T @ y
      # this line raises LinAlgError because x is not square
      beta_hat_2 = np.linalg.inv(x) @ y

    • @janslesp
      @janslesp 1 year ago

      @@BoerCommander Hello Boer. I'm trying to understand the rules of this wonderful universe of linear algebra, Python and machine learning. You have no idea the pleasure of exchanging information with talents from other countries. I'm not a data science professional and I'm trying to overcome my difficulties. Thank you. Today I went back to playing a little on the computer. I used the code below where, as a rule, my X is not invertible because X is not square. But I don't understand: NumPy still carries out the ".I" method in this case!! Does NumPy have some algorithm to compute an X.I matrix that simulates an inversion? Note that in the final result we get the same answer: tetas_strange_equation == tetas_Normal_Equation.
      The code is:
      import numpy as np
      X = np.matrix([
          [1, 35, 70, 0],
          [1, 15, 50, 0],
          [1, 42, 80, 0],
          [1, 25, 70, 1],
          [1, 28, 90, 1],
          [1, 12, 65, 0],
          [1, 34, 72, 0]])
      y = np.matrix([90, 35, 98, 70, 62, 32, 68]).T
      # Strange equation
      tetas_strange_equation = X.I @ y
      print(tetas_strange_equation)
      # Normal equation
      tetas_Normal_Equation = (X.T @ X).I @ X.T @ y
      print(tetas_Normal_Equation)
      OUTPUT:
      [[21.77437371]
      [ 2.47318883]
      [-0.37736477]
      [ 8.8753038 ]]
      [[21.77437371]
      [ 2.47318883]
      [-0.37736477]
      [ 8.8753038 ]]
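
      A likely explanation for the "strange equation" above, sketched here rather than confirmed in the thread: NumPy's matrix class computes the .I property with np.linalg.pinv (the Moore-Penrose pseudoinverse) when the matrix is not square, and for a design matrix X with full column rank, pinv(X) = (X'X)^-1 X', so both routes return the same least-squares solution. A check on the same data:
      import numpy as np
      X = np.array([[1, 35, 70, 0],
                    [1, 15, 50, 0],
                    [1, 42, 80, 0],
                    [1, 25, 70, 1],
                    [1, 28, 90, 1],
                    [1, 12, 65, 0],
                    [1, 34, 72, 0]], dtype=float)
      y = np.array([90, 35, 98, 70, 62, 32, 68], dtype=float)
      # pseudoinverse route (what np.matrix's .I falls back to for non-square input)
      beta_pinv = np.linalg.pinv(X) @ y
      # normal-equation route
      beta_normal = np.linalg.inv(X.T @ X) @ X.T @ y
      print(np.allclose(beta_pinv, beta_normal))  # True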