Linear Regression From Scratch in Python (Mathematical)

  • Published 7 Jan 2025

COMMENTS • 161

  • @MrTheBroMoe • 3 years ago • +246

    Please do more mathematical implementations like this! This really helped me understand the math behind these algorithms.

  • @Ossigen8 • 9 months ago • +5

    I don't often comment on videos, but this one is really good. You explained the math concepts really well by doing the examples on the graph, etc. Good stuff on this channel!

  • @cambridgebreaths3581 • 3 years ago • +55

    I do like all your videos that contain the keywords "mathematical theory" and "from scratch" :). Please do more similar videos. Thank you kindly

    • @samas69420 • 3 years ago • +3

      same, stuff from scratch is the best stuff for learning

    • @fater8711 • 3 years ago • +3

      @@samas69420 yeah, and it's really fun
      I am always so proud when I am able to code something from scratch

  • @nasr-rsln • 1 year ago • +3

    This video is amazing, good job. I'm now actually thinking about revisiting the math classes I couldn't take before in order to get better at these machine learning algorithms.

  • @vardhanr8177 • 3 years ago • +10

    AMAZING video!!! I love these videos which teach theory too! Thank youuu!!!

  • @santrix1595 • 1 year ago • +3

    Dude, slammer video!! Love the fact that you made the math interesting and super easy to understand.

  • @kapilsonyt • 3 years ago • +3

    I recently stumbled upon this while looking for a similar approach via Python, and I've now subscribed.
    Thank you very much for imparting knowledge.

  • @PML37 • 7 months ago • +1

    That was just wow! The way you explained it was amazing.
    Thank you!

  • @SabarishanM • 7 months ago

    I was searching everywhere and finally found what I needed. Your video really clears up the fundamentals of creating a linear regression model. Thank you.

  • @zukofire6424 • 3 years ago • +8

    Thanks very much for this!! I am a data analysis student and close to giving up but still hanging on!

    • @zukofire6424 • 4 months ago • +1

      @kelvinthomas-pr8sd Thank you! I did then, but it still applies now. I work now but have 2 classes I haven't passed yet. Never give up!

  • @TheVerbalAxiom • 3 years ago • +1

    I am subscribing, definitely. You have taught, in the most straightforward and explanatory way, a concept that other videos made a bit too complicated. Thank you!

  • @ileshdhall • 4 months ago

    The way you explained and implemented it was really efficient and easy. It was really helpful, as I am just getting started on my ML journey, and after implementing this I feel a step ahead. Thank you!!

  • @curtezyt1984 • 1 year ago • +1

    At 10:00 we need to take the derivative of each weight (M) like this: first for M1, then for M2, then M3, ... up to Mn.
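
What the commenter describes (one partial derivative per weight M1...Mn) can be sketched with a vectorized update. The data, weights, and learning rate below are made-up assumptions for illustration, not values from the video:

```python
import numpy as np

# Hypothetical data: 100 samples, 3 features, known true weights
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + 3.0

m = np.zeros(3)   # one weight M_j per feature
b = 0.0
L = 0.01          # learning rate

for _ in range(5000):
    err = y - (X @ m + b)
    m_grad = (-2 / len(X)) * (X.T @ err)  # vector of dE/dM_j, j = 1..n
    b_grad = (-2 / len(X)) * err.sum()
    m -= L * m_grad
    b -= L * b_grad
```

Each entry of `m_grad` is exactly the partial derivative with respect to one M_j, so the matrix form computes "first for M1, then M2, ... to Mn" in one step.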

  • @AkhilendraOjhaBME • 2 years ago • +2

    Greatly explained in simple words; looking forward to learning more methods from you.

  • @Ryguy12543 • 1 year ago • +1

    Thank you so much. I'm working through a problem set for a Neural Analysis class and this really helps. Great video.

  • @kenadams2261 • 2 years ago • +1

    Amazing video, looking forward to more implementations from scratch!

  • @graphicsguy1965 • 2 years ago • +3

    Thanks man! Awesome video that dumbed it down enough for me. Could you do linear regression for fitting a sphere to points, given the sphere radius (or not), as used in terrestrial laser scanning? If you've never messed with it: you place spheres around what you are scanning, and then the software uses the spheres to align the scans taken at different locations.

  • @prathamshah9167 • 5 months ago

    A really insightful video to learn this concept!! I found it very helpful!!
    Just one suggestion, @NeuralNine: please include the files that you display in the video in the video's description.
    Everything else is amazing and praiseworthy!

  • @salambostaji9315 • 2 months ago

    Easy to follow and goes straight to the point, thank you!

  • @unknowncorsairtim • 1 year ago • +4

    Awesome tutorial! Could you please explain why you use gradient descent to minimize squared error instead of using the formula: divide the standard deviation of y values by the standard deviation of x values and then multiply this by the correlation between x and y?
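
For the single-feature case, the closed form the commenter mentions (slope = correlation × std(y)/std(x)) does give the same line as a least-squares fit; gradient descent is presumably used because it generalizes to many features and other loss functions. A quick check on made-up data, assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=200)
y = 2.0 * x + 1.0 + rng.normal(scale=0.1, size=200)

# Closed form: slope = corr(x, y) * std(y) / std(x), intercept from the means
r = np.corrcoef(x, y)[0, 1]
m_closed = r * y.std() / x.std()
b_closed = y.mean() - m_closed * x.mean()

# Ordinary least squares for comparison
m_ls, b_ls = np.polyfit(x, y, 1)
```

Both routes give the same coefficients to floating-point precision; gradient descent approaches that same answer iteratively.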

  • @LyamanOsmanlı • 5 months ago • +1

    Hi, you really have the best videos on YouTube. Your speech is also very clear to me, although my native language is not English. Thank you very much, it helps me a lot. I am 17 years old and want to improve, and you are helping me on my way.

  • @dragnar4743 • 6 months ago

    Damn!! I like your video so much, especially the implementation part. Most people would just use the sklearn library directly and call it a day, but you really showed how the code actually runs behind the scenes. 👍👍💛💛

  • @atrumluminarium • 3 years ago • +8

    It would have been nice to compare it to the analytical solution of least squares regression being (Xᵀ•X)⁻¹•(Xᵀ•Y) just to show they're identical

    • @andreagalloni92 • 1 year ago

      Quicker and theoretically more correct for solving this problem!

    • @JohnMacPherson-hr4yz • 1 year ago • +1

      @@andreagalloni92 could you make a video and share it with everyone please? (theory and python code)

    • @andreagalloni92 • 1 year ago • +1

      @@JohnMacPherson-hr4yz mmm... Maybe I can :)
      I'll try to do it
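
A minimal sketch of the analytical (normal-equation) solution discussed in this thread, on made-up data; the column of ones absorbs the intercept:

```python
import numpy as np

# Hypothetical data: 50 samples, 2 features, exact linear relationship
rng = np.random.default_rng(2)
x = rng.normal(size=(50, 2))
y = x @ np.array([2.0, -1.0]) + 0.5

# Design matrix with a bias column of ones
X = np.hstack([np.ones((len(x), 1)), x])

# Normal equation: w = (X^T X)^(-1) X^T y
w = np.linalg.inv(X.T @ X) @ (X.T @ y)
# w[0] is the intercept, w[1:] are the slopes
```

In practice `np.linalg.solve(X.T @ X, X.T @ y)` or `np.linalg.lstsq(X, y)` is preferred over forming the explicit inverse, but the result is the same line gradient descent converges to.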

  • @happypotato4252 • 1 year ago

    Excellent explanation of the implementation of linear regression, thanks!!

  • @davidmedina1886 • 3 years ago • +2

    This is a great video! Thank you so much!

  • @naitikpatil243 • 3 years ago • +2

    Waiting for series of such videos 🤩

  • @lashrack4548 • 2 years ago

    You got a new subscriber... best detailed explanation ever!!!

  • @bluestar2253 • 3 years ago • +14

    Yes, please do more videos like this one. Even Einstein gives it 2 thumbs up! :)

  • @atharezzeldin2802 • 5 months ago

    I really appreciate the time and effort taken to simplify such a thing. Thanks a bunch 🌹🌹🌹🌹

  • @fraggodgaming4433 • 2 years ago

    Thank you so much man! Great vid...Keep up the good work!

  • @baivabdattamajumder6568 • 1 year ago

    If you're counting from 0 to n, it means you have (n+1) data points, so shouldn't the dividing factor be 1/(n+1) when calculating the mean of squared errors at 4:43?

    • @l-unnamed802 • 11 days ago

      You are right, it should be like that. He just made an accidental mistake.

  • @yxyyy • 3 years ago • +5

    Next for logistic regression please!

  • @sryps • 3 years ago • +2

    love the videos, finally

  • @zezestyles6215 • 1 year ago

    Thank youuu :) Loved the way you explained it.

  • @actualBIAS • 11 months ago

    Brother, thank you. It just wouldn't get into my head. Thanks to you, I've understood it.

  • @mysticrustacean4065 • 1 year ago

    Brother! I simply love you after I came across this video!

  • @archer9056 • 9 months ago

    Amazingly simple explanation of linear regression. Please also cover the other techniques in the same way.

  • @deveshjain9543 • 1 year ago

    I implemented it myself and it came out more accurate than scikit-learn:

    import numpy as np
    from sympy import Symbol, solve, diff

    class LinearRegression:
        def __init__(self):
            self.w = {}

        def fit(self, x, y):
            # One symbolic weight per feature, plus w0 for the intercept
            for i in np.arange(x.shape[1] + 1):
                self.w[f"w{i}"] = Symbol(f"w{i}")
            e = 0
            for i in range(len(x)):
                yp = 0
                for j in np.arange(len(self.w)):
                    if j == 0:
                        yp += self.w[f"w{j}"]
                    else:
                        yp += self.w[f"w{j}"] * x[i][j - 1]
                e += (yp - y[i]) ** 2
            # Set every partial derivative to zero and solve exactly
            eq = []
            for i in np.arange(len(self.w)):
                eq.append(diff(e, self.w[f"w{i}"]))
            w = solve(eq, list(self.w.keys()))
            for i in np.arange(len(self.w)):
                self.w[f"w{i}"] = w[self.w[f"w{i}"]]

        def predict(self, x):
            def prediction(features):
                yp = 0
                for i in np.arange(len(self.w)):
                    if i == 0:
                        yp += self.w[f"w{i}"]
                    else:
                        yp += self.w[f"w{i}"] * features[i - 1]
                return yp
            return list(map(prediction, x))

  • @ahmedifhaam7266 • 2 years ago

    Dude did my 6-hour lecture + lab class in just 24 minutes, bruh. EZ Clap hahah, thanks a lot.

  • @aproudkafir7459 • 2 months ago • +1

    The flow should terminate at the global minimum, right? Is this program complete?

  • @madi6736 • 10 months ago

    Your explanations are really clear and precise, thank you very much ♥♥♥

  • @quegon6125 • 8 days ago

    I have a question regarding the expression (-2 / n). Shouldn't it be placed outside the for loop? It's a constant that should be computed once for the gradient, yet it is calculated on every iteration. 🙂
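
The commenter's suggestion can be sketched like this (the function is a hypothetical reconstruction; the names m, b, L mirror the video's): accumulate the sums inside the loop, then apply the constant -2/n once at the end:

```python
def gradient_step(m, b, points, L):
    """One gradient-descent step; -2/n is applied once, outside the loop."""
    n = len(points)
    m_sum = 0.0
    b_sum = 0.0
    for x, y in points:
        err = y - (m * x + b)
        m_sum += x * err
        b_sum += err
    m_grad = (-2 / n) * m_sum   # constant factored out of the loop
    b_grad = (-2 / n) * b_sum
    return m - L * m_grad, b - L * b_grad
```

Mathematically the result is identical either way (it is just factoring a constant out of a sum); hoisting it only saves a few redundant multiplications per epoch.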

  • @saxenayashxi3974 • 1 month ago • +1

    Loved it .......Thank you so much ❤❤

  • @anonymousvevo8697 • 2 years ago

    The math part was awesome! Thanks a lot, very clear and simple.

  • @bharth_yadav • 2 years ago • +1

    Yoo, out of all the explanations I saw, you did the best work. This is what I wanted. Best work, man. Subbed.

  • @omeshnath6826 • 1 year ago

    Got a good idea about linear regression, thanks bud!!

  • @throgmortonartstudio2402 • 2 years ago

    That was so much easier to understand. Thank you.

  • @trido3815 • 5 months ago

    Excellent explanation. Thanks

  • @drewprof • 9 months ago

    Nicely done!

  • @sajid_ahamed • 3 years ago

    Awesome video, and dark-mode paint rocksss.

  • @donowen9848 • 1 year ago

    Awesome representation!!!

  • @thomashoughton228 • 3 years ago • +1

    Why does 1/n become -2/n? As I read on Towards Data Science, it would stay 1/n and the 2 would appear after the sum symbol, as 2x_i. So why, or how, does 1/n become -2/n? Also, great video btw; it helped me unlock a few skills in the ML skill tree :)
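
The -2 comes from the chain rule when differentiating the squared term; writing it as -2/n is just factoring the constant out of the sum and merging it with the 1/n:

```latex
E = \frac{1}{n}\sum_{i=1}^{n}\bigl(y_i - (m x_i + b)\bigr)^2,
\qquad
\frac{\partial E}{\partial m}
= \frac{1}{n}\sum_{i=1}^{n} 2\bigl(y_i - (m x_i + b)\bigr)\,(-x_i)
= -\frac{2}{n}\sum_{i=1}^{n} x_i\bigl(y_i - (m x_i + b)\bigr)
```

Both placements of the 2 are the same expression; constants can move freely through a finite sum.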

  • @Meerkat_33 • 13 days ago

    Where can I get the dataset? This is all new to me and I can't figure out how to get a test script going on my end; I keep hitting dead ends.

  • @amirnaeej7986 • 1 year ago

    You’re a life saver thank you soooo much❤❤

  • @mohammadf9646 • 10 months ago • +1

    well done

  • @lamiyanabaova3020 • 2 years ago

    Thanks a lot for the great explanation!

  • @JonasHolzbrecher • 1 year ago • +1

    This is amazing!
    How can I store the current m and b values for every iteration?

  • @chadstrachan9696 • 1 year ago

    Great video, very informative

  • @roomian • 1 year ago

    I don't understand the reasoning behind [m * x * b for x in range(20, 80)]. Why is this the y-axis? What does it represent? Why that equation? Also, why aren't we just reading off the found values of x and y? I have understood all the calculations being done, but I don't understand the visualization/graph part. Please let me know if you see this.

    • @OfficialMazLi • 1 year ago

      He did it all wrong, just ignore it imo.
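
For what it's worth, that plotting line computes the y-values of the fitted line from a range of x-values so the line can be drawn over the scatter of raw data; it should read m * x + b (the regression equation), not m * x * b. A sketch, where the m and b values are made-up stand-ins for the fitted ones:

```python
m, b = 1.2, 7.0                  # assumed fitted slope and intercept
xs = list(range(20, 80))         # x-values to draw the fitted line over
ys = [m * x + b for x in xs]     # y = m*x + b at each x: points on the line
# Plotting xs against ys (e.g. plt.plot(xs, ys)) draws the regression line
# on top of the scattered data points.
```

So it is not "the y-axis" as such; it is the list of predicted y-coordinates of the line, evaluated at each x in the chosen range.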

  • @vallabhshelar3176 • 2 years ago

    Loved it bro, you're just amazing.

  • @felixliang7558 • 4 months ago

    Great video!

  • @dronedangwal447 • 1 year ago • +1

    Nice video! Could you please provide the dataset that you used?

  • @allenallen5136 • 1 year ago • +1

    Combines theory with practice and shows the calculation procedure. Really good 👍

  • @minuet6919 • 3 years ago

    Lmao, I finished summer school for Algebra 1, where we just learned this (we used a calculator to compute it). This is an amazing video; the mathematics was correct and your explanation is amazing!

  • @steviegillen82 • 5 months ago • +1

    Excellent video, although I will need to brush up on my mathematics!! I think I am one of the few who loves looking at formulas and wondering, "What in the hell is going on there?" 😆😆

  • @kuroisan2698 • 1 year ago

    Great video, NeuralNine.
    I feel this is gradient descent, is it not?
    If it is, is there an implementation for least squares? Because this feels somewhat random: you take some values and choose the smallest among them, which is not necessarily the smallest value you can get.

  • @shashwatbalodhi4042 • 1 year ago

    I am getting an error:

    AttributeError                            Traceback (most recent call last)
    Cell In[10], line 39
         37 if i % 50 == 0:
         38     print(f"Epoch: {i}")
    ---> 39 m, b = gradient_descent(m, b, data, L)
         41 print(m, b)
         43 plt.scatter(data.studytime, data.score, color="black")

  • @piyush9555 • 2 years ago

    Wow, that is exactly what is called simplicity.

  • @milorday5141 • 2 years ago

    Thank you, very useful and clear.

  • @quaka96 • 3 years ago • +1

    Why the gradient descent method? The pseudo inverse solves this immediately

    • @CptAJbanned • 1 year ago

      Make a video about it and link me, plz
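
A sketch of the pseudo-inverse route mentioned above, on made-up data, assuming NumPy (`np.linalg.lstsq` does the same job more stably in practice):

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=100)
y = 3.0 * x - 2.0                          # exact line, no noise

X = np.column_stack([x, np.ones_like(x)])  # columns [x, 1] -> w = [slope, intercept]
w = np.linalg.pinv(X) @ y                  # Moore-Penrose pseudo-inverse solution
```

This solves the least-squares problem in one shot, whereas gradient descent reaches the same coefficients iteratively.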

  • @farukesatergin8129 • 2 years ago • +1

    Where can we get the CSV file he worked on?

  • @ertugrulturkseven7565 • 5 months ago

    Where can we find the data you used? I would be happy to try the code. Thanks for making these videos.

  • @fishfanaticgokul4758 • 1 year ago

    Bro, how is the model created from these mathematical outputs, and how does the predict function work?

  • @sneha.tiwari • 2 years ago

    Hello, do you have a video or notes on plotting the linear regression when there is more than one attribute in the input data points? Say 10 columns of x and 1 column of y?
    Please respond.
    Thank you.

    • @sairekhaunnam3001 • 2 years ago

      I too need an explanation of this. It would be really helpful.

    • @sneha.tiwari • 2 years ago

      @@sairekhaunnam3001 Hey, if there is one attribute, we can plot it in 2D; if two attributes, then in 3D; and for three attributes we would need 4D, which is not possible visually. That's why we restrict ourselves.

  • @Graverman • 3 years ago • +1

    here to learn something that I already know but the video is too entertaining to click off

  • @debashishsarkar649 • 1 year ago • +2

    Hi sir, it really helps me understand linear regression and build it myself from scratch. Thanks 👍 Can you describe all the models from scratch like this?

  • @phoneix24886 • 1 year ago

    Why did we take a partial derivative?

  • @shubhamchoudhary5461 • 3 years ago

    That's a pretty good lecture.. 😍

  • @adejumoadeniyi5152 • 2 years ago

    This was really helpful

  • @munazirhussain5450 • 11 months ago

    Which software do you use for the video?

  • @ashenbandara6903 • 5 months ago

    Amazing, crystal clear.

  • @nilothpalbhattacharya8230 • 1 year ago

    This is the real algorithm. Great :)

  • 2 months ago

    The defined loss function is not called anywhere; why?

  • @annawilson3824 • 2 years ago

    What's the point of printing epochs if we see nothing on the screen in that regard, lol? Why is it "highly mathematical" if we do not even derive anything? I would redo the video with loss_function included in the print; otherwise it just hangs in the code for God knows what reason.

  • @anidea8012 • 2 years ago

    Please provide the dataset you used, because it will help us follow along with you.

  • @lucasgonzalezsonnenberg

    Hi, I really like your videos. One question: what is h?
    Thank you!

  • @sriram2k4 • 2 years ago

    Amazing video!!!!

  • @gamingwithdingo • 7 months ago • +1

    I love this guy

  • @shaileshchauhan7855 • 2 years ago

    Why did you decrease the number of epochs? And can't the minimum be found by setting the partial derivatives to 0?

    • @CptAJbanned • 1 year ago

      Setting one partial derivative to zero won't necessarily give the minimum of the loss function, since the loss also depends on the other variables/features.
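
For plain linear regression the loss is convex, so setting *both* partial derivatives to zero simultaneously does find the minimum; it yields a 2×2 linear system (the normal equations). A sketch on made-up numbers, assuming NumPy:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.1, 5.9, 8.2, 9.8])   # roughly y = 2x

# dE/dm = 0 and dE/db = 0 reduce to the normal equations:
#   m * sum(x^2) + b * sum(x) = sum(x * y)
#   m * sum(x)   + b * n      = sum(y)
A = np.array([[np.sum(x**2), np.sum(x)],
              [np.sum(x),    float(len(x))]])
rhs = np.array([np.sum(x * y), np.sum(y)])
m, b = np.linalg.solve(A, rhs)             # exact minimizer, no iteration
```

Gradient descent is still worth teaching here because the same update rule carries over to models where no such closed form exists.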

  • @tusharvashistha190 • 1 year ago

    amazing content

  • @muhammadhammadsarwar698 • 8 months ago

    Can anyone tell me a practical use case or an example where we can implement this? It would be great if anyone could give me a full scenario.

    • @lsgamingff6695 • 1 month ago

      On its own this won't be very useful, but as we keep learning we can definitely find more models that are used to predict many useful things. Right now, though, it is not that useful by itself.

  • @kaleemahmed3300 • 3 years ago

    Great explanation.
    Please upload more model implementations.

  • @mehdibouhamidi4675 • 3 years ago

    Can you do the same for support vector machines?

  • @eslamashraf5847 • 1 year ago • +1

    Amazing explanation, thanks a lot!
    Can you upload the CSV file and send me the link, please?

  • @naturemusic7391 • 2 years ago

    There is an improvement to be made, sir. While discussing the error, you keep calling the function the mean squared error, but that is not quite the error function: the function we need to minimize is the cost function. The loss (or error) function measures the error on a single training example, while the cost function is the sum of the errors over the whole training set.

  • @navardowilliams7484 • 1 year ago

    You sound like Tech With Tim 😂😂😂

  • @MultiCodFTW • 2 years ago • +1

    @NeuralNine How do I test the trained models to give a prediction based on a newly given value/independent variable (a value that is foreign to the training set)?
    Great video btw!

    • @adrianbitsinnie1537 • 1 year ago

      Yes! This is a good question I hope he answers. If he already has, can someone link me please? :)

  • @keyurpancholi4277 • 2 years ago

    Where is the CSV file?

  • @rockymani94 • 3 years ago

    LSTM, RNN, logistic regression, and more; we are expecting more from you.

  • @davidlanda2324 • 1 year ago

    You should not just "think" that it is the best line; you should verify it!