Dude, I can't stress enough how amazing I felt when I saw that you uploaded a video, it's been a while, hope you are doing great!
still breathing … good
This is the canonical least-squares problem from optimization theory; some linear algebra can help us skip the iteration entirely and directly find the best-fit line (or subspace, in the R^n -> R case). Great video btw.
You are alive .
Was waiting for you.....
Welcome back😊
Linear regression is usually done with the help of linear algebra, via matrix factorizations and the method of least squares. Gradient descent is usually used when no non-iterative method is available -> like fitting ML models
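The non-iterative route this comment describes can be sketched with NumPy's least-squares solver; a minimal example (the data points here are made up):

```python
import numpy as np

# Toy data, roughly on the line y = 2x + 1
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# Design matrix: one column for the slope, one column of ones
# for the intercept.
A = np.column_stack([x, np.ones_like(x)])

# Solve the least-squares problem A @ [m, b] ~= y in one shot,
# no iteration needed.
(m, b), *_ = np.linalg.lstsq(A, y, rcond=None)
print(m, b)
```

Under the hood this is a matrix factorization (SVD), which is exactly the "noniterative" path: it lands on the optimum directly instead of stepping toward it.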
how is the new outro?
Loving it, fantastic job
the intro is soo good!! dont change
and also thats not simple maths cz i don't remember anything
is that what made you take so long?
yes its good
dude i missed your videos
Don't let the channel die.
I'm learning to code on Odin, and bookmarked it.
Soon I'll be able to get it.
finally the penguin is back, need more content!
Woa, watching this before 8am... this was too much thinking for me this early :) needs a warning at the beginning hah! Oh man, just brings back memories from College. My nephew texted me the other day, asking if calculus was important for programming. I think you came up with a decent example of this in use. I've never had to use calculus in over 20 years of programming, but I do think that, depending on the job, some will use it. My best advice/guidance to him was that all of the math that I took in college really helped with problem solving, and really taught me to think at a very deep level.
What a great way to start a (lunar) new year! Welcome back!
200k subs! Congratulations!
Finally the man who taught me randomness is back
Please make more such videos, as you said covering gradient descent.
I really wanna watch an explanation and application of Neural networks by you. I know it might be a long video, and hard to follow through, but just, train us, like bit by bit, teach us from the ground up. Keep this up. Also, can you please provide a list of topics that should be learned, to reach the complexity of neural networks? Like a roadmap for machine learning concepts like this, that would really help out.
Thanks.
Love your videos❤.
Love the simple explanation and visuals to explain these problems. Really helps!
its a good day when pwn uploads
Heyyy! Really nice to see you back bro! 😊
Hey! Welcome back! Nice to see you again.
Welcome Back Man. Was missing you.
babe, wake up, pwnfunction posted!
Yo! How do you make your videos? What software do you use? Where did you learn to use it? And, how much time did it take for you to actually learn it?
Hello, would you like to share the name of the software you're using for editing videos? It looks quite awesome!
This bro is the real top G. You know man..
14:58 indent error line 57
This is why i hate python
damnn a video after a year, less gooo
Welcome back! 🎉
THE GOAT IS BACKKK
The YouTube compression hates your gradients and your axes
feels good seeing you alive
The legend is back
Hi Pwn, the link (and its label) in the description are from the previous video, you forgot to change them when pasting.
My bad, thanks!
Woow...i thought i was dreaming. Welcome back
Amazing content, keep it up!
How did you know you had to subtract the learning_rate * dedm and learning_rate * dedb when adjusting m and b? Why not add?
Nice to see you're alive...
The legend is back!
please make more content like this
Using machine learning for linear extrapolation is like hunting ducks with a GAU-8 gatling gun
oh, so you're alive? cool
cool vid! whats that code theme?
you would be a great teacher
Any reason for 'm' instead of 'a'.
I know
y = ax+b
Variable names are arbitrary?
Bringg more videosss
PWN STILL ALIVE!
I know elementary school linear regression but I'm lost immediately at 6:30 with this weird python notation. We're unpacking a list, zipping it together, and then casting it to a list again?
Think of zip as a transpose. And since zip is lazy (it returns an iterator), the outer list() forces it to run and actually produce the data.
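A tiny concrete version of that idiom (the points here are made up):

```python
points = [(1, 2), (3, 4), (5, 6)]

# *points unpacks the list into separate arguments, so this call is
#   zip((1, 2), (3, 4), (5, 6))
# zip then groups the first elements together, then the seconds:
# it transposes rows into columns.
transposed = list(zip(*points))   # the outer list() drives the lazy zip
print(transposed)
```

So a list of (x, y) points becomes one tuple of all the xs and one tuple of all the ys, which is handy for plotting or regression.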
long time no see bruh
sounds like least squares but overcomplicated for the sake of buzzwords
What happened to the intro? I liked it soo much
yooo he's back
Omg he's back
at 1:15 the line should rotate around x=0
quality content
Show me how to make such a model for casino crash games. Based on the time of the end of the rounds and its coefficient. We will be very grateful to you.
he back :D
I thought you were dead
Wtf
There's no way my man just did gradient descent when there's a simple closed-form formula for linear regression lmao. Fair enough if you want a simple example, but I feel like this neither shows the power of gradient descent nor an efficient way to find a line of best fit.
pro tip for a beginner content creator :) don't do gradient background, the banding over yt bitrate is horrible ^^
Bro forgot indent at line 56 15:01
Why no new videos?
Where have you been?
you should keep the glasses. thats much better. 3:11
A video can have no sound effects or editing at all, but as soon as money is mentioned there's got to be that cha-ching sound... At this point I suspect this is a YouTube bug.
hi pwn
👋
"y" is almost the same as "4" in this font
i was waiting for a pwning video :/
Missing the old voice bro, 😭
Oh god... please don't use gradient descent for linear regression.
Fantastic job, it's wonderful content; the kind of depth and clarity you have about the concept is commendable, and you've ignited my curiosity about maths.
If possible please make a video on how you learn things in depth and breadth. Seriously wonderful job dude.
🫡
Or you can just look at the principal eigenvector of the covariance matrix ;)
Edit: actually, it's way simpler (you can just analytically solve d/dm = 0 straight away):
m = (y' . x') / (x' . x')   (dot products)
b = ybar - m * xbar
where x' = x - xbar, y' = y - ybar, and xbar, ybar are the averages of the vectors.
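A quick numeric check of that centered closed-form solution (a minimal sketch, assuming NumPy; the data points are made up):

```python
import numpy as np

# Toy data, roughly on the line y = 2x + 1
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# Center both vectors, then one dot-product ratio gives the slope.
xc = x - x.mean()
yc = y - y.mean()
m = (yc @ xc) / (xc @ xc)

# The fitted line passes through the mean point, giving the intercept.
b = y.mean() - m * x.mean()
print(m, b)
```

This is exactly where solving dE/dm = 0 and dE/db = 0 for the squared error lands, with no iteration at all.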