How I think about Logistic Regression - Part 1
- Published Jul 26, 2024
- A (hopefully) simple and intuitive explanation of logistic regression for binary classification.
Intro and Overview 00:00-02:15
How to think about the Threshold 02:15-04:35
Assigning Likeliness 04:36-08:14
Maximum Likelihood 08:15-11:02
Finding the Best Threshold 11:03-11:34
Recap 11:35-12:35
Part 2: • How I think about Logi...
The Math Behind Logistic Regression: • How I think about Logi...
Part 3: Coming Soon!
Gradient Descent Video: • How I think about Grad...
Some Reading on Probability: / overview-of-probability
Visualization and animation code on GitHub: github.com/gallettilance/repr...
Thumbnail by / endless.yarning
#machinelearning #logisticregression #education #classification #optimizationalgorithm #explained #mathexplained #machinelearningalgorithm #mathformachinelearning #machinelearningbasics #datascience #datasciencebasics #linearregression #probability #probabilitytheory - Science & Technology
FKING GODSEND… BLESS UR LIFE… BLESS UR FAMILY… U’RE A GREAT TEACHER
this made my day thank you!! :)
Wow, this channel needs to explode in views!
That was a pleasure to watch.
So glad to hear it!
Yet another one of those videos that, in your head, you think have 100k+ views but it turns out we're just lucky to be here first.
🙏🙏🙏 that’s so encouraging and kind! Thank you!
This is actually also a good introduction to Maximum Likelihood Estimation, even though you didn't mention the method explicitly.
@@Giovimax98 Thanks! Yeah I found the name intimidating and distracting more than helpful
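For anyone who wants to connect the video's "Maximum Likelihood" chapter to the textbook name, here is a rough sketch of the objective being maximized (the notation here is assumed, not taken from the video): with binary labels y_i in {0, 1} and sigmoid probabilities, logistic regression picks the parameters that make the observed labels most likely.

```latex
% Sketch of the maximum-likelihood objective for logistic regression
% (notation assumed for illustration, not taken from the video):
\hat{\theta} \;=\; \arg\max_{\theta} \;\prod_{i=1}^{n} p_\theta(x_i)^{\,y_i}\,\bigl(1 - p_\theta(x_i)\bigr)^{1 - y_i},
\qquad
p_\theta(x) \;=\; \frac{1}{1 + e^{-(w^\top x + b)}}
```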
This was so helpful! A really great, new perspective (and way of explaining it)
Awesome!
really great video
Great video, excited for more
Thank you so much for watching!! Part 2 (and 3!) should be out real soon
Great video
thanks so much!! I hope you enjoy part 2 when you get around to it :)
Awesome. Question: why would you be adjusting the constants, i.e., in the basic hours/exam-length chart? The final probability went up, but what does that two-fold increase represent in the real world, not in math language?
Thanks so much for watching!! And great question!
As you change these parameters, you generate different probabilities across your space, allowing you to describe it better (and thus make better predictions, etc.). So it's less about what these parameters **mean** and more about what they let you **do** (which is to mold the sigmoid function to the data).
In logistic regression there is an interpretation of the parameters as increases to the log-odds, but that's pretty mathy and, as far as I can tell, just happenstance rather than by design (happy to elaborate on this if you want). In probit regression, for example, there is no such interpretation, yet the mechanism is the same. Similarly for neural networks.
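A minimal sketch of that point, using the commenter's hours-studied / exam-length example with made-up parameter values (none of these numbers come from the video): changing a weight reshapes the sigmoid, and therefore the probability it assigns to the same student.

```python
import math

def predict_prob(hours_studied, exam_length, w_hours, w_length, b):
    """Sigmoid of a weighted sum: the weights and the constant mold the S-curve."""
    z = w_hours * hours_studied + w_length * exam_length + b
    return 1 / (1 + math.exp(-z))

# Same student, two different parameter settings (illustrative numbers only):
print(predict_prob(5, 2, w_hours=0.8, w_length=-0.5, b=-2.0))  # ~0.73
print(predict_prob(5, 2, w_hours=1.6, w_length=-0.5, b=-2.0))  # ~0.99, a steeper and more confident curve
```

The weights don't carry a tidy real-world meaning on their own; their job is to bend the S-curve until the probabilities it produces match the data.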
@@howithinkabout Ah, I see. It makes sense that you'd want to shape the sigmoid by manipulating the parameters, but intuitively it's still hard to convert the abstract constant value into a tangible example. As von Neumann said, "in mathematics you don't understand things. You just get used to them."
Waiting eagerly for the next vid!
@@mrjackrabbitslim1 I'm more of the opinion that if you don't understand it, someone's not explaining it well enough :) I made an animation just for you, github.com/gallettilance/reproduce-those-animations/blob/main/examples/linear_function.gif, to demonstrate what happens to the threshold as you change the parameters. They let you rotate, shift, and center the sigmoid function; the constant specifically is responsible for shifting things. I'll try to make this more clear in part 2 - thanks for sharing your thoughts!
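To make the constant's "shifting" role concrete, here is a one-feature sketch (the w and b values are arbitrary, chosen only for illustration): the sigmoid crosses 0.5 exactly where w*x + b = 0, i.e. at x = -b/w, so changing b slides that threshold left or right without changing the curve's shape.

```python
import math

def sigmoid(x, w, b):
    """Probability assigned to x by a one-feature logistic model."""
    return 1 / (1 + math.exp(-(w * x + b)))

w = 1.0
for b in (-4.0, 0.0, 4.0):
    threshold = -b / w  # the point where the sigmoid crosses 0.5
    print(f"b = {b:+.1f}  ->  threshold at x = {threshold:+.1f}, "
          f"p(threshold) = {sigmoid(threshold, w, b):.2f}")
```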
@@howithinkabout wow, thank you!