Logistic Regression Details Pt1: Coefficients

  • Published 22 Nov 2024

COMMENTS • 817

  • @statquest
    @statquest  5 years ago +84

    Correction:
    15:21 The left hand side of the equation should be “log(odds Obesity)” instead of “size”.
    NOTE: In statistics, machine learning and most programming languages, the default base for the log() function is 'e'. In other words, when I write, "log()", I mean "natural log()", or "ln()". Thus, the log to the base 'e' of 2.717 = 1.
    Support StatQuest by buying my book The StatQuest Illustrated Guide to Machine Learning or a Study Guide or Merch!!! statquest.org/statquest-store/
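    That base-e convention is easy to verify; a minimal Python sketch (standard library only):

    ```python
    import math

    # In statistics, ML, and most programming languages, log() means natural log (base e).
    print(math.log(math.e))   # exactly 1.0
    print(math.log(2.717))    # ~0.9995, i.e. the log base 'e' of ~2.718 is 1
    ```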

    • @borispenaloza6788
      @borispenaloza6788 5 years ago +1

      Josh.. thanks for these videos man.. one question did you mean obese = log(odds normal_gene) x B1 + ... instead of size = log(...???

    • @statquest
      @statquest  5 years ago +1

      @@borispenaloza6788 What time point in the video (minutes and seconds) are you talking about?

    • @borispenaloza6788
      @borispenaloza6788 5 years ago +1

      @@statquest Starting at 15:23.. you mentioned obesity but the equation shows size...

    • @statquest
      @statquest  5 years ago

      @@borispenaloza6788 Ahh, I see. That's a typo.

    • @borispenaloza6788
      @borispenaloza6788 5 years ago +2

      @@statquest yes.. but it was a great explanation!

  • @kuntalnr
    @kuntalnr 3 years ago +410

    I am very emotional when writing this. I was struggling to learn logistic regression until I came to this channel and it has really transformed my understanding and confidence. I love how this channel uses visual tools and graphs to explain the concepts instead of some heavy dose of equations. This channel is a blessing to students like us who struggled during pandemic with classes.

    • @statquest
      @statquest  3 years ago +46

      Hooray! I'm glad the video was helpful! :)

    • @sattanathasiva8080
      @sattanathasiva8080 3 years ago +6

      This is sooooo true, this channel is a blessing for students like us, and the way you're explaining with practical examples is like I've found my heaven. Many many thanks for these videos. You are one of my best teachers in stats.

    • @roachspray
      @roachspray 3 years ago +3

      im with u on this, the pandemic has made me lose so much of motivation in my studies but we can always bounce back :) lets get thru the semester together!!!

    • @TheyCalledMeT
      @TheyCalledMeT 3 years ago +3

      your entire story underlines the question of why we ACTUALLY need a university .. it costs a fortune and the professors tend to explain it worse than a YT video ...
      ofc .. there are fields where it's much much harder to put all the stuff into a short well done video .. but oof .. the more i learn on the job from other fields than mine .. the more i get the impression universities should be used to support learning, not to be the be-all and end-all of education .. especially not when it's incredibly expensive and/or utter bs people study (f.nist glaciology for example..)

    • @asmojo5125
      @asmojo5125 2 years ago +7

      @@statquest DOUBLE BAM !!

  • @jacktesar15
    @jacktesar15 4 years ago +275

    Josh, I just want you to know you are the only reason I will graduate from my MS in Stats program

    • @statquest
      @statquest  4 years ago +14

      Wow! Good luck!

    • @MrSpiritmonger
      @MrSpiritmonger 4 years ago +25

      yea man, I took biostats three times in my life (undergrad, masters, PhD) and the only time things REALLY made sense at the intuitive level is watching StatQuest explanations.

    • @SpecialBlanket
      @SpecialBlanket 4 years ago +5

      @@MrSpiritmonger I have a graduate degree in pure math and I'm on here watching these so I can learn how to succinctly summarize things to my nontechnical boss (in this case I actually don't know the concept at all and this time it's for me, but that's how I found the channel)

    • @temptemp6222
      @temptemp6222 3 years ago

      Saaame!

    • @Aziqfajar
      @Aziqfajar 1 year ago

      How was it?

  • @kritisk1
    @kritisk1 3 years ago +9

    Josh, ur my stats savior. Everyone else starts teaching with a very serious "let's get to the topic", but u soothe us with that gentle guitar beat and make us feel so comfortable. I just don't know how to express my gratitude, still thanks a ton and I mean it

    • @statquest
      @statquest  3 years ago +1

      Thank you very much! :)

  • @aarthihonguthi1208
    @aarthihonguthi1208 22 days ago +2

    6 years down the line, and your videos are still the best:) Thanks Josh❤

  • @adityamankar8910
    @adityamankar8910 4 months ago +4

    This is insane. Logistic Regression is one of the fundamental regression algorithms, yet no one is able to explain it with such clarity.

  • @中国杨
    @中国杨 2 years ago +10

    Can’t believe I’m saying this. Right after I finished watching a video of someone doing fancy snowboarding tricks, I started binge-watching your statquest videos and I got so addicted... I started hating stats last term because of a boring prof but you saved my butt!!!

  • @amandineg2911
    @amandineg2911 4 years ago +111

    You're an absolute gem of human being. It takes a special talent and surely a lot of work to be able to explain these concepts so clearly ! Thank you so much for sharing all of this with us :)

    • @statquest
      @statquest  4 years ago +9

      Thank you very much! I really appreciate your support - it means a lot to me. :)

  • @AnilJacobs
    @AnilJacobs 4 years ago +3

    Absolute Stunner! I sat through hours of classroom lectures and mentoring, yet couldn't understand it as simply as you explained it in this video. Thank you!

    • @statquest
      @statquest  4 years ago

      Thank you very much! :)

  • @stats2econo
    @stats2econo 1 year ago +6

    Respected sir. I teach econometrics to scholars with minimal fees to support them in research. For every batch in mle and logit model I share your videos with full confidence. We all are thankful to you. I can see your dedication and love for your work. Thank you so much.

  • @mostinho7
    @mostinho7 4 years ago +21

    4:50 in linear regression, y axis can have any value (which makes it easier to solve) but for logistic regression y values confined between 0 and 1 as it represents the probability. To deal with this, we transform the y axis to be log odds instead of probability
    Transforming probability to log odds using the logit function, logit = log(p/(1-p))
    5:50 how the axis changes
    The old y axis (in probability) that went from 0.5 to 1 goes to 0 to infinity (in log odds)
    8:00 the log odds y axis transforms the curved plot to a linear plot
    8:20 the coefficients of logistic regression are for the linear plot of log odds axis
    8:48 you get coefficients of the kind just like linear regression
    9:30 don’t understand, check odds ratio stat quest
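    The probability-to-log(odds) transformation summarized in the notes above can be sketched directly; a minimal Python example (the probability values are illustrative):

    ```python
    import math

    def logit(p):
        """Transform a probability in (0, 1) to the log(odds) scale (-inf, +inf)."""
        return math.log(p / (1 - p))

    # p = 0.5 maps to the center of the log(odds) axis:
    print(logit(0.5))   # 0.0
    # probabilities above 0.5 map to positive log(odds), below 0.5 to negative:
    print(logit(0.88))  # ~1.99
    print(logit(0.12))  # ~-1.99
    ```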

  • @younghoe6849
    @younghoe6849 4 years ago +1

    Not easy to find a teacher who can explain in this way. Really talented teacher.

    • @statquest
      @statquest  4 years ago

      Thank you very much! :)

  • @hcgaron
    @hcgaron 6 years ago +3

    Your channel is truly excellent. I watch these videos, then read my textbook, then my lecture, finally complete / code my results. It’s proving very helpful.

    • @statquest
      @statquest  6 years ago +1

      Thank you! I'm glad my videos are so helpful. :)

  • @rankzkate
    @rankzkate 4 years ago +3

    Am on a StatQuest marathon. Addicted to these videos. Glad I found them. Asante sana (thank you very much). Much love from Kenya

    • @statquest
      @statquest  4 years ago

      Wow! Thank you very much! :)

  • @nguyentruongaccacfa5538
    @nguyentruongaccacfa5538 1 year ago +2

    Hi Josh, I'm from Vietnam. I have read a lot of literature related to econometrics but it is only after watching this video that I really understand what maximum likelihood estimation is. This is really the pinnacle of Mathematics!

  • @myip05
    @myip05 2 years ago +1

    kudos to you, not everyone learns the same, you show everything explicit and on simple terms. Thank you for doing this.

  • @hgr126
    @hgr126 10 months ago +2

    Over the past couple months I have been self learning stats and data science and struggled a lot even though I genuinely loved learning about them. I am a visual learner so it was so hard to comprehend many simple concepts sometimes. I spend many hours a day on my own accord studying, and most of them are wasted on trying hard to understand a concept visually. Even though ChatGPT saved me a lot, they have so much trouble visualising concepts. Now I chanced upon your channel, I am just honestly dumbfounded by how visual your explanations and how good they are. Just speechless, and I just wanna leave a comment of appreciation. I sometimes feel proud of myself being able to visualise simple concepts myself, but those are nothing compared to your visualisations. This amount of visualisations especially for complex topics 1000% requires a certain talent and an extreme understanding of the concepts. Wishing for your continued success ahead

    • @statquest
      @statquest  10 months ago +1

      Thank you very much and good luck with your studies!

    • @hgr126
      @hgr126 10 months ago +1

      ​@@statquest Thanks! 2 days of watching several videos, I couldn't help it but decide to support you by just placing an order for your book from amazon today. :') ps: i genuinely dont buy books often, maybe once in every few years. That's how grateful i am. Thank you Josh

    • @statquest
      @statquest  10 months ago +1

      @@hgr126 Awesome! Thank you very much for your support!

  • @jiayiwu4101
    @jiayiwu4101 3 years ago +1

    When I first watched your linear reg videos, I thought they were not as good as the probability videos. After I watched your linear + logistic reg videos, I knew I was totally wrong. You link those two brilliant things together in such an easy and intuitive way. So BRILLIANT!!!

    • @statquest
      @statquest  3 years ago

      Wow! Thank you very much! :)

    • @jiayiwu4101
      @jiayiwu4101 3 years ago

      @@statquest If you could add assumptions explanations about linear and/or logistic regression, that would be fantastic. A lot of interviews, from financial quant to tech industry, they love to ask those.....

  • @שיגלר-ח8ת
    @שיגלר-ח8ת 5 years ago +8

    StatQuest, you are absolutely the best video material on YouTube!! It's funny but it's also in-depth and complete. I wish I could learn all of my academic courses with you.

  • @danieljohnson220
    @danieljohnson220 3 years ago +5

    i am currently doing a degree in Comp Sci and AI and am regularly referring to your videos as they explain the concepts so well. Really useful channel!

  • @p.a.sierra3526
    @p.a.sierra3526 4 years ago +13

    I'm from Chile, and i just want to say you: YOU ARE AN AMAZING TEACHER, PERIOD!

  • @shubhamlahan
    @shubhamlahan 4 years ago +32

    JOSH, just like your videos, your music is incredible. Thank you for all the efforts you put in. Quadruple BAM !!!

    • @statquest
      @statquest  4 years ago +1

      Thank you very much! :)

  • @joanatomeribeiro
    @joanatomeribeiro 4 years ago +6

    Thank you so much! Your clarity is brilliant! If all the teachers in the world explained like you, there wouldn't exist such a thing as bad students.

  • @RaviShankar-jm1qw
    @RaviShankar-jm1qw 4 years ago +7

    Could not resist joining your channel after seeing this video. Damn, you are genius Josh and indeed a blessing for people like us who get overwhelmed by Statistics due to the heavy theory prevailing everywhere. Absolutely loved the numbers approach which shows how the logistic regression is calculated!

    • @statquest
      @statquest  4 years ago +1

      Thank you so much!!! I really appreciate your support. :)

  • @tanbui7569
    @tanbui7569 3 years ago +6

    Josh, your videos can never give me a break. Always have something new to note down. I've been practicing Logistic Reg and Deep Learning for a while but I never knew Logistic Reg is grounded in log(odds). I do not think any ML/DL books or courses actually cover log(odds), at least in all that i've read/studied. They only mentioned the activation function which is the sigmoid. Thank you for the awesome video as always.

    • @statquest
      @statquest  3 years ago

      Thank you very much! :)

  • @miamyos
    @miamyos 3 years ago +5

    Hi, I just wanted to say thank you for these videos. I have tried to help a friend with this for a while because her teacher isn't very good at explaining things and it's been hard because it's been almost a decade since I took a statistics course. Your videos are extremely helpful and your explanations are so good, I think both me and my friend have learned more in two videos than we have during the entirety of her course. Thank you ❤

    • @statquest
      @statquest  3 years ago +1

      I'm glad my videos are helpful! :)

  • @thej1091
    @thej1091 5 years ago

    I was doing the DATA SCIENCE CAPSTONE COURSE on linear regression! horrible teaching! Man you killed it! I finally understand log odds and how it manifests as log odds ratio! thank You! Statquest! The comparison to linear regression! My god! Great stuff and Great teaching!

  • @MichalJablonski-xe2ht
    @MichalJablonski-xe2ht 8 months ago +1

    Josh, you are a teaching rockstar the size of Walter Lewin. Thank you so much! Just bought the book. Loving it!

    • @statquest
      @statquest  8 months ago

      Thank you so much! :)

  • @WalyB01
    @WalyB01 4 years ago +3

    Statquest =infinite positivity

  • @gauravk4050
    @gauravk4050 5 years ago +16

    These videos totally help in getting an overview again from the start!! when you study so deep that you forget where you started!
    totally solves Occam's razor problem!! BAM

  • @badoiuecristian
    @badoiuecristian 4 years ago +1

    This literally could not have been any clearer. I now have 0 questions about this topic. Amazing teaching skills.

  • @Dr.HusseinEid
    @Dr.HusseinEid 1 year ago +2

    Hi Josh, I am from Somalia. Your videos helped me to understand logistic regression. Your explanation is very good, thank you😊

  • @wolfisraging
    @wolfisraging 6 years ago +14

    Finally we have an awesome tutorial on YouTube on this topic.
    Big fan😊

    • @statquest
      @statquest  6 years ago

      Hooray!!! I'm glad you like this video so much! :)

  • @ciancian2861
    @ciancian2861 3 years ago +1

    thank you! you really are a great teacher, there is nothing like a teacher who can explain something in such a simple way!

  • @mchakra
    @mchakra 1 year ago +1

    Josh, your explanations are spot on!

  • @jesusalbertoperezguerrero2560
    @jesusalbertoperezguerrero2560 3 years ago +3

    Thank you so much! You're one of the coolest and most talented teachers I've ever had!

  • @jamemamjame
    @jamemamjame 6 years ago +4

    I really love the text that is displayed while you explain.
    I'm not good at English. Sometimes I can't catch the words you say, but I still clearly understand your explanation thanks to that text.
    Thank you.

    • @statquest
      @statquest  6 years ago

      Hooray! I'm glad you like the text. :)

  • @هشامأبوسارة-ن7و
    @هشامأبوسارة-ن7و 8 months ago

    A very informative video. A slight observation re. terminology: in any regression modelling, taking your example of predicting the size of a mouse using weight as a predictor, what you're actually predicting is not the size of a mouse, but the expected size of a mouse given its weight, E[size | weight] = intercept + b0*weight. It's crucial to introduce that probabilistic reasoning at an early stage. The idea that we're trying to predict the expected value of the target variable given the value of a predictor (independent variable) reveals the method: we want to find the regression coefficient of weight that would make observing the sizes most likely, hence the famous MLE - Maximum Likelihood Estimation.

  • @Niceperson7428
    @Niceperson7428 4 years ago +1

    I watch StatQuest videos all the time and wonder about two things:
    1) Why is there no actual course, after which you could get a certificate, since the teaching methods, explanations and instructor are awesome?
    2) There are few, but still there are some people who click the 'dislike' sign on the video. I wonder why, as there is no doubt that the instructor explains everything so simply. For example, I checked the PCA videos for data reduction and StatQuest's explanation was the only one from which I finally got the idea of what is going on.
    So, BIG thanks to StatQuest...

    • @statquest
      @statquest  4 years ago +2

      Thank you very much! I would love to make an actual course - and maybe one day I will. Right now I'm spending all of my time making videos - that's my favorite part, so that's what I do. :)

    • @Niceperson7428
      @Niceperson7428 4 years ago +1

      @@statquest while there's no course, I will follow your videos :)

  • @jiayiwu4101
    @jiayiwu4101 3 years ago +5

    In summary, there are five graphs/functions within logistic regression.
    1. p-odds: y = odds = p/(1-p). this is a part of inverse function p belongs to [0,1] and odds belongs to [0,inf). Two parts are p1/2
    2. odds-log(odds): y = log(odds) this is a normal log function where odds belongs to [0,inf) and log(odds) belongs to (-inf,+inf). Corresponding two parts are 0
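    The first two mappings in the (truncated) list above can be illustrated with a minimal Python sketch:

    ```python
    import math

    def odds(p):
        """Map probability p in [0, 1) to odds in [0, inf)."""
        return p / (1 - p)

    # 1. p -> odds: odds < 1 when p < 1/2, odds > 1 when p > 1/2
    print(odds(0.25))  # 0.333...
    print(odds(0.75))  # 3.0

    # 2. odds -> log(odds): (0, inf) maps onto (-inf, +inf),
    #    with odds = 1 (i.e. p = 1/2) mapping to log(odds) = 0
    print(math.log(odds(0.5)))   # 0.0
    print(math.log(odds(0.75)))  # ~1.099
    ```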

  • @leavonfletcher4197
    @leavonfletcher4197 3 years ago +2

    Greetings from the University of Texas at Austin! I am going through a Machine Learning class and your classes are totes useful! Thanks!

  • @katharinastck
    @katharinastck 7 months ago

    I love the way you explain complex concepts in such a simple and understandable way! I'm currently doing a professional training in machine learning and many things that are just brought to me by complex equations are so much easier to grasp with your videos!
    If I could wish for something, it would be some additional Python videos. You do a lot with R, but I feel like Python is requested more often in the non-academic world.

    • @statquest
      @statquest  7 months ago

      Thanks! I'm starting to add more python videos.

  • @ranfuchs3592
    @ranfuchs3592 4 years ago +2

    Brilliant and clear. Makes a relatively complex topic really simple. Thank you

  • @aryapranahutama6569
    @aryapranahutama6569 5 years ago +1

    I see why you made such a wonderful video about statistics. It is explained by the song at the beginning of this video!! You made it with joy and passion. Thanks so much for your videos

  • @mrstudent1957
    @mrstudent1957 4 years ago +3

    i recommend your channel whenever people ask me where i studied

    • @statquest
      @statquest  4 years ago +1

      Thank you very much! Sharing my videos is the best compliment. :)

  • @eamonnca1
    @eamonnca1 4 years ago +1

    Our prof speaks broken English and does not explain this stuff very well. You are a life-saver!

  • @kunalramchurn4700
    @kunalramchurn4700 2 years ago +1

    Mr Statquest, you are the best sir!

  • @joseamaldonadomartinez480
    @joseamaldonadomartinez480 4 years ago +4

    I learned a lot watching your logistic regression playlist! Thanks for making these videos!

  • @tallwaters9708
    @tallwaters9708 6 years ago +3

    Good man! It's easy to underestimate how much is in logistic regression.

  • @pris3675
    @pris3675 4 years ago +2

    I JUST LOVE UR DRY HUMOR AND UR FUNNY INTROS. just watched the vids today and uve gained urself a new subscriber!! love from singapore :D

    • @statquest
      @statquest  4 years ago

      Hooray! Thank you so much! :)

  • @zukofire6424
    @zukofire6424 2 years ago +1

    Thanks Professor Josh Starmer (I have a presentation tmrw!!) sending gratitude and good vibes!

    • @statquest
      @statquest  2 years ago +1

      Best of luck!

    • @zukofire6424
      @zukofire6424 2 years ago +1

      @@statquest I passed! ^^

    • @statquest
      @statquest  2 years ago +1

      @@zukofire6424 TRIPLE BAM!!! Congratulations :)

  • @nurwani556
    @nurwani556 4 years ago +1

    Such a good, simple, clear video explanation!

  • @MrPainfulTruth
    @MrPainfulTruth 1 year ago

    The critical part is the link between a normal variable, the probability and the log odds results. You explained a bit of it and I didn't see any better video, but I'd be lying if I said I understand that step already to a degree that I could explain it to someone myself.

    • @statquest
      @statquest  1 year ago +1

      For more details on that step, see: ua-cam.com/video/BfKanl1aSG0/v-deo.html

  • @alexz7432
    @alexz7432 5 years ago +10

    This is the best song I heard from watching your channel so far :D Anyway, your clear explanation is awesome. Keep up the great work!

    • @statquest
      @statquest  5 years ago +2

      Thank you so much! :)

  • @shrikantagrawal8239
    @shrikantagrawal8239 5 years ago +1

    Triple Bam!!! Thanks a lot Josh. This really clears up a lot of confusion.

  • @Laura-up2rm
    @Laura-up2rm 3 years ago +1

    Discovering this channel is the best thing that has happened to me!!! GRACIAS!!!!

  • @manabou5790
    @manabou5790 3 years ago +1

    The best explanation ever. Thank you Sir.

  • @rrrprogram8667
    @rrrprogram8667 6 years ago +112

    One request... All the videos get watched in random order... But can you suggest a sequence to watch, so that the content becomes more structured as a learning path for ML?

    • @wizeguy9285
      @wizeguy9285 6 years ago +15

      You can go to the playlists option on the home page and watch them in order by topic

    • @XYZmmc
      @XYZmmc 5 years ago

      @@wizeguy9285 but they will not be in the correct order

    • @TheIsrraaa
      @TheIsrraaa 5 years ago +4

      OMG, find a full learning path for ML on the Internet. StatQuest is one of the best channels on YouTube, but ML is much more than just a playlist to follow.

    • @thulasirao9139
      @thulasirao9139 4 years ago

      Go to the playlists and you can see them in order. You need to subscribe for that.

  • @marcoventura9451
    @marcoventura9451 1 year ago +1

    Thank You for the beautiful and relaxing videos.

    • @statquest
      @statquest  1 year ago +1

      TRIPLE BAM!!! Thank you so much for supporting StatQuest!!! :)

  • @a.tanveer9663
    @a.tanveer9663 3 years ago +1

    Fantastic explanation, as always. For anyone looking for a deeper dive into the math, chapter 4 of 'An Introduction to Statistical Learning' is highly recommended.

  • @kayceeprag
    @kayceeprag 1 year ago +1

    Thanks Josh. I’m hooked. MS Data Analytics & Viz. in view 🙏🏿

  • @vinodaxisful
    @vinodaxisful 5 years ago +8

    Hi. I am trying hard to find the calculation for arriving at the standard error of the coefficient and the intercept. It would be helpful if this could be shared.

  • @rhlongwane6575
    @rhlongwane6575 6 years ago +9

    Best logistic regression tutorial.

    • @statquest
      @statquest  6 years ago +2

      Thanks so much!!! I'm really happy to hear you like this one. I was worried it would be too obscure.

    • @rhlongwane6575
      @rhlongwane6575 6 years ago +2

      StatQuest with Josh Starmer It is not obscure at all; just perfect.

    • @statquest
      @statquest  6 years ago

      Thank you! :)

  • @noname-jo7lz
    @noname-jo7lz 5 years ago +2

    You saved my day, best videos, 100 stars 4 u. Even though my test will be in German, I understood it better in English. Thank u!!

  • @mahmoudmoustafamohammed5896
    @mahmoudmoustafamohammed5896 3 years ago

    All the videos on your channel are awesome and really clearly explained. The only problem is that there is almost no real order to the videos, so each topic clusters specific videos. It would be super amazing if you could add them in order.

    • @statquest
      @statquest  3 years ago +1

      They are all organized here: statquest.org/video-index/

    • @mahmoudmoustafamohammed5896
      @mahmoudmoustafamohammed5896 3 years ago

      @@statquest That's super cool..Thank you soo much :))))

    • @statquest
      @statquest  3 years ago

      @@mahmoudmoustafamohammed5896 Also, check out: app.learney.me/maps/StatQuest

    • @mahmoudmoustafamohammed5896
      @mahmoudmoustafamohammed5896 3 years ago +1

      @@statquest oh cool...so fancy ...Thank you for all your awesome efforts :)))))))

  • @StoicRichie
    @StoicRichie 3 years ago +1

    Thank u so much for helping me in my journey of data.

  • @harshchovatiya-gh7hn
    @harshchovatiya-gh7hn 6 years ago +1

    omg... your songs are awesome. I just visited your website and am listening to the full songs. It's awesome. So many Bamsss

    • @statquest
      @statquest  6 years ago

      Thanks so much!!! I'm really glad to hear you like the songs. :)

  • @hajer3335
    @hajer3335 6 years ago +1

    Best explanation ever. You have the ability to make us so interested in stats. This channel is just my favorite. THANK YOU.
    I want to know if you have a book on this topic or an article?!

    • @statquest
      @statquest  6 years ago +1

      Hooray! I'm so glad to hear you like the video!!! :) However, all I have are the videos - no book or articles.

  • @sheilaquan9824
    @sheilaquan9824 4 years ago +5

    One of my fav sources of education now! ( this song .......)

  • @ethiopiantech
    @ethiopiantech 1 year ago +1

    3:08 I can't stop laughing at your hilarious voice tone when you say "shameless self-promotion"! It's cracking me up!

  • @sachu720
    @sachu720 4 years ago +3

    Hey Josh, awesome stuff. Landed on your channel after 3b1b ....Quadruple BAM !!!

    • @statquest
      @statquest  4 years ago +1

      Hooray!!! I'm glad you like my stuff! :)

  • @shnokoiek3528
    @shnokoiek3528 4 years ago +1

    You are absolutely an amazing stat teacher :)

    • @statquest
      @statquest  4 years ago

      Thank you very much! :)

  • @PunmasterSTP
    @PunmasterSTP 7 months ago +1

    I can safely say that StatQuest is super-logit!

  • @Opkopen
    @Opkopen 1 year ago +1

    ❤ this channel. Completed my understanding...

  • @drachenschlachter6946
    @drachenschlachter6946 1 year ago +2

    Triple Bam!! Good video!!!! Greets from Germany 🇩🇪😍

  • @henrikbuhl9403
    @henrikbuhl9403 4 years ago

    Good video and good content as usual. However, I feel I am still one step away from truly understanding what the "Coefficients" actually mean; that is to say, none of us think of chance in terms of log(odds). I assume, and googled, and I get the impression that using e^x with log odds will give me probability. And in your explanation, I guess it means for the continuous variable that by increasing X by 1 the log(odds) increases by said coefficient.
    I appreciate your work and good teaching skills; also your singing is, surprisingly, good.
    Update: YouTube suggested a video you made on log odds which has taken me further in understanding this log odds concept. The final stepping stone I needed to understand it was that both odds and probability can be defined using simply successful outcomes and unsuccessful outcomes.

    • @statquest
      @statquest  4 years ago

      I'm glad you figured it out! :)

  • @pranaymehta7958
    @pranaymehta7958 5 years ago +2

    Hey Josh, thanks for the simple and clear explanation about Logistic Regression. I have a question about this technique - where and how do we include the hyper parameter terms for the Logistic model ? Also, what are the implications of various hyper parameters when we add it to the loss terms? I think it would be really nice if you can explain the Lasso and Ridge techniques with this intuition of Logistic Regression. Thanks :)

  • @rrrprogram8667
    @rrrprogram8667 6 years ago +3

    I will watch this many times till it sinks in my head

  • @dolee7257
    @dolee7257 4 years ago +1

    I think your channel is awesome! Thanks for doing this!

  • @eajaykumar
    @eajaykumar 4 years ago +1

    OMG Josh, after watching your videos one by one, that is Logs, Odds and Log(Odds), Logistic Regression is just the tip of the iceberg. Double BAMMM! ( StatQuest - youtube infinity = StatQuest Infinity)

  • @CapsCtrl
    @CapsCtrl 1 year ago

    these intros are something else

  • @haroldfelipezuluagagrisale3875
    @haroldfelipezuluagagrisale3875 4 years ago

    Thanks for this rich content, the best educational videos about machine learning, you're the best!!!

  • @yzadil
    @yzadil 3 years ago +1

    Statistics with fun! I like it a lot. Thanks.

  • @joxa6119
    @joxa6119 2 years ago +1

    I think the problem with my course is that it doesn't use terms such as Wald's test or Fisher's Exact Test; everything is shown only in mathematical terms and symbols. At first I felt like I had never learned this before, but when I revised it, I realized I actually had learned it. I don't know why my uni syllabus doesn't use these common statistics terms in the course. They would be useful, especially once the learning moves on to machine learning.

  • @kavuruvamsikrishna02
    @kavuruvamsikrishna02 4 years ago +1

    excellent Josh

  • @ashishmehra5143
    @ashishmehra5143 11 months ago

    Folks who do not understand the standard error:
    It identifies the variation in the sample. It tells how much the sample mean will vary from the population mean.
    Standard error is used to calculate the Confidence Interval.
    Standard error is inversely proportional to sample size, that is s.e. ∝ 1/n
    The formula is: s/√n
    where:
    s = Sample standard deviation
    n = Sample size
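    The s/√n formula above can be checked with a small Python sketch (the sample values are made up for illustration); note this is the standard error of the mean, not of the regression coefficients discussed in the replies:

    ```python
    import math
    from statistics import stdev

    def standard_error(sample):
        """Standard error of the mean: s / sqrt(n)."""
        return stdev(sample) / math.sqrt(len(sample))

    # Hypothetical sample of 5 measurements:
    sample = [2.1, 2.5, 1.9, 2.3, 2.2]
    print(standard_error(sample))  # 0.1
    ```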

    • @statquest
      @statquest  11 months ago

      Noted

    • @ashishmehra5143
      @ashishmehra5143 11 months ago

      @@statquest Could you please also add a note how this standard error is calculated for the intercept and other variables in the given example?

    • @statquest
      @statquest  11 months ago

      @@ashishmehra5143 If you want to know more about how the standard error is calculated, see: ua-cam.com/video/8nm0G-1uJzA/v-deo.html

  • @AnahideCastro
    @AnahideCastro 4 years ago +2

    It's just amazing! Thank you very much! It's funny and accurate. Your classes are inspiring.

  • @karthik-ex4dm
    @karthik-ex4dm 6 years ago +2

    Why do I hit "like" in every statquest?? I have liked all the videos i've seen in this channel....
    I really don't know why.... Great work Josh

    • @statquest
      @statquest  6 years ago +1

      Thank you so much!!! :)

  • @emsif
    @emsif 7 months ago

    thank you a lot for this great explanation. I have two questions: At min 6 you put 0.5 into the logit formula and get 0 as the new center of the x-axis. So far so good. After that you choose 0.731 and get 1 as the new value from the logit formula. My first question is, where do the value 0.731 and the other values come from? And my second question, how do you get the result of 1 when you put 0.731 into the logit formula? Did I miss something?

    • @statquest
      @statquest  7 months ago

      1) We can translate 'p' into the log odds with the function log(p / (1-p)) and we can invert that equation to translate the log odds into 'p' with p = exp(log(odds)) / (1 + exp(log(odds))). So I plugged log(odds) = 1 into that second equation to get the value for 'p', which was 0.731. To learn more about how these equations are related, see: ua-cam.com/video/BfKanl1aSG0/v-deo.html
      2) log(odds) = log(p / (1 - p)) = log(0.731 / (1 - 0.731)) = 1. NOTE: We're using the log base 'e' because in statistics, machine learning and most programming languages, the default base for the log() function is 'e'. In other words, when I write, "log()", I mean "natural log()", or "ln()". Thus, the log to the base 'e' of 2.717 = 1.
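      The two equations in the reply above can be verified numerically; a minimal Python sketch:

      ```python
      import math

      def log_odds(p):
          """Translate probability 'p' into the log(odds)."""
          return math.log(p / (1 - p))

      def inverse_log_odds(x):
          """Translate log(odds) back into probability 'p'."""
          return math.exp(x) / (1 + math.exp(x))

      # log(odds) = 1 corresponds to p ~ 0.731, as in the video:
      p = inverse_log_odds(1)
      print(round(p, 3))            # 0.731
      # ...and plugging that p back in recovers log(odds) = 1:
      print(round(log_odds(p), 3))  # 1.0
      ```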

  • @sallywang9894
@sallywang9894 4 years ago +1

Thank you so much for making this series of videos! Helps a loooot

  • @tanvirrajput3906
@tanvirrajput3906 5 years ago +1

    Brilliantly articulated

  • @Tyokok
@Tyokok 5 years ago +3

Josh, I need to bother you again. Two questions: 1) at 6:15 you map probability [0,1] to log(odds), but how do you get the probability for each observation in the first place? 2) once you map probability to log(odds) (6:15 and 15:32), so the y-axis is log(odds), how do you interpret it? At 15:32 you put "size" on the y-axis; is that how you interpret log(odds)? Isn't it the probability of being obese here? Thanks a lot in advance!

    • @statquest
@statquest  5 years ago

Let's start by just making sure the definitions of the two axes are clear: At 6:15, I'm showing how different probabilities map to the log(odds) axis. So p=0.5 translates to log(odds) = 0, and p=0.731 translates to log(odds) = 1. Thus, each point on the probability axis translates to something on the log(odds) axis.
OK, now that we have that part clear, let's talk about the probability that each mouse is obese. At 1:37 in the video I say that the blue dots represent obese mice and the red dots represent mice that are not obese. So the probability that a blue dot mouse is obese = 1, since the blue dots represent obese mice. The probability that a red dot mouse is obese = 0, since the red dots represent mice that are not obese. Does that make sense?
As for the log(odds) axis, this represents the change in the log(odds) of obesity for every unit change in weight (or in genotype). So, in the weight vs obesity example, if you have a mouse that is one unit heavier than another mouse, then the log(odds) increases by 1.83. Does that make sense?
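To make the "one unit heavier" interpretation concrete, here is a small Python sketch. The slope of 1.83 is the one quoted above for the weight vs obesity example, but the intercept is a made-up number purely so we can compute concrete probabilities:

```python
import math

intercept = -3.48  # hypothetical value, invented for illustration
slope = 1.83       # each extra unit of weight adds 1.83 to the log(odds)

def prob_obese(weight):
    """Turn the fitted log(odds) for a given weight into a probability."""
    log_odds = intercept + slope * weight
    return math.exp(log_odds) / (1 + math.exp(log_odds))

# The log(odds) rises by exactly the slope per unit of weight...
lo_light = intercept + slope * 2
lo_heavy = intercept + slope * 3
print(round(lo_heavy - lo_light, 2))  # 1.83
# ...which multiplies the odds of obesity by exp(1.83)
print(round(math.exp(slope), 2))      # 6.23
# On the probability scale the change is not constant; it depends on where you start:
print(round(prob_obese(2), 3), round(prob_obese(3), 3))
```

Note that while the log(odds) changes by a constant amount per unit of weight, the probability does not, which is why the coefficient is reported on the log(odds) scale.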

    • @Tyokok
@Tyokok 5 years ago +2

@@statquest Great, thanks for confirming the log(odds) axis. But I am still unclear about the first question: at 1:37 and 6:15, your red dots have probability 0 (of being obese), and blue dots have probability 1. But at 6:15, how do you get fractional probabilities (those p=0.73, p=0.88)? Are they results from your logistic regression curve?

    • @statquest
@statquest  5 years ago +1

      @@Tyokok Oh, I think I see the confusion. p=0.73 and p=0.88 are not from the data or the curve. They are just example probabilities, values between 0 and 1, that I use to show how a point on the y-axis (probability) for the logistic regression curve relates to a point on the log(odds) axis. In other words, I just wanted to show how the formula log(p/(1-p)) = log(odds) worked, so I picked some numbers and plugged them in. Even though I could have picked any number between 0 and 1 for the demonstration, I picked p=0.73 and p=0.88 because I knew they would translate to nice, round numbers on the log(odds) axis. Does that make sense?

    • @Tyokok
@Tyokok 5 years ago +2

@@statquest Yes, that's what I learnt from your video. What I don't understand is that when you map the observations to log(odds) (since the observations are binary, with either p=0 or p=1), your log(odds) will end up with only positive and negative infinity values in log(odds) space. Then it's not a sloped line. Or am I missing something?

    • @statquest
@statquest  5 years ago

@@Tyokok One use of the transformation from probability to log(odds) is that once you fit a curve to the data, someone can tell you that they have a mouse that weighs "x"... You can then plot it on the graph (along the x-axis) and use the curve to see what the probability is that that mouse is obese. You can then use the log(p/(1-p)) transformation to say what the log(odds) are that that mouse is obese. Does that make sense?
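A sketch of that whole workflow in pure Python, using made-up toy data and a hand-rolled gradient-ascent fit as a crude stand-in for what a real fitting routine (e.g. R's glm()) would do:

```python
import math

# Toy data: weights of six mice and whether each is obese (1) or not (0).
weights = [1.0, 1.5, 2.0, 2.5, 3.0, 3.5]
obese   = [0,   0,   1,   0,   1,   1]

def sigmoid(z):
    """Map a log(odds) to a probability."""
    return 1 / (1 + math.exp(-z))

# Fit the intercept b0 and slope b1 by maximizing the log-likelihood
# with plain gradient ascent.
b0, b1, lr = 0.0, 0.0, 0.1
for _ in range(5000):
    g0 = sum(y - sigmoid(b0 + b1 * x) for x, y in zip(weights, obese))
    g1 = sum((y - sigmoid(b0 + b1 * x)) * x for x, y in zip(weights, obese))
    b0 += lr * g0
    b1 += lr * g1

# Each observed mouse has p = 0 or p = 1 (log(odds) of minus or plus infinity),
# but the fitted line still gives a finite log(odds), and hence a probability,
# for any weight someone hands us:
new_weight = 2.4
log_odds = b0 + b1 * new_weight
p = sigmoid(log_odds)
print(round(log_odds, 2), round(p, 2))
```

Applying log(p/(1-p)) to that predicted p just recovers the fitted log(odds), which is the point of the transformation: the observations sit at the infinities, but the fitted line lives at finite values in between.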

  • @deepakmehta1813
@deepakmehta1813 3 years ago

Thank you Josh, once again a great video. In the example there are 2 coefficients: the intercept is not statistically significant and geneMutant is statistically significant. My question is how to check whether the intercept and geneMutant together are statistically significant for size or not.

    • @statquest
@statquest  3 years ago

      I don't think there is any sense in asking if both the intercept and geneMutant together are statistically significant. The fact that the intercept is not statistically significant simply means that the intercept could be 0. 0 is still a very valid value for the intercept, so it is still in the model, even if it is 0.

    • @deepakmehta1813
@deepakmehta1813 3 years ago +1

      @@statquest Thank you Josh

  • @jongcheulkim7284
@jongcheulkim7284 4 years ago +1

    Thank you. This is very helpful.

  • @biancamanago3214
@biancamanago3214 5 years ago +1

    This is an excellent resource. Thank you!

  • @anarkazimov4206
@anarkazimov4206 3 years ago +1

    Much love from Azerbaijan🇦🇿

  • @lucilec.8494
@lucilec.8494 4 years ago

Thank you so much for your videos; they have helped me loads to finally understand what tests to use and how to interpret them!
I have a couple of quick questions, though, for the analysis I am trying to do at the moment: the variable I am using as a predictor is a continuous numeric variable, but it ranges from negative to positive values. 1) Do I need to scale and centre the variable prior to analysis? 2) If my lowest value is x = -20, when reading the coefficient estimate for the intercept, does it indicate the odds of belonging to group A when x = -20 or when x = 0? Many thanks, and well done again for all your tutorials!

    • @statquest
@statquest  4 years ago +1

      1) No
      2) The intercept is the y-axis intercept, so that is when x = 0.
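A tiny Python check of both answers (the coefficient values are invented for illustration): the intercept describes x = 0, and centering the predictor only changes what the intercept means; the slope and every prediction stay the same.

```python
# Hypothetical coefficients for a predictor that ranges from negative to
# positive values (both numbers are made up for this example).
b0, b1 = -1.2, 0.4

def log_odds(x, intercept, slope):
    return intercept + slope * x

# 1) The reported intercept is the log(odds) at x = 0, not at the smallest
#    observed value (e.g. x = -20):
print(log_odds(0, b0, b1))  # -1.2

# 2) Centering x (subtracting its mean) shifts the intercept so that it
#    describes the log(odds) at the average x; every prediction is unchanged:
xs = [-20, -5, 0, 10, 25]
mean_x = sum(xs) / len(xs)
b0_centered = b0 + b1 * mean_x
for x in xs:
    assert abs(log_odds(x, b0, b1) - log_odds(x - mean_x, b0_centered, b1)) < 1e-9
```

This is why centering is optional for logistic regression: it can make the intercept easier to interpret, but it does not change the fitted probabilities.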

  • @scoppyeah
@scoppyeah 5 years ago +1

    Best explanation 👍

  • @rrrprogram8667
@rrrprogram8667 6 years ago +3

Hey Josh... I was asked to fill in the feedback form for the DataCamp site... I mentioned that machine learning has to be taught the way StatQuest teaches it...

    • @statquest
@statquest  6 years ago

      You're the best!!! Thank you!!!! :)

    • @rrrprogram8667
@rrrprogram8667 6 years ago +1

      StatQuest with Josh Starmer Thanks for all your great videos... Hope to see more videos... MEGAA BAMMM

    • @statquest
@statquest  6 years ago +1

      There should be another one coming out today and a week from today. My goal is 3 a month. :)

  • @ratnakarbachu2954
@ratnakarbachu2954 3 years ago +1

Love you brother, god bless you.
You rock.