Logistic regression 5.2: Multiclass - Softmax regression

  • Published 22 Aug 2024
  • Full video list and slides: www.kamperh.co...
    Errata:
    1:50 - Each of the individual output probabilities depends on all the weights W, not just the weights for that element. So the first output probability doesn't depend only on w_1 but on the whole of W, and similarly for all the other elements (see the sketch below).
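
A minimal numerical sketch of this erratum (my own illustration, not from the video; the `softmax_probs` helper and the sizes are made up): every softmax output probability depends on the full weight matrix W through the normalising denominator, so perturbing only w_2 still changes the first output probability.

```python
import numpy as np

def softmax_probs(W, x):
    """Class probabilities for input x under weight matrix W (K x D)."""
    scores = W @ x                            # one score per class
    exps = np.exp(scores - scores.max())      # shift for numerical stability
    return exps / exps.sum()

rng = np.random.default_rng(0)
W = rng.normal(size=(3, 4))                   # 3 classes, 4 features (made up)
x = rng.normal(size=4)

p_before = softmax_probs(W, x)
W[1] += 1.0                                   # change only w_2, the weights of class 2
p_after = softmax_probs(W, x)

# The first output probability changes even though w_1 is untouched.
print(p_before[0], p_after[0])
```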

COMMENTS • 24

  • @blessontomjoseph 2 years ago +4

    the whole logistic regression playlist is so good. great job!

    • @kamperh 2 years ago

      Very happy it helps!

  • @gautamkulkarni7049 1 year ago

    Thank you for explaining the whole concept in a much simpler way.

  • @DimiqBaba 9 months ago

    Your videos are quite easy to comprehend. Good job.

  • @Nanona6 3 years ago +3

    Thank you very much, it helped me a lot for my university classes. Very clear explanation. :)

    • @kamperh 3 years ago +1

      Very happy it helped!! :D

  • @catdonut6696 2 years ago +1

    Very clear and concrete explanation! Thank you so much.

    • @kamperh 2 years ago

      Very big pleasure! :D

  • @norebar5848 1 year ago +1

    What an awesome video, thank you! I made sure to watch lots of ads for you, even though it probably doesn't make a difference :D

    • @kamperh 1 year ago

      Thanks for the encouragement!! :)

  • @user-bn7ms7it2c 4 months ago

    love it! so well explained, and I love the mathematical proof

  • @mohamadalipour2563 11 months ago

    thank you for the explanation, better than the professor at university! ...

  • @ptnpdrs2001 5 months ago

    Fantastic explanation, thanks a lot Herman!

  • @lima073 2 years ago +2

    This is a remarkable explanation, simple and complete, thank you very much!

  • @arvindkanesan 2 years ago +1

    Very well explained!

  • @somnathchatterjee9601 2 years ago +1

    Great work man!!

  • @amgadshrief7215 7 months ago

    Very good explanation :)

  • @Charles_Reid 4 months ago +1

    Thanks, this is a very helpful video. One question: in the video you mentioned that since probabilities are between 0 and 1 and sum to 1, you need to raise e to the power of each score and divide by the sum of those exponentiated scores to obtain a probability. Is there a reason you choose e as the base of the exponent? Why not another number? My confusion is that if I chose a base like 10, I'm pretty sure my softmax model would classify everything the same as if I had chosen e, but the calculated probabilities would be different. I'm wondering if softmax is actually returning the real probability, or just a number between 0 and 1 that behaves like the real probability. Thanks!

    • @kamperh 4 months ago +1

      This is a really good question that I hadn't thought about before. First, base 10 will probably work fine, for all the reasons you say. If you were training a neural network, you could probably use any base and the network would just adjust the logits to do what it must. There are also some practical reasons to use e: forums.fast.ai/t/why-does-softmax-use-e/78118. And finally, tongue-in-cheek: what does it mean when you say "real probability"? :) No one knows the real probability except the Creator; all we're doing is trying to model it ;) (See the numerical sketch after this thread.)

    • @Charles_Reid 4 months ago +1

      @kamperh Yeah, maybe the "real probability" can only be 0 or 1, as the data point either does belong to the class or does not. But we don't know which class it belongs to, so softmax gives us a probability that is different from the so-called "real probability" but that helps us make a guess. Thank you for your help!
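
A quick numerical check of the point discussed above (my own sketch, not from the video; the `softmax` helper and example scores are made up): softmax with base 10 is identical to the usual base-e softmax applied to scores scaled by ln(10), since 10^z = e^(z ln 10). A trained model can absorb that scaling into its weights, so the base doesn't change what can be learned, but for fixed scores the two bases do give different probabilities.

```python
import numpy as np

def softmax(z, base=np.e):
    powers = base ** (z - z.max())        # shift scores for numerical stability
    return powers / powers.sum()

z = np.array([1.0, 2.0, 3.0])             # made-up class scores (logits)

p_e = softmax(z)                          # standard base-e softmax
p_10 = softmax(z, base=10)                # base-10 variant
p_scaled = softmax(z * np.log(10))        # base e on rescaled scores

print(np.allclose(p_10, p_scaled))        # True: changing base == rescaling scores
print(p_e, p_10)                          # different probabilities, same argmax
```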

  • @pawestelmasik9998 9 months ago

    Great job

  • @mrmashup2885 3 years ago

    nicely explained, Sherlock Holmes

    • @kamperh 3 years ago

      I'm a big fan of the Sherlock books and series, so I'll take it as a compliment :P

  • @israalewaa1480 3 years ago

    Prof. Herman, I have an issue with this; can I send it to your email, please?