Categorical Cross-Entropy Loss Softmax

  • Published 5 Feb 2020
  • This is a video that covers Categorical Cross-Entropy Loss Softmax
    Attribution-NonCommercial-ShareAlike CC BY-NC-SA
    Authors: Matthew Yedlin, Mohammad Jafari
    Department of Computer and Electrical Engineering, University of British Columbia.
  • Science & Technology

COMMENTS • 24

  • @rickragv 3 years ago +14

    this is hidden gem! Thank you!

  • @abiolaadeye2961 3 years ago +3

    Excellent Video Series. I love the question and answer format ! Thanks!

  • @veganath 1 year ago +1

    Wow! Hats off to you guys, perfect in demystifying Categorical Cross-Entropy... thank you!

  •  3 years ago +3

    thank you guys! such a great explanation!

  • @bengonoobiang6633 2 years ago

    Good video. The course format makes it look so easy to understand.

  • @keiran110 3 years ago

    Great video, thank you.

  • @cedricmanouan2333 3 years ago +1

    Great!

  • @himanshuvajaria6426 2 years ago

    Thanks guys!

  • @keshavkumar7769 3 years ago +1

    Wonderful!

  • @g.jignacio 4 years ago +1

    Very good explanation! It's been so hard to find a numerical example. Thank you guys!

  • @hassanmahmood7284 2 years ago

    Awesome

  • @MultiTsunamiX 4 years ago +6

    At 6:44 there is a mistake in the equation: .715 should be in the last log parenthesis instead of .357.
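
    A minimal Python sketch of the categorical cross-entropy computation being discussed, assuming a one-hot target (chosen here only for illustration) and the base-2 logarithm used in the video. The probability vector is the first softmax output quoted further down in the comments, not the corrected value at 6:44.

      import numpy as np

      def categorical_cross_entropy(y_true, y_hat):
          # L = -sum_i y_i * log2(y_hat_i); only the term at the "hot" index survives.
          y_true = np.asarray(y_true, dtype=float)
          y_hat = np.asarray(y_hat, dtype=float)
          return -np.sum(y_true * np.log2(y_hat))

      y_true = [0.0, 1.0, 0.0, 0.0]               # hypothetical one-hot target
      y_hat  = [0.147, 0.540, 0.133, 0.180]       # softmax output quoted in a comment below
      print(categorical_cross_entropy(y_true, y_hat))   # -log2(0.540) ≈ 0.889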

  • @igomaur4175 2 years ago

    wowww

  • @rizalalfarizi9196 4 years ago +1

    Thank you very much for the clear explanation, love it sir.

  • @wesleymelencion3618 3 years ago +1

    Why were you using logarithm base 2?
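
    A short sketch related to the question above: changing the logarithm base only rescales the loss by a constant (log2 x = ln x / ln 2), so the minimizer and the direction of the gradients are unchanged. The vectors below are illustrative assumptions, not the video's exact setup.

      import numpy as np

      y_hat  = np.array([0.147, 0.540, 0.133, 0.180])   # example predicted distribution
      y_true = np.array([0.0, 1.0, 0.0, 0.0])           # hypothetical one-hot target

      loss_base2 = -np.sum(y_true * np.log2(y_hat))     # base-2 log, as in the video
      loss_nat   = -np.sum(y_true * np.log(y_hat))      # natural log, common in ML libraries

      # The two losses differ only by the constant factor ln(2).
      print(loss_base2, loss_nat, loss_nat / np.log(2))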

  • @RitikDua 4 years ago

    Very good explanation.

  • @raymondchang9481 9 months ago

    How much is an intercontinental ballistic missile?

  • @shivamgupta187 3 years ago +2

    If I am not wrong, you have used the softmax function to normalize, i.e. to make the probabilities sum to 1,
    but in your examples it is
    .147+.540+.133+.180 = 1
    .160+.323+.357+.160 = 1
    .188+.118+.715+.079 = 1.1
    Can you please help me to understand the above discrepancy?

    • @horvathbenedek3596 3 years ago

      You can see that they messed up and wrote .188 instead of .088 when transferring from the softmax output to the y-hat vector. I guess they added up y-hat manually, resulting in the mistake.
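
      A quick check supporting the reply above: a softmax always normalizes its outputs so they sum to 1, and the third vector indeed sums to 1 once .188 is corrected to .088. The logits below are made up purely for illustration.

        import numpy as np

        def softmax(z):
            # Numerically stable softmax: shift by the max, exponentiate, normalize.
            z = np.asarray(z, dtype=float)
            e = np.exp(z - z.max())
            return e / e.sum()

        p = softmax([2.0, 1.0, 0.1, -0.5])        # hypothetical logits
        print(p, p.sum())                         # sums to 1.0 up to rounding

        print(sum([0.088, 0.118, 0.715, 0.079]))  # corrected vector sums to 1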

  • @trajanobertrandlleraromero6579 4 months ago

    I came looking for copper and found gold!!!!

  • @KuldeepSingh-cm3oe 3 years ago

    Brilliant

  • @BrandonSLockey 3 years ago +1

    Batman and Robin

  • @lucyfrye6723 1 year ago

    It's a good video, but good grief, encourage people to take a week-long course in linear algebra. If you keep feeding them summation symbols and indices they will never do it. It's HARDER, not easier, to spell it all out. Professor Strang's course is probably still on YouTube if you are interested. You will gain back that week by being twice as productive in the week after. Not to mention the rest of your life.

    • @mattyedlin7292 1 year ago

      Hello Lucy,
      Thank you for your input! I'm always interested in comments to improve the videos. Would you suggest any additional material to address the summation issue? I learned it in high school as a prelim to proof by induction - a long time ago.