What is the difference between negative log likelihood and cross entropy? (in neural networks)

  • Published 22 Aug 2024
  • Full video list and slides: www.kamperh.co...
    Introduction to neural networks playlist: • Introduction to neural...
    Yet another introduction to backpropagation: www.kamperh.co...

COMMENTS • 6

  • @harshadsaykhedkar1515 • 2 months ago +1

    This is one of the better explanations of how the heck we go from maximum likelihood to using NLL loss to log of softmax. Thanks!

  • @allantourin • 8 months ago +3

    Thanks Herman. I'm following a PyTorch tutorial and got lost when I saw the cross entropy computation equal the NLL one. This definitely filled the gap (see the sketch after the comments).

    • @kamperh • 8 months ago +2

      Very happy this helped!! :D

  • @kundanyalangi2922 • 3 months ago

    Good explanation. Thank you Herman

  • @AngeloKrs878 • 1 year ago +1

    Thanks for the video

  • @martinpareegol5263 • 1 month ago

    Why is it preferred to frame the problem as minimizing the cross entropy rather than minimizing the NLL? Are there more useful properties in doing that?
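
On @allantourin's point about the PyTorch cross entropy computation equalling the NLL one: below is a minimal sketch (not from the video; the tensor names are just placeholders) checking that torch.nn.functional.cross_entropy applied to raw logits gives the same value as nll_loss applied to log_softmax of those logits, since cross_entropy performs the log-softmax step internally.

# Minimal sketch, assuming PyTorch is installed; logits/targets are made-up demo data.
import torch
import torch.nn.functional as F

torch.manual_seed(0)
logits = torch.randn(4, 3)            # 4 examples, 3 classes (unnormalised scores)
targets = torch.tensor([0, 2, 1, 2])  # true class index for each example

# cross_entropy takes raw logits and applies log-softmax internally.
ce = F.cross_entropy(logits, targets)

# nll_loss expects log-probabilities, so we apply log-softmax ourselves.
nll = F.nll_loss(F.log_softmax(logits, dim=1), targets)

print(ce.item(), nll.item())  # the two values agree up to floating-point error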