Cross-Entropy Loss Log-likelihood Perspective

  • Published 3 Feb 2020
  • This is a video that covers Cross-Entropy Loss Log-likelihood Perspective
    Attribution-NonCommercial-ShareAlike CC BY-NC-SA
    Authors: Matthew Yedlin, Mohammad Jafari
    Department of Computer and Electrical Engineering, University of British Columbia.
  • Science & Technology

COMMENTS • 14

  • @vaibhavarora9408 4 years ago +4

    Great lectures. Thank you Matt Yedlin.

  • @victorialeigh2726 3 years ago

    I really like that you guys hand-clean the clear board. Feels like being back in the classroom + the math teacher's small math talk!

  • @yanbowang4020 4 years ago +2

    Love your video, hope to see more of it.

  • @yjy8 3 years ago

    You just cleared the doubt about likelihood and log-likelihood which I had for the past year and a half.
    Thank you so much

  • @bingbingsun6304 3 years ago +1

    Well explained using human language.

  • @TylerMatthewHarris 3 years ago

    thanks!

  • @AChadi-ug9pg 3 years ago +1

    Muhammad is a good student

  • @davidlearnforus 2 years ago

    Hi, thank you so much! I am a self-learner without much formal background. Can you please explain how SUM p_i log q_i is entropy, since it does not have a minus sign? If it were log(1/q_i) we would get a minus sign out of it, but it's not. I'm stuck there...

  • @peizhiyan2916 4 years ago

    Nice

  • @bingbingsun6304 3 years ago

    I want to know how you can write on the mirror and record it.

    • @mattyedlin7292 3 years ago +1

      The camera sees the writing flipped through a mirror.

  • @garrettosborne4364 2 years ago

    Can the old guy.

  • @jimbobur 1 year ago +1

    *(EDIT: Solved it, see comment reply)*
    I don't follow how you go from the numerical example, where the likelihood is a product of predicted and observed probabilities p_i and q_i each raised to the number of times they occur, to the algebraic expression of the likelihood where you take the product of q_i raised to N * p_i (or is that N_p_i? I'm a little unsure whether the p_i is a subscript of the N or multiplied by it).

    • @jimbobur 1 year ago +2

      I worked it out. The key is to remember that the number of times outcome i (with probability p_i) occurs can be written by rearranging the definition p_i = N_p_i / N into N_p_i = N * p_i, and substituting this into the general form of the likelihood that follows from the numerical example:
      L = Π q_i ^ (N_p_i) ,
      giving
      L = Π q_i ^ (N * p_i)
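    The identity in the reply above can be checked numerically. The sketch below (with hypothetical distributions p and q, not taken from the video) computes the likelihood L = Π q_i ^ (N * p_i) directly, and confirms that (1/N) log L = Σ p_i log q_i, i.e. the negative of the cross-entropy, so maximizing the log-likelihood is the same as minimizing the cross-entropy:

    ```python
    import numpy as np

    # Hypothetical empirical distribution p and model distribution q over 3 outcomes.
    p = np.array([0.5, 0.3, 0.2])   # observed frequencies, p_i = N_p_i / N
    q = np.array([0.4, 0.4, 0.2])   # predicted probabilities q_i
    N = 10                          # total observations, chosen so each N * p_i is an integer

    # Likelihood of the data under q: L = prod_i q_i ** (N * p_i)
    L = np.prod(q ** (N * p))

    # Average log-likelihood: (1/N) * log L = sum_i p_i * log q_i
    avg_log_likelihood = np.log(L) / N

    # Cross-entropy H(p, q) = -sum_i p_i * log q_i is exactly its negative.
    cross_entropy = -np.sum(p * np.log(q))

    print(np.isclose(avg_log_likelihood, -cross_entropy))  # True
    ```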