(ML 11.5) Bias-Variance decomposition

  • Published 30 Jun 2011
  • Explanation and proof of the bias-variance decomposition (a.k.a. bias-variance trade-off) for estimators.
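
For reference while reading the comments, a minimal LaTeX sketch of the identity the video proves; the notation (theta for the true parameter, theta hat for the estimator, mu = E[theta hat]) is assumed from the comments below rather than quoted from the video.

```latex
% Bias-variance decomposition of the mean squared error of an estimator
% \hat{\theta} of a fixed (deterministic) parameter \theta, with \mu = E[\hat{\theta}].
\begin{align*}
E\big[(\hat{\theta} - \theta)^2\big]
  &= E\big[(\hat{\theta} - \mu + \mu - \theta)^2\big] \\
  &= E\big[(\hat{\theta} - \mu)^2\big]
     + 2(\mu - \theta)\,E\big[\hat{\theta} - \mu\big]
     + (\mu - \theta)^2 \\
  &= \underbrace{\mathrm{Var}(\hat{\theta})}_{\text{variance}}
     + \underbrace{(\mu - \theta)^2}_{\text{bias}^2}
\end{align*}
% The cross term vanishes because E[\hat{\theta} - \mu] = 0.
```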

COMMENTS • 14

  • @Tyokok
    @Tyokok 2 years ago +5

    Thanks for the great video! 3 questions:
    1) Could you please explain a bit more what theta is here? Is it the prediction function or just the parameters of the prediction function?
    2) And what exactly does it mean to condition on a given theta?
    3) At 4:30, the expectation in mu = E[theta hat] is taken over what?
    Many thanks!
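
A minimal numerical sketch of the distinction raised in the questions above, assuming a simple hypothetical setting (estimating the mean theta of a Gaussian from n samples; this setting is an illustration, not taken from the video): theta is a fixed number, theta hat is a random variable because it is computed from random data, and mu = E[theta hat] averages over repeated datasets.

```python
import numpy as np

# Hypothetical setting (an assumption for illustration, not from the video):
# estimate the mean theta of a Normal(theta, sigma^2) from n i.i.d. samples.
rng = np.random.default_rng(0)
theta = 2.0       # true parameter: fixed and deterministic
sigma = 1.0
n = 20            # sample size per dataset
trials = 100_000  # number of simulated datasets

# theta_hat is a function of the data, hence a random variable:
# each simulated dataset yields a (generally different) estimate.
data = rng.normal(theta, sigma, size=(trials, n))
theta_hat = data.mean(axis=1)            # the estimator evaluated on each dataset

mu = theta_hat.mean()                    # Monte Carlo estimate of E[theta_hat]
bias_sq = (mu - theta) ** 2              # squared bias
variance = theta_hat.var()               # Var(theta_hat)
mse = ((theta_hat - theta) ** 2).mean()  # E[(theta_hat - theta)^2]

print(f"mu = E[theta_hat] ~ {mu:.4f}")
print(f"bias^2 + variance ~ {bias_sq + variance:.6f}")
print(f"MSE               ~ {mse:.6f}")  # matches bias^2 + variance up to Monte Carlo error
```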

  • @Twilightsfavquill
    @Twilightsfavquill 11 years ago +1

    Very well explained. Thanks!

  • @plttji2615
    @plttji2615 2 years ago

    Thank you for the video. Can you help me prove whether the estimate in this question is unbiased? Question: Compare the average height of employees at Google with the average height in the United States; do you think it is an unbiased estimate? If not, how would you prove that it is not?

  • @yazhouhao7086
    @yazhouhao7086 5 years ago +2

    I think \theta is not a constant; it's just deterministic.

  • @basantmounir
    @basantmounir 3 years ago

    Elegant. ❤

  • @tz7380
    @tz7380 1 year ago

    great video!

  • @ChoponHill
    @ChoponHill 2 years ago

    dang! this is tight.

  • @tumul1474
    @tumul1474 5 years ago +1

    Thanks a bunch sir, you just saved my ass!!

  • @nprasenreddy
    @nprasenreddy 10 years ago +1

    neat

  • @tymothylim6550
    @tymothylim6550 3 years ago

    Thank you very much for this video :) I learnt something from it :)

  • @alexanderyau6347
    @alexanderyau6347 4 years ago

    neat and elegant

  • @youtubecommenter5122
    @youtubecommenter5122 3 years ago +3

    I'm just here to read all the Asians commenting

  • @fzhcary
    @fzhcary 7 years ago

    Nice tutorial. A small error around 8:00: the first term should be the bias and the second term the variance.