CS480/680 Lecture 12: Gaussian Processes

  • Published 21 Jun 2019

COMMENTS • 12

  • @jagmohanfanshal3599 2 years ago +3

    I have been trying to learn GP for a few days and I believe this is the best explanation I have seen so far. Thank you!

  • @ApiolJoe 2 years ago

    The clarity of explanations is great! I will definitely come back to this channel for other topics. Thanks a lot for sharing :)

  • @srinivasanbalan2469 5 years ago +3

    Good explanation, Dr. Pascal. Really thankful!

  • @youngzproduction7498 2 years ago

    Finally, I can understand it! Thanks a lot.

  • @chunyangxiao8443 3 years ago +4

    I really like the explanations.
    PS: It seems there was a very minor error at the end: a bigger sigma_f^2 means we model the problem with larger variance, and if we plug it into the formula on the previous slide, we "explore" more versus "exploit", which turns out to work well for this problem. The smoothness is more directly controlled by the analytical form of the kernel k(x, x') (e.g., a Gaussian kernel is smoother than a Laplacian one).
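
    The commenter's point can be sketched numerically: sigma_f^2 scales the prior variance uniformly, while the kernel's functional form governs smoothness. A minimal sketch (function names and parameter values are illustrative, not from the lecture):

```python
import numpy as np

def rbf_kernel(x1, x2, sigma_f=1.0, length=1.0):
    """Gaussian (RBF) kernel: sigma_f^2 * exp(-|x1 - x2|^2 / (2 * length^2))."""
    d = x1[:, None] - x2[None, :]
    return sigma_f**2 * np.exp(-0.5 * (d / length) ** 2)

def laplacian_kernel(x1, x2, sigma_f=1.0, length=1.0):
    """Laplacian (exponential) kernel: sigma_f^2 * exp(-|x1 - x2| / length)."""
    d = np.abs(x1[:, None] - x2[None, :])
    return sigma_f**2 * np.exp(-d / length)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 5.0, 50)
jitter = 1e-8 * np.eye(len(x))  # numerical stabilizer for the Cholesky step

# Draw one prior sample from each GP; the RBF sample varies more slowly
# between neighbouring points than the Laplacian one (rougher paths).
f_rbf = rng.multivariate_normal(np.zeros(len(x)), rbf_kernel(x, x) + jitter)
f_lap = rng.multivariate_normal(np.zeros(len(x)), laplacian_kernel(x, x) + jitter)

# Doubling sigma_f multiplies every prior covariance entry by 4:
K1 = rbf_kernel(x, x, sigma_f=1.0)
K2 = rbf_kernel(x, x, sigma_f=2.0)
print(np.allclose(K2, 4.0 * K1))  # -> True: variance scales with sigma_f^2
```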

  • @nickbishop7315 1 year ago

    The algorithm suggested at the end of the talk (slide 22) seems very similar to GP-UCB!
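
    For context, a minimal sketch of the GP-UCB selection rule the commenter is referring to: pick the candidate maximizing posterior mean plus sqrt(beta) times posterior standard deviation. The GP posterior is computed in closed form with an RBF kernel; all names, data, and parameter values below are illustrative, not taken from the lecture slides.

```python
import numpy as np

def rbf(a, b, length=1.0):
    """RBF kernel matrix between 1-D input arrays a and b."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)

def gp_posterior(x_train, y_train, x_query, noise=1e-4):
    """Posterior mean and pointwise std of a zero-mean GP at x_query."""
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf(x_train, x_query)
    K_ss = rbf(x_query, x_query)
    mu = K_s.T @ np.linalg.solve(K, y_train)
    cov = K_ss - K_s.T @ np.linalg.solve(K, K_s)
    return mu, np.sqrt(np.clip(np.diag(cov), 0.0, None))

# Toy observations and a candidate grid.
x_train = np.array([0.5, 2.0, 3.5])
y_train = np.sin(x_train)
x_query = np.linspace(0.0, 4.0, 81)

mu, sigma = gp_posterior(x_train, y_train, x_query)
beta = 2.0  # exploration weight; larger beta favours uncertain regions
ucb = mu + np.sqrt(beta) * sigma
x_next = x_query[np.argmax(ucb)]  # next point to evaluate
```

    Because sigma is near zero at already-observed points, the rule automatically trades off exploiting high-mean regions against exploring high-uncertainty ones.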

  • @EngRiadAlmadani 4 years ago +1

    Nice

  • @SebastianBoecker 10 months ago

    It is very sad that the crazy camera movements make it hard to listen to the lecture -- otherwise, fantastic work, thumbs up! :)

  • @piotr780 1 year ago

    So x consists of ALL observations in our dataset?