03 Numerical Integration

  • Published 14 Jan 2025

COMMENTS • 9

  • @gageshmadaan6819 · 3 years ago

    Finally a great tutorial on Bayesian Quadrature. Thank you..!!

  • @nullpter · 4 years ago +1

    Thank you so much for the tutorial!

  • @matej6418 · 1 year ago

    elite content

  • @katateo328 · 2 years ago

    Shifting or rotating the nodes a little would create other weak estimators.

  • @katateo328 · 2 years ago

    yeah, Gaussian quadrature is exactly the boosting concept.

  • @danel5726 · 3 years ago

    Thank you so much for this tutorial! It's the only resource I have found so far with an intuitive example and explanation of BQ.
    I have a question regarding the variance of Z in BQ.
    There are three x's being considered when estimating the variance: X is the training data, and then there are x and x' as well.
    If k(x,x') is a kernel function for any two given inputs, how can one picture k(x,X) and k(X,x')?
    Thank you again!

    • @willtsing3000 · 3 years ago

      Hi, I'm also a little confused by the fact that there are two inputs (x, x'), which are written separately here as k(x,X) and k(X,x').
      I was thinking maybe another formulation, written as $V = E_{x}[k(x,x)] - E_{x}[k(x,X)] K^{-1} E_{x}[k(x,X)]^T$, could be a little clearer, where the two inputs x and x' are both represented by the single input x. What do you think?
      And the diagonal part of $E_{x}[k(x,x)]$ should be exactly equal to the original $k(x,x)$; is this understanding correct?
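
      A small sketch may help with the shape question in this thread: k(x,X) is just the array of kernel evaluations between the query points and the training nodes, and k(X,x') is its transpose, so the quadratic form k(x,X) K^{-1} k(X,x') has one entry per (x, x') pair. The squared-exponential kernel and the node locations below are hypothetical examples, not the video's code.

      ```python
      import numpy as np

      def rbf(a, b, ell=1.0):
          # Gram matrix of a squared-exponential kernel between 1-D input
          # arrays a (n,) and b (m,); entry [i, j] equals k(a[i], b[j]).
          return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

      X = np.array([-1.0, 0.0, 1.0])   # training nodes (hypothetical)
      x = np.array([0.3, 0.7])         # query points; x' ranges over the same set

      K = rbf(X, X)      # k(X, X): (3, 3) Gram matrix of the nodes
      k_xX = rbf(x, X)   # k(x, X): (2, 3); row i holds covariances between
                         # query point x[i] and every training node
      k_Xx = rbf(X, x)   # k(X, x'): (3, 2); simply the transpose of k(x, X)

      # The quadratic form from the variance expression: rows index x,
      # columns index x', so it is a (2, 2) matrix over query-point pairs.
      quad = k_xX @ np.linalg.solve(K, k_Xx)
      ```

      So k(x,X) and k(X,x') are not two different objects; writing them separately just keeps track of which argument varies along which axis of the resulting matrix.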

  • @katateo328 · 2 years ago

    Don't you think we can improve the performance of the trapezoidal method by using the boosting concept? The trapezoidal rule is a weak estimator; boosting weak estimators may perform as well as Simpson's method.
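
    There is a classical version of this "combine weak estimators" idea: one step of Richardson extrapolation (the first Romberg step) combines two trapezoidal estimates, at step sizes h and h/2, so that the leading O(h^2) error terms cancel, and the result is exactly the composite Simpson rule. A minimal sketch of this standard fact, with hypothetical function names:

    ```python
    import numpy as np

    def trapezoid(f, a, b, n):
        # Composite trapezoidal rule with n equal subintervals on [a, b].
        x = np.linspace(a, b, n + 1)
        y = f(x)
        h = (b - a) / n
        return h * (y[0] / 2 + y[1:-1].sum() + y[-1] / 2)

    def simpson_via_romberg(f, a, b, n):
        # One Richardson-extrapolation step: combining the trapezoidal
        # estimates at step h and h/2 cancels the O(h^2) error term and
        # reproduces the composite Simpson rule on 2n subintervals.
        T_h = trapezoid(f, a, b, n)
        T_h2 = trapezoid(f, a, b, 2 * n)
        return (4 * T_h2 - T_h) / 3

    # Example: the integral of sin over [0, pi] is exactly 2. The combined
    # estimate is far more accurate than either trapezoidal estimate alone.
    approx = simpson_via_romberg(np.sin, 0.0, np.pi, 8)
    ```

    So in this sense the comment has a point: cleverly weighted combinations of trapezoidal estimates do recover the accuracy of Simpson's method, although the combination is a fixed linear one rather than boosting in the machine-learning sense.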

  • @katateo328 · 2 years ago

    hahaha, generally, lazy is goooooood.