11f Machine Learning: Bayesian Regression Example

  • Published 10 Dec 2024

COMMENTS • 27

  • @TejasEkawade · 2 months ago

    This is a very clear explanation of the Bayesian Linear Regression. Thank you!

  • @IndigentMartian · 4 years ago +6

    Nice - very clear description of confidence vs credible intervals, thanks!

  • @MrDgketchum · 2 years ago

    Really liking the videos. Dove in head first to PyMC3 following the suggestion of an advisor. These videos are helping me understand what's going on under the hood, and why not to just use OLS on my messy data. Thanks!

  • @nb9797 · 3 years ago

    what a lovely series of lectures on this. Thank you !

  • @luishernangarciapaucar683 · 3 years ago

    Very grateful, this has been a great explanation and motivation to explore MCMC and more Bayesian approaches!!

  • @garethedwards6781 · 3 years ago +1

    Thanks for the videos! Your explanations are very clear and succinct.

  • @sourabhsharma9830 · 3 years ago +1

    What an explanation SIR 🙏🙏🙏🙏

  • @jiey2271 · 4 years ago +1

    Hi, thanks for the great lectures! I really like the way you explain things. Could you talk a bit about GLMs?

  • @WahranRai · 4 years ago +2

    Could you show the code, or how you get this sampling, etc.?

    • @GeostatsGuyLectures · 4 years ago +1

      Check out the link in the description. You'll need to install the pymc3 Python package for the MCMC sampling.

    • @WahranRai · 3 years ago +2

      @@GeostatsGuyLectures Thank you and happy new year!
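
For anyone who wants a feel for the code before opening the notebook linked in the description, here is a minimal pymc3 (3.x syntax) sketch of this kind of Bayesian linear regression; the data below are made up for illustration and are not the lecture's dataset:

    import numpy as np
    import pymc3 as pm

    # illustrative data only, not the dataset from the lecture
    x = np.array([10., 20., 30., 40., 50., 60.])        # e.g. grain size
    y = np.array([0.08, 0.11, 0.13, 0.15, 0.18, 0.20])  # e.g. porosity

    with pm.Model() as model:
        intercept = pm.Normal("intercept", mu=0.0, sigma=1.0)  # prior on intercept
        slope = pm.Normal("slope", mu=0.0, sigma=1.0)          # prior on slope
        sigma = pm.HalfNormal("sigma", sigma=1.0)               # prior on noise std dev
        mu = intercept + slope * x                               # linear model
        pm.Normal("y_obs", mu=mu, sigma=sigma, observed=y)      # likelihood
        trace = pm.sample(2000, tune=1000, chains=4)             # MCMC sampling

    print(pm.summary(trace))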

  • @cahalmurphy · 3 years ago +1

    Hi, great video.
    At 8:20, just a question: in the 2nd chart, is the black line called "Bayes Posterior Prediction" also the Posterior Predictive Distribution?
    I am just a bit confused about the notation, I think.
    So in our Bayes formula for Bayesian regression we have P(B|Y, X) = P(Y|X, B)P(B)/P(Y|X); and what you got there with the new grain size data point is P(Y_hat | X_new=40, Y, X) = integral p(Y_hat | B, X_new=40) p(B | Y, X) dB.
    Thanks!

    • @binli2174 · 2 years ago

      I think the difference between the two predictive distributions is that the narrower one assumes no measurement variance (y is free of noise; only W has uncertainty), while in the broader one the measurements y also have variance.
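
For reference on the notation in this thread, the posterior predictive distribution at a new input is the likelihood averaged over the posterior of the parameters; in standard notation (the video's symbols may differ slightly):

    p(\hat{y} \mid x_*, \mathcal{D}) = \int p(\hat{y} \mid x_*, \beta, \sigma)\, p(\beta, \sigma \mid \mathcal{D})\, d\beta\, d\sigma

The narrower band plotted for the mean response reflects uncertainty in \beta alone, while the posterior predictive above also folds in the observation noise \sigma, which is consistent with the reading in the reply above.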

  • @afzansa8469 · 3 years ago

    Thanks very much for the explanation. Where can I learn the MCMC algorithm without using the pymc3 package? Thanks
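
On the question above, here is a minimal sketch of a hand-rolled random-walk Metropolis sampler for the same kind of linear-regression posterior, with no pymc3 at all (made-up data and simple priors, purely for illustration):

    import numpy as np

    # illustrative data only
    x = np.array([10., 20., 30., 40., 50., 60.])
    y = np.array([0.08, 0.11, 0.13, 0.15, 0.18, 0.20])

    def log_posterior(theta):
        """Unnormalized log posterior for (intercept, slope, log_sigma)."""
        b0, b1, log_s = theta
        s = np.exp(log_s)
        log_prior = -0.5 * (b0**2 + b1**2 + log_s**2)              # weak normal priors
        resid = y - (b0 + b1 * x)
        log_like = -len(y) * np.log(s) - 0.5 * np.sum(resid**2) / s**2
        return log_prior + log_like

    rng = np.random.default_rng(0)
    theta = np.zeros(3)                              # start at (0, 0, log sigma = 0)
    samples = []
    for _ in range(20000):
        proposal = theta + rng.normal(scale=0.01, size=3)          # symmetric random-walk proposal
        # accept with probability min(1, posterior ratio)
        if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(theta):
            theta = proposal
        samples.append(theta)
    samples = np.array(samples[5000:])               # discard burn-in
    print(samples.mean(axis=0))                      # posterior means of (intercept, slope, log sigma)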

  • @basicmachines · 2 years ago

    These videos are really helping, thanks. I have a question about the slide starting at 6 minutes, where you show the PDFs of the uncertainty of each parameter. How are these computed? They don't seem to be normal-distribution approximations but the true PDFs. Thanks.
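
On the question above: in pymc3/arviz those curves are typically kernel density estimates built from the MCMC samples of each parameter, not analytic normal approximations. A minimal sketch of the same idea (the stand-in samples below are synthetic; in practice you would use the MCMC draws, e.g. trace["slope"] from pm.sample):

    import numpy as np
    from scipy.stats import gaussian_kde
    import matplotlib.pyplot as plt

    # stand-in for posterior samples; in practice use the MCMC draws, e.g. trace["slope"]
    draws = np.random.default_rng(0).normal(0.002, 0.0005, size=4000)

    kde = gaussian_kde(draws)                            # smooth density estimate of the draws
    grid = np.linspace(draws.min(), draws.max(), 200)
    plt.plot(grid, kde(grid))                            # approximate posterior PDF
    plt.xlabel("slope"); plt.ylabel("density")
    plt.show()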

  • @barbapapaplouf5419 · 5 years ago +1

    Very nice and clear, thanks! It would have been great to have a link to the script you used for this example. It is very hard to find on your GitHub (I didn't find it).

    • @GeostatsGuyLectures · 5 years ago +2

      github.com/GeostatsGuy/PythonNumericalDemos/blob/master/SubsurfaceDataAnalytics_BayesianRegression.ipynb
      Good point, I should start including the links to the code in the video comment!

    • @barbapapaplouf5419 · 5 years ago

      @@GeostatsGuyLectures thanks!

    • @GeostatsGuyLectures · 5 years ago +1

      @@barbapapaplouf5419, great idea, I have added links to all the videos in this playlist.

  • @luisandrade5126 · 3 years ago

    Very illustrative lecture, sir. I explored and executed your Python codes. However, not all of them ran correctly, apparently because of incompatibility with pymc3 3.11.1, for which I had to make some changes:
    import arviz as az
    Substitute the following commands: {pm.stats.summary, pm.traceplot, pm.plot_posterior, pm.forestplot} with {az.summary, az.plot_trace, az.plot_posterior, az.plot_forest}, respectively.
    I haven't figured out why the model sampled 2 chains instead of 4, because at that point I hadn't changed any code from the published version. I don't know how to set it manually, if that is possible.
    I couldn't update the unresponsive pm.quantile command, so I dropped it and then had to call the substitute arviz commands az.plot_posterior and az.plot_forest without credible_interval settings.
    Any help is appreciated

    • @GeostatsGuyLectures · 3 years ago

      Howdy Luis, Thank you for this. I've been super busy these days. I just ran the workflow and it completed without issue. I'm using pymc 3.8. I hope this helps, Michael
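
For readers hitting the same version issues as in the thread above, a sketch of the substitutions described, assuming pymc3 3.11 with arviz installed and a trace returned by pm.sample (as in the earlier sketch):

    import arviz as az

    # trace is assumed to come from pm.sample on your model
    az.summary(trace)                          # replaces pm.stats.summary(trace)
    az.plot_trace(trace)                       # replaces pm.traceplot(trace)
    az.plot_posterior(trace, hdi_prob=0.90)    # replaces pm.plot_posterior(...); hdi_prob replaces credible_interval
    az.plot_forest(trace, hdi_prob=0.90)       # replaces pm.forestplot(...)

    # the number of chains can be set explicitly when sampling:
    # trace = pm.sample(2000, tune=1000, chains=4)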

  • @albost1 · 4 years ago

    Great video, very nicely explained! Do you have material on Bayesian hypothesis testing? More specifically, how to carry out a hypothesis test for the parameters of interest (for example, in this case) and interpret the results.

    • @GeostatsGuyLectures · 3 years ago +1

      This is a great idea, Alejando2. I work in the frequentist and Bayesian domains, but I need to do more Bayesian lectures. That's on my list with Bayesian Neural Networks!
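
One common Bayesian analogue of a parameter hypothesis test, sketched under the assumption of a trace of posterior samples as in the earlier pymc3 sketch (variable names are illustrative): report the posterior probability that the parameter lies on one side of the null value, and check whether the null value falls inside a credible interval.

    import numpy as np
    import arviz as az

    # trace is assumed to come from pm.sample; "slope" is the parameter of interest
    slope_draws = np.asarray(trace["slope"])

    p_positive = (slope_draws > 0).mean()                     # posterior Pr(slope > 0 | data)
    hdi_low, hdi_high = az.hdi(slope_draws, hdi_prob=0.95)    # 95% highest-density interval

    print(f"Pr(slope > 0 | data) = {p_positive:.3f}")
    print(f"95% HDI: [{hdi_low:.4f}, {hdi_high:.4f}]")        # does the interval exclude 0?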

  • @univuniveral9713 · 4 years ago

    I don't understand what Markov chains have to do with this. Is it a time series?

    • @IndigentMartian · 4 years ago

      Try the previous video! ua-cam.com/video/7QX-yVboLhk/v-deo.html

  • @steadyknowledge3132 · 4 years ago

    Sir, please explain Differential evolution Markov chain.

  • @joshualaferriere4530 · 3 years ago

    I watched all 3 of your videos (11d-f), and you don't actually show how to create a Bayesian linear regression (including sampling) from start to finish.