Markov Chain Monte Carlo (MCMC) for Parameter Estimation (Matlab)

  • Published Sep 21, 2022
  • Hi everyone! This video is about how to implement the Markov Chain Monte Carlo (MCMC) method in Matlab, and how to use it to estimate parameters for an ODE model, using the logistic growth model as an example.
    Here's an introductory video I made on the logistic growth model:
    • Logistic Growth Model ...
    Here's a video I made on the conceptual basics of parameter estimation:
    • What is Parameter Esti...
    All the code from my videos is available on my Github:
    github.com/mikesaint-antoine/...
    Thanks for watching and let me know if you have any questions!
  • Science & Technology

COMMENTS • 38

  • @MikeSaintAntoine
    @MikeSaintAntoine  1 year ago +3

    Thank you guys for the nice comments! Btw here is the link to the code on Github in case you want to download it and try it yourself (also feel free to add me as a friend on Github if you want 🙂):
    github.com/mikesaint-antoine/Comp_Bio_Tutorials/tree/main/matlab_code/MATLAB_mcmc

  • @WajidAli-dv6cb
    @WajidAli-dv6cb 1 year ago +4

    Hi Mike! Thank you very much for the effort. I was waiting for it.

    • @MikeSaintAntoine
      @MikeSaintAntoine  1 year ago +1

      No problem Wajid! Thanks for watching and let me know if you have any questions 🙂

  • @WajidAli-dv6cb
    @WajidAli-dv6cb 1 year ago +3

    Hi Mike! I thoroughly enjoyed the video. You greatly explain every bit of the algorithm. Again many thanks and best wishes!

    • @MikeSaintAntoine
      @MikeSaintAntoine  1 year ago

      Thanks Wajid! It means a lot to know that some people are actually finding my videos helpful 🙂 good luck with your studies and let me know if there's any more topics you want to hear about in the future!

  • @OghmaNano
    @OghmaNano 3 months ago

    I really like this video and keep coming back to it. I think it's one of the most clear explanations of MCMC I've seen. You are a very good instructor.

    • @MikeSaintAntoine
      @MikeSaintAntoine  3 months ago +1

      Thanks for the kind words! Yeah it's nice to know people are finding these videos helpful. Let me know if you have any questions 🙂

  • @patriciacharagu4755
    @patriciacharagu4755 11 months ago

    This is my first time watching your videos. Thank you so much. Very well explained 👏

    • @MikeSaintAntoine
      @MikeSaintAntoine  11 months ago

      Thanks for watching Patricia, and let me know if you have any questions 🙂

  • @urpaljp
    @urpaljp 1 year ago

    Hey Mike. Thanks for the video. Currently taking a topics course in grad school about parameter estimation so this is gonna be a fun watch! I was finally able to use fminsearch and fmincon with the help of your videos, using my own model and experimental data. Also managed to code our model by converting it into a CTMC. Keep on making these videos!

    • @MikeSaintAntoine
      @MikeSaintAntoine  1 year ago

      Glad to hear it and thank you! Let me know if you have any questions, or any requests for topics you want to hear about 🙂

  • @academyofmaths8624
    @academyofmaths8624 1 year ago +2

    Greatly respected Mike, it is a helpful video for me.

  • @rafiqulislam1085
    @rafiqulislam1085 10 months ago

    Thanks a lot for such an excellent video on a very important topic. Best wishes!

  • @staysafe3418
    @staysafe3418 1 year ago +1

    Great men create great things

  • @muneebafridi6099
    @muneebafridi6099 1 year ago +1

    Requested by Wajid to watch this video, so here I am...
    #keepUpTheGoodWork 👏👏👏

  • @ali_writes6851
    @ali_writes6851 1 year ago +1

    It's a good video and really useful.

  • @fahadal-abri5406
    @fahadal-abri5406 1 year ago

    Great effort 👍

  • @wasifkhan6991
    @wasifkhan6991 1 year ago

    It's very helpful for us, thank you sir.

  • @ehoomaan
    @ehoomaan 1 year ago

    Hi Mike! Thank you for this great video! You perfectly explained the method. I have a question about the likelihood function (ll_function). For the assumed normal distribution, shouldn't the point at which the pdf is evaluated be the difference between the model prediction and the dataset (exp_data - ode output), with the mean and standard deviation of the measurement errors?

    • @MikeSaintAntoine
      @MikeSaintAntoine  1 year ago +1

      Hi Ehssan, thanks for watching! I'm not sure if I 100% understand your question but I'll try to explain what's happening with the ll_function and hopefully that will help.
      Basically the function takes the form pdf('Normal',x,mu,sigma), where mu is the mean of the normal distribution, sigma is the standard deviation, and x is the data point we have. The function returns the probability density function value at point x (the experimental data point), assuming a normal distribution with a mean set according to the ODE results. And then I set the standard deviation to be very high as a bit of a hack to avoid running into computational errors due to extremely small numbers. The whole point of all of this is just to give us higher numbers when mu (ode output) is close to x (experimental datapoint), and lower numbers when x is far away from mu.
      It could also be fine to subtract (exp_data - ode output) and take the absolute value, but then you need to set mu in the pdf function to be 0. So pdf('Normal',abs(exp_data - ode_output),0,sigma). Then this will also give you higher numbers when the exp_data is close to the ode_output, and lower numbers when they're farther away. But in my opinion this just makes things more complicated, and it's more straightforward to just evaluate the pdf with mu=ode_output and x=exp_data.
      Does that make sense? If you're having any trouble with it, you can also email me your code at mikest@udel.edu and I can take a look. Thanks for watching! 🙂
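
    The two equivalent forms Mike describes could be sketched in Python, with scipy's norm.logpdf standing in for MATLAB's pdf('Normal',x,mu,sigma) (the data values here are made up purely for illustration):

    ```python
    import numpy as np
    from scipy.stats import norm

    def log_likelihood(exp_data, ode_output, sigma=1000.0):
        # Evaluate the normal density at each experimental point, with the
        # mean set to the ODE prediction; the large sigma avoids underflow.
        return np.sum(norm.logpdf(exp_data, loc=ode_output, scale=sigma))

    def log_likelihood_alt(exp_data, ode_output, sigma=1000.0):
        # Equivalent form: evaluate at |exp_data - ode_output| with mean 0.
        return np.sum(norm.logpdf(np.abs(exp_data - ode_output), loc=0.0, scale=sigma))

    exp_data = np.array([10.0, 55.0, 210.0])    # hypothetical measurements
    ode_output = np.array([12.0, 50.0, 200.0])  # hypothetical model predictions
    print(np.isclose(log_likelihood(exp_data, ode_output),
                     log_likelihood_alt(exp_data, ode_output)))  # → True
    ```

    Because the normal density is symmetric, both versions give the same number, which is why Mike says either works as long as mu is set consistently.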

  • @bikrampal1052
    @bikrampal1052 1 year ago

    Thank you for this informative video, it helps me a lot. I just want to ask: when I am trying to write code for parameter estimation of a system of ODEs (like the SIR or Lotka-Volterra model), I am facing some difficulties generating the log-likelihood function and the datagen function (as you mentioned in this video).

    • @MikeSaintAntoine
      @MikeSaintAntoine  1 year ago

      Hi Bikram! Sorry about the late reply, unfortunately I was very busy for the past month because I was finishing up my PhD dissertation and also had Covid for like 2 weeks.
      Have you made any progress on your code? If you're still having trouble with it, please email me at mikest@udel.edu and I can try to help! 🙂

  • @Mostafa_99952
    @Mostafa_99952 3 months ago

    Hello Mike, I hope you are doing well. I always watch your YouTube videos and learn a lot. I just saw one of your videos where you used the MCMC method for estimating parameters in MATLAB. I just have some simple questions about this method.
    (1) Is this MCMC method for parameter estimation the same as the Bayesian method? If not, then in which part of the code we can modify to make this Bayesian? If you share the code, then I will be grateful.
    (2) If we want to apply MCMC to a model where we have two or three species, then do we still keep the "normal" distribution that you did for the very last portion of the code? For example, for a two-species model, do we modify it like this ---
    loglikelihood = sum(log(pdf('Normal',data(:,2:3),testdata(:,2:3),1000)))
    Does that mean we just need to change the dimensions of the data and testdata only?
    (3) How do we transition from "normal" to "gamma" or "uniform" for the previous line? Also, I saw that the final output graph for the estimated parameters "r" and "K" looked like a normal distribution graph. You used the histogram option to plot the estimated parameters. My question is: is it because you choose "Normal" inside the loglikelihood code, I mean, the code that I indicated in my question (2)? If I want to plot it as a normal-shaped curve, how do I graph it without a histogram? I tried the command plot(paraset(1,:)) in Matlab, but I don't know what my y-axis is here.
    (4) If you share the background theory and reading behind this code, then it will be great. I can read and clear up some of the questions that I am looking for.
    (5) Do you have the Matlab code for the stochastic version of the MCMC of this example? How do we do the stochastic version of it?
    Thanks so much!

    • @MikeSaintAntoine
      @MikeSaintAntoine  3 months ago

      Hey Mostafa! Sorry about the late response on here, but I got your email and replied to it, so hopefully that will help answer some of your questions.
      Thanks for watching! 🙂
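
    For question (2) above, extending to two or three species really does come down to summing the log-density over the extra data columns, as the MATLAB line in the question suggests. A Python sketch of the same idea, with made-up two-species data and scipy's norm.logpdf in place of pdf('Normal',...):

    ```python
    import numpy as np
    from scipy.stats import norm

    # Hypothetical data: column 0 is time, columns 1-2 are two species.
    data = np.array([[0.0,  10.0,  5.0],
                     [1.0,  55.0, 20.0],
                     [2.0, 210.0, 80.0]])   # "experimental" measurements
    testdata = np.array([[0.0,  12.0,  6.0],
                         [1.0,  50.0, 22.0],
                         [2.0, 200.0, 75.0]])  # ODE predictions

    # Sum the log-density over every species column at once, mirroring
    # sum(log(pdf('Normal',data(:,2:3),testdata(:,2:3),1000))) in MATLAB.
    loglikelihood = np.sum(norm.logpdf(data[:, 1:3], loc=testdata[:, 1:3], scale=1000.0))
    print(loglikelihood)
    ```

    So only the dimensions of data and testdata change; the rest of the MCMC loop stays the same.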

  • @parthasakhadas4930
    @parthasakhadas4930 9 months ago

    Hi Mike, this is a great video. Could you help me with the SIR model?

    • @MikeSaintAntoine
      @MikeSaintAntoine  9 months ago

      Hi Partha! Got your Facebook message, I'll try to take a look at your code and hopefully make some progress on it this week 🙂

  • @benklassen77
    @benklassen77 1 month ago

    This is very helpful for my research. How would you change the loglikelihood code if you wanted to account for the prior probabilities, noting that P(theta|D) ∝ P(D|theta)P(theta) rather than just P(theta|D) ∝ P(D|theta)?

    • @MikeSaintAntoine
      @MikeSaintAntoine  24 days ago

      Sorry about the late response! The way to account for prior probabilities is with the distribution that you're drawing your parameter guesses from. So if you have some guess about what the true parameter values might be, you could draw guesses from a distribution centered around those guesses, rather than from a uniform distribution (which doesn't incorporate any prior information, other than the possible range).
      If you need any help with this, feel free to send me an email at mikest@udel.edu.
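
      Another common way to implement P(theta|D) ∝ P(D|theta)P(theta) directly, complementary to drawing guesses from an informative distribution, is to add a log-prior term to the log-likelihood inside the Metropolis acceptance step. A minimal Python sketch under assumed, hypothetical choices (the quadratic log_likelihood is a stand-in for the ODE-based ll_function, and the normal priors on r and K are examples, not Mike's code):

      ```python
      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(0)

      def log_likelihood(theta):
          # Hypothetical stand-in for the ODE-based ll_function.
          return -0.5 * np.sum((theta - np.array([0.5, 100.0]))**2)

      def log_prior(theta):
          # Example priors: r ~ Normal(0.4, 0.2), K ~ Normal(90, 30).
          return norm.logpdf(theta[0], 0.4, 0.2) + norm.logpdf(theta[1], 90.0, 30.0)

      theta = np.array([0.3, 80.0])                 # initial guess (r, K)
      log_post = log_likelihood(theta) + log_prior(theta)
      samples = []
      for _ in range(5000):
          proposal = theta + rng.normal(0.0, [0.05, 5.0])        # random-walk step
          log_post_prop = log_likelihood(proposal) + log_prior(proposal)
          if np.log(rng.uniform()) < log_post_prop - log_post:   # Metropolis rule
              theta, log_post = proposal, log_post_prop
          samples.append(theta)
      samples = np.array(samples)
      ```

      The only change from a pure-likelihood run is the added log_prior term in log_post; the acceptance ratio then targets the posterior rather than the likelihood alone.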

  • @sajanphutela4818
    @sajanphutela4818 2 months ago

    Thanks Mike for such a nice video on MCMC. This is the first time I could understand this. What is your email address? I have some questions, if you could help me with that?

    • @MikeSaintAntoine
      @MikeSaintAntoine  2 months ago

      Thanks for watching Sajan! And sorry about the late response but yes I got your email and replied to it 🙂