Markov Chains Clearly Explained! Part - 1

  • Published 16 Dec 2024

COMMENTS • 682

  • @NormalizedNerd
    @NormalizedNerd  4 years ago +240

    Since many of you are asking about the calculation of left eigenvector (π)...Here are the equations:
    from πA = π
    0.2x + 0.3y + 0.5z = x
    0.6x=y
    0.2x+0.7y+0.5z=z
    from π[1]+π[2]+π[3] = 1
    x+y+z=1
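
    One way to solve these four equations numerically (a minimal numpy sketch, not code from the video; the matrix below is simply read off from the coefficients above, with states ordered x, y, z):

    ```python
    import numpy as np

    # Transition matrix read off from the equations above (each row sums to 1)
    A = np.array([[0.2, 0.6, 0.2],
                  [0.3, 0.0, 0.7],
                  [0.5, 0.0, 0.5]])

    # pi A = pi is the same as (A.T - I) pi = 0; stack the normalisation x+y+z = 1 underneath
    M = np.vstack([A.T - np.eye(3), np.ones((1, 3))])
    b = np.array([0.0, 0.0, 0.0, 1.0])
    pi, *_ = np.linalg.lstsq(M, b, rcond=None)
    print(pi)  # ~[0.3521, 0.2113, 0.4366] = [25/71, 15/71, 31/71]
    ```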

    • @arianakenzie4235
      @arianakenzie4235 3 years ago +3

      Dude you should collaborate with @ahmadbazzi

    • @putraduha3176
      @putraduha3176 2 years ago +2

      Thanks man, online school isn't really being nice to my brain

    • @dhruvsingla2212
      @dhruvsingla2212 2 years ago +3

      Hey, can you also tell how to code moving from one state to another based on probability? Like you did a random probability walk, how did the code decide which state to go to using probability.

    • @NormalizedNerd
      @NormalizedNerd  2 years ago +3

      @@dhruvsingla2212 I think you are looking for this video: ua-cam.com/video/G7FIQ9fXl6U/v-deo.html

    • @dhruvsingla2212
      @dhruvsingla2212 2 years ago

      @@NormalizedNerd Great, thanks 👍
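
    For anyone with the same question as @dhruvsingla2212 above, a minimal sketch (not the code from the linked video) of how a random walk can pick the next state: sample from the current state's row of the transition matrix. The matrix and state labels follow the pinned comment:

    ```python
    import numpy as np

    states = ["x", "y", "z"]           # state labels from the pinned comment
    A = np.array([[0.2, 0.6, 0.2],     # row i = probabilities of moving out of state i
                  [0.3, 0.0, 0.7],
                  [0.5, 0.0, 0.5]])

    rng = np.random.default_rng(42)
    current = 0                        # start in state x
    walk = [states[current]]
    for _ in range(15):
        # the current state's row is the probability distribution of the next state
        current = rng.choice(len(states), p=A[current])
        walk.append(states[current])
    print(" -> ".join(walk))
    ```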

  • @nshiba
    @nshiba 3 years ago +356

    This is sooo easy to understand. I took at least a month to learn this about 25 yrs back for my masters thesis work when I first learnt this subject. Now, I thought of revisiting this topic for my daughter's higher secondary project. 25 years have really brought a topic from masters to secondary school level and months of learning to a few minutes of a well prepared video. Thanks to your channel, YouTube, the Internet and technology in general. 🙏

    • @NormalizedNerd
      @NormalizedNerd  3 years ago +30

      Thanks a lot for sharing your experience. It really feels nice to read such comments. I'm glad to have a platform like this.

    • @NSFWrated
      @NSFWrated 1 year ago +8

      Which institute is teaching this at secondary level? Please provide a syllabus link. I am redesigning a syllabus.

    • @Nuur_Rajput
      @Nuur_Rajput 1 year ago +2

      ​@@NSFWrated yea. I'm curious too

    • @_rd_kocaman
      @_rd_kocaman 11 months ago +2

      where the heck is this studied in high school?

    • @nshiba
      @nshiba 11 months ago +2

      Sorry I missed all your comments. Markov Chains is one of the IA (Internal Assessment) topics to choose from for the HL (Higher level ?) Mathematics for IB (International Baccalaureate Diploma) programme ( which is higher secondary school level - Year 11 and Year 12) in Singapore. You can Google to find the details with the above information.

  • @michaella5110
    @michaella5110 1 year ago +13

    you have no clue how much you helped a bunch of online MS Analytics students. Thank you so much!

  • @counter-thought2226
    @counter-thought2226 1 year ago +19

    This is a lifesaver. I started a stochastics class last week with an almost nonexistent background in probability. I was completely troubled at first but after watching this video and reading through some course material, I can actually understand the exercises. Thank you.

  • @DawgFL
    @DawgFL 3 years ago +122

    Thanks dude. It takes a whole nother level of intelligence to be able to break down a concept like this so anyone can understand it. I'm learning markov chains in class right now and when the professor teaches it it literally looks like an alien language to me, i almost broke down because i might fail the class. but im going to watch all ur videos and itll help me a lot.

  • @karannchew2534
    @karannchew2534 2 years ago +99

    Note for my future revision.
    Markov Chain models a system that changes its status.
    One important rule: the next status of the system only depends on its current status.
    Status
    = serve pizza, serve burger or serve hotdog
    = x, y, z
    = Connected, Disconnected, Terminated, Active
    Markov chain can be drawn as a state diagram.
    Or written as a transition matrix.
    State diagram represents all possible status and associated probabilities.
    Transition matrix
    = represent the state diagram
    = probability from one state to another
    = A
    At equilibrium, the probabilities of the next status don't change any more. The probability of each state at equilibrium = Stationary Distribution.
    Let's call such equilibrium probability π.
    Aπ = π
    π
    = Eigenvector of the matrix
    = Probabilities of each status the system could be in, assuming the equilibrium state.
    Using two equations:
    A) Aπ = π
    B) sum of probability is 1,
    we can work out the value of π, i.e. the equilibrium probability
    Alternatively, run a simulation.
    Q: Do all Markov Chains have an equilibrium state?
    A: Don't know... Need to study more to find out...
    Q: Can I use Subscriber Status as the hidden state?
    A: Yes. But if the status is known, then it's better to use it as the Observation states.
    Q: Can I "model" the next status to be only depending on the current status? But then the next status actually also depends on the previous status, which seems contradictory.
    A: Yes, I can. At the per-state level, the next status only depends on the current status. But at the system level and at equilibrium, it "depends" on both the current and the previous state, because the current states had been "affected" by the previous states.

    • @Ceratops17
      @Ceratops17 2 years ago +6

      Hi, in case you still need the answer: you can prove that an ergodic Markov chain, i.e. a chain where all states communicate with each other and which is aperiodic (the gcd of return times is 1), always has an equilibrium state.

    • @chaityashah4221
      @chaityashah4221 8 months ago +1

      dont know if you revised it , but i surely did a revision

    • @blackbocks
      @blackbocks 1 month ago

      "Using two equations:
      A) Aπ = π
      B) sum of probability is 1"
      Do you mean πA since we can't do Aπ?
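
    A quick numerical check of the note above, using the row-vector convention πA = π (which is what @blackbocks points out): repeatedly multiplying any starting distribution by A settles on the stationary distribution. A minimal sketch, assuming the transition matrix from the pinned comment:

    ```python
    import numpy as np

    A = np.array([[0.2, 0.6, 0.2],
                  [0.3, 0.0, 0.7],
                  [0.5, 0.0, 0.5]])

    pi = np.array([1.0, 0.0, 0.0])   # any starting distribution works here
    for _ in range(50):
        pi = pi @ A                  # row vector times matrix: pi_{n+1} = pi_n A
    print(pi)                        # -> ~[0.3521, 0.2113, 0.4366]
    ```

    As @Ceratops17 notes, convergence like this is guaranteed for ergodic chains; it can fail for periodic or reducible ones.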

  • @ImJoegath
    @ImJoegath 4 years ago +88

    Got way more excited than I should have when I thought "hmm, that kinda looks like the eigen vectors..." AND THEN IT WAS.

    • @NormalizedNerd
      @NormalizedNerd  4 years ago +3

      Haha...

    • @dnbhattacharya343
      @dnbhattacharya343 3 years ago +3

      Holy crap we are nerdy.

    • @fpartidafpartida
      @fpartidafpartida 2 months ago

      @@dnbhattacharya343 I came here to try to understand Markov chains, after a mathematician Dr. casually mentioned them during an AI talk. I completely get the concept now, but I could not actually perform any of the equations. Does that still count as being nerdy for me? 🧐

  • @ishankaul9065
    @ishankaul9065 4 years ago +27

    Great explanation! A full series on the different types of Markov chains with explanations like this would be awesome.

    • @NormalizedNerd
      @NormalizedNerd  4 years ago +3

      I'll try to work on this

    • @themathskompanyap4730
      @themathskompanyap4730 2 years ago

      Subscribe for more such Markov chain concepts friends. ua-cam.com/video/bk3MjAC9QsY/v-deo.html

  • @bubblewrap55
    @bubblewrap55 4 years ago +32

    Good explanation. They never covered why I was calculating eigenvalues in high school. Loved how that path and random walk converged in the end.

    • @NormalizedNerd
      @NormalizedNerd  4 years ago +1

      Thanks!! Yeah...they teach stuff without their applications :(

    • @abc4924
      @abc4924 4 years ago +5

      You guys calculate eigenvalues at High school? Great!

    • @manishmayank4199
      @manishmayank4199 3 years ago +2

      @@abc4924 my reaction was same...I studied eigenvalues in my 2nd semester of college

  • @LucasSteinberger-g9d
    @LucasSteinberger-g9d 3 months ago

    This video is literally perfect as an instructional. It has a limited scope, and everything it shows feels obvious and related to what was just shown previously. Thanks for the good work!

  • @angrybruce8262
    @angrybruce8262 3 years ago +353

    Mate, that is a good explanation! The only problem is that now I AM HUNGRY:)

    • @NormalizedNerd
      @NormalizedNerd  3 years ago +11

      Haha XD

    • @69erthx1138
      @69erthx1138 3 years ago

      After burgers, Will took Skulyer for pizza, then give her a night cap with his hot dog.

    • @2highbruh
      @2highbruh 3 years ago +1

      @@69erthx1138 oh, okay, good for him

    • @muhammadihsan6645
      @muhammadihsan6645 2 years ago

      Woowww , human being human

  • @pemessh
    @pemessh 4 years ago +13

    You sir, just earned a subscriber.
    These kinds of quality videos and great explanation is what we love.
    Thank you.
    Best wishes from Nepal.

    • @NormalizedNerd
      @NormalizedNerd  4 years ago +1

      Thanks and welcome to normalized nerd :)

  • @opencode69
    @opencode69 1 year ago

    While I was trying to understand this I avoided complex terms as best I could, but with this video I have no need to avoid them because of the thorough explanation. Typing this 2 years later, the "what's up people of the future" really got me

  • @rebeccacarroll8385
    @rebeccacarroll8385 1 year ago

    This is the best video ever. Seriously, I was ripping my hair about these concepts and this bridges each point beautifully.

  • @RichardVaught
    @RichardVaught 2 years ago +1

    This was a good explanation, with one exception. There is a REASON that pi can be used, and why pi is used. I know that most mathematicians take the connection between frequency and pi for granted, but a lot of folks really don't have an intuition regarding that.

  • @Octane09
    @Octane09 1 year ago +1

    That was one of the smoothest explanations i ever came across !

  • @jiangxu3895
    @jiangxu3895 8 months ago

    Dude, this is the first time I get the idea of Markov chain. Thanks a lot!!!

  • @piotrgorczyca5548
    @piotrgorczyca5548 4 years ago +10

    5:24 I feel you bro, recording entire audio and then finding out about the mistake just at the editing ... I did the same, just cut words from other parts of the recording and put them to create a sentence...
    Thanks for the video btw, very nice.

    • @NormalizedNerd
      @NormalizedNerd  4 years ago +2

      Exactly bro :/

    • @rg31222
      @rg31222 3 years ago +1

      So true...so much time and effort goes into creating any content especially audio and video...great video and great channel.

    • @georgewhichello4708
      @georgewhichello4708 3 months ago

      @@NormalizedNerd 8:13 you say 46% instead of 43% also

  • @vinx3078
    @vinx3078 2 years ago

    I'm here from ddlc and I could not understand a thing until I saw this video. Dude is the most helpful guy on this site

  • @willbutplural
    @willbutplural 2 years ago +11

    Wow great explanation that includes terminology, stationary states, and connections between adjacency matrices, directed graphs, and markov chains 👍 A+ thank you!

  • @karlrombauts4909
    @karlrombauts4909 2 years ago +4

    This is such a fantastic video. It makes all the concepts very easy to understand without skipping important technical details. Thank you for making such a great resource!

  • @lyzhenyang2982
    @lyzhenyang2982 1 month ago

    This is so clear I love you for the rest of my life. I swear half of my tuition fees should go to you and 3b1b.

  • @dieserhugo2960
    @dieserhugo2960 11 months ago +3

    Jeez, if my professor had introduced Markov chains like this instead of spending multiple lectures talking about Google's page-rank system without any goal in mind, I would've saved myself a lot of confusion. Thank you!

  • @Reigatsu
    @Reigatsu 2 years ago +4

    Great video! As a physics graduate, it’s honestly surprising how often eigenvalues and eigenvectors keep showing up in what I do!

    • @n-panda921
      @n-panda921 2 years ago

      ya! and you can really think this in terms of quantum mechanics too, I like all these connections

  • @theelysium1597
    @theelysium1597 3 years ago +10

    This is a great video! I am currently taking Linear Algebra II and Probability (2 separate courses) and this video perfectly connected them :) thank you!

  • @IshanBanerjee
    @IshanBanerjee 2 years ago +4

    I was trying to understand Evolution algebras and for that I needed idea of Markov chains. Beautifully explained. Thank you so much.

  • @wakabaka777
    @wakabaka777 9 months ago +1

    Wonderful explanation! I love this visualization

  • @gameboardgames
    @gameboardgames 2 years ago +3

    This video was really well constructed and interesting, in equal measure to being informative! Thank you Mr Nerd!

  • @how_about_naw
    @how_about_naw 10 months ago +2

    Dude, you need to add a thanks button and let us buy you a coffee.

  • @rommix0
    @rommix0 1 year ago +4

    This is so cool. I'm getting into machine learning, and videos like these are extremely helpful. I've only really come across HMMs for their historic use in Speech Recognition.

  • @dr-x-robotnik
    @dr-x-robotnik 3 years ago +2

    This tutorial helped me with my NLP project on part-of-speech-tagging. Thank you very much!

  • @johnperkins6550
    @johnperkins6550 3 years ago +1

    I am just starting to learn this. The best explanation of all the videos I have seen.. Very understandable. And there is the application to Python and Data Science as BONUS! I am subscribed and I want to see all of the videos now!!!

  • @aromalas5713
    @aromalas5713 2 years ago +2

    Omg man this is such a great explanation. Loved the presentation, the animation and everything about it. Keep going!

  • @3munchenman
    @3munchenman 3 years ago

    You explained it to me like I am 5 years old. And that is what I needed. Thank you!

  • @aydnaydin9109
    @aydnaydin9109 1 year ago

    perfect explanation.. everybody can understand. this video may be the easiest explanation for this topic. THANK YOU !!!

  • @ciberman
    @ciberman 3 years ago +34

    "Please pause the video if you need a moment to convince yourself"
    What kind of 3blue1brown is that?!

  • @marclennardcolina6033
    @marclennardcolina6033 4 years ago +8

    Great Explanation. Learned a lot from these! I would also like to ask for permission to cite your examples in a report I'm about to make in my masters class.

    • @NormalizedNerd
      @NormalizedNerd  4 years ago +2

      Yes, absolutely. Best of luck for your report :D

  • @happyduck70
    @happyduck70 3 years ago +2

    3:35 I wish my days were like this, such beautiful music while walking and eating unhealthy food

  • @ratchanonsupakit4375
    @ratchanonsupakit4375 2 years ago +1

    Too good! Got baited from the food snapshot, but the content inside is superb.

  • @praneetkumarpatra2661
    @praneetkumarpatra2661 2 years ago

    my mind is blown!!! every new thing that was covered in my course in the last month just got used here!!!

    • @NormalizedNerd
      @NormalizedNerd  2 years ago

      Haha...don't you like when that happens 😍

  • @muhtasirimran
    @muhtasirimran 2 years ago

    1:05 I would like to think of it in reverse, as we are people from the future. If I know what they are serving tomorrow, I can predict what they are serving today

  • @floriantschelisnig2332
    @floriantschelisnig2332 3 years ago +4

    Thanks, after watching this video my university script now makes much more sense.

  • @akhilgoenka6817
    @akhilgoenka6817 2 years ago +1

    Found this awesome channel today. Fantastic visuals & crystal clear explanation. Subscribed!

  • @zoltanmisley666
    @zoltanmisley666 2 years ago

    "Please pause for a second to convince yourself"
    dude...
    that's just straight up savage

  • @taquakhairysaeed1771
    @taquakhairysaeed1771 3 years ago

    wow this is the best technical video i have ever seen!! Well done!

  • @naimahersy3966
    @naimahersy3966 2 years ago

    After watching this video. Nerd has been normalized. Amazing 😊

  • @umarkhan-hu7yt
    @umarkhan-hu7yt 2 years ago

    You make it clear and more intuitive. Thanks

  • @Kosake82
    @Kosake82 2 years ago

    @7:45 "Please pause the video if you need a moment to convince yourself."
    That gave me a good laugh for some reason.😄

  • @bhushanakerkar6441
    @bhushanakerkar6441 1 year ago

    excellent explanation. Just too good to be true. You have made an esoteric subject so simple

  • @annabaannaba6994
    @annabaannaba6994 2 years ago +1

    Very nice tutorial and excellent video

  • @lucasqwert1
    @lucasqwert1 1 year ago +7

    Thank you! Only one thing I didn't get: after πA = π and the sum of probabilities being 1, how do we calculate the stationary state as π = [25/71, 15/71, 31/71]?

    • @philmcgroin
      @philmcgroin 3 months ago

      Yep this bit of the video is not great. It assumed we already knew how to do this and showed no calculation steps. On the whole, I'm not sure who this video was aimed at. Total newbie who needs pizza and hot dog explanation (that's me), or people who already knew this
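
    To fill in the steps @lucasqwert1 asks about: substituting y = 0.6x into the first equation gives 0.5z = 0.62x, so z = 1.24x; then x + y + z = 1 becomes 2.84x = 1, i.e. x = 25/71, and so y = 15/71 and z = 31/71. A sympy sketch (assuming sympy is available) that reproduces the exact fractions from the equations in the pinned comment:

    ```python
    import sympy as sp

    x, y, z = sp.symbols("x y z")
    R = sp.Rational
    eqs = [
        sp.Eq(R(2, 10)*x + R(3, 10)*y + R(5, 10)*z, x),  # first component of pi A = pi
        sp.Eq(R(6, 10)*x, y),                            # second component
        sp.Eq(x + y + z, 1),                             # probabilities sum to 1
    ]
    print(sp.solve(eqs, [x, y, z]))  # {x: 25/71, y: 15/71, z: 31/71}
    ```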

  • @franciss.fernandez7581
    @franciss.fernandez7581 3 years ago +3

    This was an amazing video. You're an outstanding instructor!

  • @georgeiskander2458
    @georgeiskander2458 1 year ago

    Really awesome. I've never understood this topic as easily as with your explanation...
    Thanks

  • @XiaohanGao
    @XiaohanGao 10 months ago +1

    Informative and super clear!!! Thx!

  • @AnupKumar-nz2qq
    @AnupKumar-nz2qq 11 months ago

    It's a very nice video to understand the Markov chain model in a simplified way. Please make more such videos on the Markov model and stochastic process.

  • @JackMenendez
    @JackMenendez 1 year ago

    Wow, thank you. Why was this so hard for me back in the day? Great job.

  • @muhammadwaseem_
    @muhammadwaseem_ 2 years ago +1

    Fell in love with your channel and content quality....

  • @cathlinbarki224
    @cathlinbarki224 3 years ago +1

    it still amazes me how YouTube can teach me more than my own college teacher :))

    • @xaviermagnus8310
      @xaviermagnus8310 3 years ago

      YouTube is bigger and has smarter people in the end.

  • @theodoresweger4948
    @theodoresweger4948 1 year ago

    I watched the movie "A Beautiful Mind" and the random walk comes to mind. I found this quite interesting, along with the Enigma machine the Germans used, how the code was finally broken, and the consequences of giving away that the code was broken and how that would affect WWII.

  • @andrejlucny
    @andrejlucny 3 years ago

    The graph with probabilities is a Markov model. A sequence of random variables related to the model is a Markov chain.

  • @yashshah4182
    @yashshah4182 11 months ago

    What a great introduction to Markov Chains! Thank you, it was really helpful

  • @Elias-hk9sc
    @Elias-hk9sc 5 months ago

    Wonderful Video. It was perfect to get a simple overview on markov chains.

  • @argish
    @argish 2 years ago +1

    dude when that voice over said "a directed graph" I legit thought some alien invaded my room 💀

  • @traj250
    @traj250 2 years ago

    Awesome video. Undergrad student that really appreciates this simplification

  • @techie1143
    @techie1143 2 years ago

    Very good explanation with clarity. It would be greatly appreciated if you could include the scripts of these videos.

  • @ralphvonchunjo
    @ralphvonchunjo 3 years ago

    Thanks from the future! Great explanation, outstanding instructor!

  • @MrRaja
    @MrRaja 1 year ago

    Thanks for the explanation. It's starting to make sense. Little by little.

  • @mamotivated
    @mamotivated 2 years ago +1

    Great content, thanks for sharing. Your education is helping lots of people. Keep going.

  • @pratikshakharat8644
    @pratikshakharat8644 1 year ago

    Yes yes we want more with such interesting examples

  • @pujasangwan8008
    @pujasangwan8008 2 months ago

    Fantastic explanations with the help of daily life examples. Thank you a lot

  • @nicholasadegbe4629
    @nicholasadegbe4629 3 years ago

    I covered just 2 minutes of this and I'm so excited!!!

  • @dragonAwkward
    @dragonAwkward 4 years ago

    Very good explanation brother! Keep it up!!

  • @chloewei768
    @chloewei768 4 years ago +2

    Awesome explanation!! It is so beginner friendly and I love it!!
    Thank you! and look forward to seeing more content from you!!

  • @casestudy3167
    @casestudy3167 2 years ago +1

    very well explained. thank you for making this video

  • @SioneVaitaiki-f2z
    @SioneVaitaiki-f2z 1 year ago

    06:00 π is a row vector: the probability distribution of the states.

  • @aboagyeolas9434
    @aboagyeolas9434 4 years ago +5

    Great, man! You must be a great-great-grandson of Markov.

  • @JeffLuntGames
    @JeffLuntGames 11 months ago

    Cool video - watching the whole series now.

  • @shrur3527
    @shrur3527 9 months ago +1

    Tq so much🙏🙏❤️❤️

  • @anubhavyadav4279
    @anubhavyadav4279 3 years ago

    You made it look so simple! Amazing man!

  • @panpeter7879
    @panpeter7879 8 months ago

    Have to say this is a very very helpful video for understanding MCMC 🎉🎉🎉

  • @jeevanmarg
    @jeevanmarg 4 years ago +1

    Excellent demonstration. Really helpful. Thank you.

  • @aftabasir7933
    @aftabasir7933 1 year ago

    Well made and well visualized. Good job.

  • @avasaralavivekaditya1981
    @avasaralavivekaditya1981 2 years ago +2

    At 2:43, how did we get 0.7?

  • @eventhisidistaken
    @eventhisidistaken 3 years ago +4

    Thank you, this was very helpful. If you decide to expand on the video, the one thing that was not immediately clear to me, was *why* pi represents the stationary state probabilities. I had to write out the probability equations for that to become clear.

    • @NormalizedNerd
      @NormalizedNerd  3 years ago +1

      Thanks for this feedback :D

    • @RamakrishnanRukmini
      @RamakrishnanRukmini 3 years ago

      That is the value reached by the system finally. No more variation. End state values. Hence stationary state.

  • @emrullahcelik7704
    @emrullahcelik7704 3 years ago

    Very concise explanation. Thank you.

  • @sharadchandakacherla8268
    @sharadchandakacherla8268 2 years ago

    this video made me subscribe to the channel. simplicity always wins

  • @KN-ls9rq
    @KN-ls9rq 3 years ago

    woah, that was an awesome video man! I think i'll be watching your videos just for fun too! keep doing what you do 👍

  • @chriswil8252
    @chriswil8252 2 years ago

    Helpful. Honestly it’s too helpful.

  • @aliman5827
    @aliman5827 2 years ago

    Oh God this was the series I actually needed! tnx bro!!!

  • @huitv1
    @huitv1 3 years ago +2

    Very clear explanation, well done! ty

  • @ashfvt7712
    @ashfvt7712 3 years ago

    You are making me hungry😂.
    This was a great video. The presentation was very neat.

  • @Julian-tf8nj
    @Julian-tf8nj 1 year ago

    the pronunciation of "pizza" that sounds like the town of "Pisa" cracks me up (I'm an Italian speaker)... but the explanations are superbly clear and helpful, thanks! 😁

  • @chuanyu8813
    @chuanyu8813 2 years ago

    Amazing, I like those beautiful pictures of food to explain Markov Chain!

  • @mariusbaur6765
    @mariusbaur6765 1 year ago

    thanks a lot! incredible how you can explain a difficult topic in such an easy way!

  • @erzlet2556
    @erzlet2556 2 months ago

    What calculation did you do to get the values at 6:52? Where did those numbers come from?

    • @DrChrisCopeland
      @DrChrisCopeland 1 month ago

      This is where I got lost as well. The second, third, nth application wasn't clear to me.
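
    A hedged guess at what happens around that point (the exact on-screen numbers aren't reproduced here): each further step is one more application of the transition matrix, π_{n+1} = π_n A, and equivalently every row of Aⁿ approaches the stationary distribution. A small sketch, assuming the matrix from the pinned comment:

    ```python
    import numpy as np

    A = np.array([[0.2, 0.6, 0.2],
                  [0.3, 0.0, 0.7],
                  [0.5, 0.0, 0.5]])

    pi = np.array([1.0, 0.0, 0.0])             # some starting distribution
    for n in range(1, 6):
        pi = pi @ A                            # the n-th application of A
        print(n, np.round(pi, 4))

    # Rows of A^n all tend to the same stationary distribution
    print(np.round(np.linalg.matrix_power(A, 50), 4))
    ```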

  • @mgetommy
    @mgetommy 3 years ago +1

    10x better than my professor's explanation...

    • @69erthx1138
      @69erthx1138 3 years ago +1

      Through years of auditing grad-level courses, I'd take the advice of the average student or post-doc over a senior faculty type any day.

  • @weekendresearcher
    @weekendresearcher 2 years ago

    Life saver video 👍 prerequisite to understand markov regime switching model in econometrics

  • @hunterhughes2589
    @hunterhughes2589 1 year ago

    Great video! A quick question and I apologize if someone already asked it:
    Intuitively, I would assume that the probability of any given state is simply the sum of the probabilities for that state divided by the sum of the probabilities for all states. For example, the probability for pizza is 0.2, because 0.6/3.0. The answers you get from the pi calculation and the random walk are very close to this, but not exact. What gives?
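
    The two quantities can be compared directly: the simple column average versus the actual stationary distribution. A small sketch, assuming the matrix from the pinned comment is the one in the video; the stationary distribution is the left eigenvector for eigenvalue 1, which weights states by how often the walk actually visits them, so the column average is only an approximation:

    ```python
    import numpy as np

    A = np.array([[0.2, 0.6, 0.2],
                  [0.3, 0.0, 0.7],
                  [0.5, 0.0, 0.5]])

    naive = A.sum(axis=0) / A.shape[0]        # column sums divided by number of states
    print(naive)                              # ~[0.3333, 0.2, 0.4667]

    # Stationary distribution: left eigenvector of A for eigenvalue 1, normalised
    vals, vecs = np.linalg.eig(A.T)
    pi = np.real(vecs[:, np.argmax(np.real(vals))])
    pi = pi / pi.sum()
    print(pi)                                 # ~[0.3521, 0.2113, 0.4366]
    ```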

  • @zinniye
    @zinniye 4 years ago +3

    This helped me so much! Thank you!

  • @rishikambhampati2862
    @rishikambhampati2862 2 years ago +4

    Hello, thanks for the wonderful explanation. I have a naive question though: how did we arrive at the adjacency matrix and directed graph with probabilities in the first place? Is it from observations or domain knowledge (in this case, will the restaurant give us the probabilities)?