The best math video so far on Markov chains
The most thorough and clear explanation ever. Can't wait for the next video!
Thank you! Follow up video coming next week:)
A subscription to your channel is the gift that keeps on giving.
Glad you're enjoying!
Very well explained mate. Videos like these are a great intro to a statistical topic and a great foundation to dive deeper into the math behind it
You just made my life better all the way in a small country found in Africa called Kenya. Thank you.
one of the best explanations about Markov chains on youtube. thank you
A very well presented and insightful lesson.
Thank you for the 2 part explanation!
I was searching for "Wiener-Lévy process which is also a Markov process" but luckily I ended up here, a wonderful serendipity! Thanks for the simple and concise explanation Dr. Trefor.
Subscribed to this channel after about 30 seconds.... amazing
Very clear explanation with easy examples. Thank you!
amazing explanation. you have a knack for making these difficult topics understandable
thank you so much for this
Glad it was helpful!
I watched many videos, but I only understood Markov chains from this one. Perfect explanation bro, thank you
I watched around 5 videos, this explained it the best!! Thanks a lot
You're a natural. Thanks, preparing for the Risk Modeling and Survival Analysis actuarial exam
This was such a wonderfully clear explanation. Thank you so much!
Glad it was helpful!
@@DrTrefor Not to get too greedy, but can you do one on Hidden Markov models? Thanks a bunch again!
I really appreciate the clarity of the explanation. I am now a subscriber and a fan! Thank you.
Thanks for the sub!
Love your videos man, keep up the great work. You and Presh Talwalker are the best.
Perfect. Dr. Trefor you are the best!
Thank you, glad you enjoyed!
I'm happy that I found this channel. Thank you!
A simple explanation of something that could be very complexly understood. Thank you, Dr. Bazett
what a sarcasm.
incredible video with a super clear explanation!
Never seen such an explanation before. Amazing Sir
absolute delight.. exactly at the time I wanted it for an AI implementation
Crystal clear explanation , thank you Dr.
really helped to understand this concept in the first go
Best explanation of Markov Chains I've seen. Most videos don't explain how you get the initial probabilities, but from your explanation I understood that they're equally distributed at the outset (that is, if I understood correctly) and can stabilize as frequency outcomes over iterations. Thank you.
However, one thing I wasn't too clear on: if a Markov chain only depends on the current state for predicting future states, wouldn't a tree that predicts into the near or distant future not be using the Markov property, since there's a whole chain of dependencies?
Thank you soo much this is the best explanation ever❤❤❤
Very good explanation 👏
Love this!!!
Thank you for making this.
Thank you. This was a very good explanation
This was strikingly clear and fresh. Loved it!
Your explanation is super!
Thanks I needed this for my upcoming exams :-)
oh my, this was so helpful thank you so much
Thank you so much Doctor 😍
A very good explanation!
I really appreciated your video. I have a question: in the market example, with a drift rate of zero, would the transition probabilities have been even? I mean a 50% probability of transitioning from bull to bear and vice versa? Thank you very much
Rare moments.. when you understand in the first go.
Awesome explanation, Thank you sir. 👍
Great video. Thanks!
Such a good way of explaining
Please don't go to MIT or Stanford at all, stay here. We need you.
Awesome explanation!
Wow! Just beautiful! Love the way you explain things!
Thank you so much!
What a great video, thank you!!
Question: in your example of the stock market I think you used historical data to generate those bull-bull and bear-bear probabilities. Is it still a Markov process if the probabilities on the tree are derived from the past?
Ah great point, and it takes a bit of further consideration of what we REALLY mean by "the past". It is more about ignoring the recent past. So I'm not looking at last week or a month ago in my predictions for next week. But you are right that while I made up the numbers in this example, they would have come from looking at some historical average over, say, decades.
A similar example might be weather. A Markov model might be constructed that says: given the weather today, what is the probability of the weather tomorrow? It would be Markov because it ignores what the weather was like yesterday or a week ago. However, the model might still build in historical climate data about what the weather is like generally.
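That weather version can be sketched as a tiny two-state chain. The transition numbers below are made up for illustration (in practice they would come from historical climate data); the point is that each step only ever looks at today's state:

```python
import random

# Two-state weather chain. transition[current][next] = P(next | current).
# These probabilities are invented for illustration only.
transition = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Sample tomorrow's weather given ONLY today's state (Markov property)."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in transition[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, days):
    """Run the chain forward; yesterday's weather is never consulted."""
    states = [start]
    for _ in range(days):
        states.append(step(states[-1]))
    return states

print(simulate("sunny", 7))
```

Note the historical data only enters once, baked into the transition table; the simulation itself is memoryless.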
Dr. Trefor Bazett thanks, that makes sense! Looking forward to the next one!
If only the next video was out so I could make a Markov chain to predict when the next video will be out
Haha, well past behavior indicates I release a lot of videos on Mondays, but you can't look at that unless you model with a non-Markov process;)
Fantastic explanation
amazing explanation.
Very well explained. Thank you
Wow 🔥🔥🔥
Great Sir.... the explanation is ridiculously good ❤❤
Well done, professor. Thank you very much, wishing you success!
awesome job
man you are awesome THANK YOU!!!
Thank you Dr, your videos are useful.
Thank you!
Did you do the Transition Matrix video from the 'coming soon' note ? Thanks
Yup, should be in the discrete math playlist
Can't appreciate your videos enough. Please keep making them, and I am second. hehehe
Will do! 2nd is still pretty good, haha:D
nice explanation for a beginner like me
You might also talk about how to find the stable state with linear algebra (PDP^-1), which relates to your series
Indeed, the next video is going to cover the connection to linear algebra although I won’t get to diagonalization for a while yet
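In the meantime, a minimal sketch of that stable-state idea (with a made-up 2x2 bull/bear transition matrix): repeatedly applying the transition matrix to any starting distribution converges to the stationary distribution, even without diagonalizing:

```python
# Steady-state sketch: iterate the transition matrix until the
# distribution stops changing. The numbers below are made up.
P = [[0.9, 0.1],   # P[i][j] = P(next state j | current state i)
     [0.5, 0.5]]   # state 0 = "bull", state 1 = "bear"

def step_dist(dist, P):
    """One step of the chain: new_j = sum_i dist_i * P[i][j]."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

dist = [1.0, 0.0]          # start fully in "bull"
for _ in range(100):       # iterate until numerically stable
    dist = step_dist(dist, P)

print(dist)
```

For these made-up numbers the distribution settles at [5/6, 1/6], which is exactly the eigenvector of P (with eigenvalue 1) that diagonalization would hand you directly.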
Excellent
Hey @Dr. Trefor Bazett, is this the basis for Reinforcement Learning!!
Awesome thanks
So basically we can consider Markov chains as studying dependent probabilities?
Also, how is the bear and bull example Markovian? As you mentioned, using an old data set (past info) is a non-Markovian process.
Thanks for the video. It was mentioned that Markov is only based on the "present" state, however the transition probabilities themselves are based on historical data, right? Just trying to get my head around that distinction.
Crown of mathematics🥰🥰💝💝♥️♥️
haha, thank you Shah!
Thanks!
You are so amazing Dr. Trefor, making all this heavy content accessible to everyone. I give my best wishes for the channel to grow exponentially. 🎊😇 I am doing my best to promote this content to everyone!!
Thanks for your help Dewanik, really appreciate it!
Thank you Dr very well explained 👌
Do you have any videos about the Poisson process and the exponential distribution?
Thank you! Yes I do plan to do a stats series at some point, but not just yet sorry:)
better than my teacher
But for that example of the stock market you have considered past data; does that not make it an example of a non-Markov method?
what would be the differences between this and a finite state machine and its states?
A superhero who can't fly but has a cape 😂
It's there to keep dry .. ? 🤨
A clear explanation. Please, how can I contact you? I have some questions. Is there an email?
THANK YOU
So what kind of playlist would this Markov Chain belong to ?
Right now it’s in my discrete math playlist, but anything with probability or stats could talk about this.
Please make videos on vector calculus, i.e. the calculus of vector fields. Please sir. There are no sources on UA-cam about this topic. Sir, we don't know of any Android software that can help us plot vector fields.
It's coming, starting in about 2 weeks!
@@DrTrefor Would you recommend me software on Android that can plot vector fields?
I would try navigating to wolframalpha in a browser first, I think, but I'm sure there are others
Best
In the final step, why did you multiply the probabilities of each branch and not add them?
I would like to know as well. I don't know the logic behind it.
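One way to see it: along a single path through the tree, each step is conditioned on the state before it, so by the chain rule the probabilities along that path multiply; separate paths that end in the same state are then added. A minimal sketch with made-up bull/bear numbers:

```python
# Chain rule along one path: P(A then B then C) = P(A) * P(B|A) * P(C|B).
# The transition probabilities below are invented for illustration.
p_bull_bull = 0.9   # P(bull tomorrow | bull today)
p_bull_bear = 0.1   # P(bear tomorrow | bull today)
p_bear_bull = 0.5   # P(bull tomorrow | bear today)
p_bear_bear = 0.5   # P(bear tomorrow | bear today)

# Starting in "bull": probability of being in "bull" two days from now.
path1 = p_bull_bull * p_bull_bull   # bull -> bull -> bull (multiply along path)
path2 = p_bull_bear * p_bear_bull   # bull -> bear -> bull (multiply along path)
print(path1 + path2)                # ADD the two paths that both end in "bull"
```

So multiplication happens within a branch, and addition happens across branches that reach the same final state (here the total comes to about 0.86).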
Which came first, the Markov Chain or the DFA/NFA?
I desperately need to see a movie with the character he described: A superhero with an hour memory span😂
mark OV chain
This is counter-intuitive, the notion that experience has no value. Thanks.
shame about the audio
I dont think thats a cool superhero