I never would have imagined that something so powerful could be implemented in so little code, or explained so succinctly. Fantastic video.
Thanks a lot!
If the code is short, it's because there is a lot of abstraction. Just imagine if you had to rewrite all the built-in functions...
Giving a hands-on practical example to complement the theory was the icing on the cake. Thank you so much!
This Markov chain series gets fewer and fewer views as it continues, but the like-to-dislike ratio holds up. Hope I can make it to the last one. Thumbs up!!
Update after watching: I initially set out to learn Markov chains because my work simulating thermodynamics needs this knowledge, but after watching this episode, I also learned how a Markov chain model is built for natural language processing. Given how hot this topic is right now, thank you, Sir, for your work!!
The final random generations are really terrific! This one is quite Zen and fitting, given it was itself generated by a semi-random function: "I would call your cipher was not difficult for me to point that way Holmes shook his head." Great video! You explain these complicated things at just the right pace to follow along without missing anything important in the summary or simplification. Thanks!
Glad you enjoyed it :D
I'm not certain about this, but you might be able to simplify your dictionary logic by using a Counter data type instead of manually checking for existing keys and incrementing them. Excellent video!
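For what it's worth, a minimal sketch of that idea, pairing defaultdict with Counter (the function name and bigram states follow the video; the cleaning step is omitted):

from collections import Counter, defaultdict

def make_markov_model(text, n_gram=2):
    words = text.split()
    # defaultdict(Counter) creates missing entries on first access,
    # so no explicit "if key in dict" checks or manual increments.
    counts = defaultdict(Counter)
    for i in range(len(words) - 2 * n_gram + 1):
        curr_state = " ".join(words[i:i + n_gram])
        next_state = " ".join(words[i + n_gram:i + 2 * n_gram])
        counts[curr_state][next_state] += 1
    # Normalize the raw counts into transition probabilities.
    return {state: {nxt: c / sum(ctr.values()) for nxt, c in ctr.items()}
            for state, ctr in counts.items()}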
The most amazing thing about this idea is that it is really simple and effective!
Everything is really well put in this video! I don't need any pencil, paper, keyboard or whatever to learn a useful example of a Markov model. Nice work.
Amazing! Now I understand the basics. I hope that when I try to implement this myself, I will gain a full understanding of what is going on.
These Markov chain videos are amazing. Thanks mate.
Please do a video on the forward-backward algorithm, Viterbi, and Baum-Welch... you explain things very well. Thanks in advance. You are a great teacher, honestly.
Nice suggestion. I'll keep that in mind!
You definitely have a unique and arresting style of representation! Very effective!
Glad you think so!
These videos are so amazing and easy to understand.
3 years later and this video is still amazing! Thank you
this channel will definitely grow soon
Let's hope so :D
Man, this is SO interesting. I had to prepare for an interview in ML and you saved my life, thank you!!!!
But LMAO, I wasn't ready for the "I ejaculated" part.
for real, this model's down with maximally bad output XD
This was really amazing!! Keeping making more of these math based videos!
That's the plan!
Notes for my future revision:
10:25 "Depending on the starting state, the model will generate a new story"
1. Get the data (text)
2. Create the states (bigrams); count the next states and the probability of each next state
3. Create the Markov chain model (MCM)
4. Pick an initial state (bigram) and generate a sequence of the next N states (bigrams) to form a sentence
It's a probabilistic model, so "Depending on the starting state, the model will generate a new sentence".
In fact, even with the same starting state, the model will generate a new sentence each time it "predicts" one (see the sketch below).
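A minimal sketch of steps 3-4, assuming the model is a dict of dicts mapping each state to its next-state probabilities, as built in the video:

import random

def generate_story(markov_model, start, limit=20):
    # Walk the chain from the starting bigram, sampling each next
    # state in proportion to its transition probability.
    curr_state, story = start, start + " "
    for _ in range(limit):
        options = markov_model[curr_state]
        next_state = random.choices(list(options.keys()),
                                    weights=list(options.values()))[0]
        story += next_state + " "
        curr_state = next_state
    return story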
Great presentation!
One modification is needed to increase the number of words in a state to more than 2, as mentioned at 6:50.
Line 3 of the "make_markov_model" function should be replaced with:
for i in range(len(words)-2*n_gram+1):
otherwise a list index out of range error will occur when n_gram >= 3.
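To see why that bound is right, a quick toy check (the 9-word sentence is made up): each step consumes n_gram words for the current state plus n_gram more for the next state, so the largest valid start index is len(words) - 2*n_gram.

words = "the game is afoot my dear watson said holmes".split()  # 9 words
n_gram = 3
# Needs i + 2*n_gram <= len(words), i.e. i <= 9 - 6 = 3 -> range(4).
for i in range(len(words) - 2 * n_gram + 1):
    curr_state = " ".join(words[i:i + n_gram])
    next_state = " ".join(words[i + n_gram:i + 2 * n_gram])
    print(curr_state, "->", next_state)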
Thank you for making these videos and saving lives
That was very interesting. I have used NLP before to automate product categorization, but this is a whole new level.
On the same Markov chain topic... can you explain it using rainfall data (with 2 states, dry and wet) to generate future rainfall?
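Not a full video, but a minimal sketch of that two-state chain; the 0.8/0.6 transition probabilities are made-up placeholders, not real rainfall statistics:

import random

# Hypothetical transition probabilities P(tomorrow | today).
transitions = {
    "dry": {"dry": 0.8, "wet": 0.2},
    "wet": {"dry": 0.4, "wet": 0.6},
}

def simulate_rainfall(start="dry", days=14):
    state, sequence = start, [start]
    for _ in range(days - 1):
        probs = transitions[state]
        state = random.choices(list(probs.keys()),
                               weights=list(probs.values()))[0]
        sequence.append(state)
    return sequence

print(simulate_rainfall())  # e.g. ['dry', 'dry', 'wet', 'wet', ...]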
These are very good tutorials!
Thanks a lot!
Excellent explanation! I wonder if it would be viable to count certain punctuation symbols (like "." and ",") as 'words' in the model, such that the "game is" state might yield something like "afoot." as one possible transition? That way the model could generate discrete sentences and clauses.
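A minimal sketch of that idea: simply don't strip periods and commas in the cleaning step, so "afoot." and "afoot" become distinct tokens (the regex is an assumption, not the video's exact cleaning code):

import re

def clean_keep_punct(text):
    # Lowercase and strip everything except letters, digits, whitespace,
    # periods, and commas, so punctuation stays attached to its word.
    return re.sub(r"[^a-z0-9.,\s]", "", text.lower()).split()

print(clean_keep_punct("The game is afoot. Come, Watson!"))
# ['the', 'game', 'is', 'afoot.', 'come,', 'watson']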
Great explanation, bro... Super cool content... thanks for sharing.
Glad you liked it
Yo! This is beautiful, thank you.
Awesome videos. You make things easy that should be easy!
Glad you like them!
Thank you for sharing the code!
amazing work, appreciate you
The videos are very good, thanks. Maybe consider Markov model and HMM videos if possible; it would be personally helpful :)
Guess what...it's already there! HMM: ua-cam.com/video/RWkHJnFj5rY/v-deo.html
wow, it's truly amazing, thank you so much
Did the "story_path" changed ?
I try to play around but it return "number of lines = 0" when I run the 2nd cell.
Any open-source code for GANs generating text? Also for GANs classifying text?
This was an amazing video. Thank you so much!
A question: If we increased the n_gram value for the model, would the quality of the stories also improve significantly?
I think yes, but only to a certain point.
This is amazing, thanks for sharing
truly amazing 🥰🥰
great videos. Excellent job.
Wow this is so amazing!!!
11:42 interesting stuff 😂
This stuff is fascinating
Yeah it is!
It's pretty cool. I hope to understand the coding better.
Just follow the structure of the dictionaries...
Thanks a lot for making this so easy.
That was really interesting
excellent
Elementary, my dear Mr Nerd.
Amazing. Keep it up. 🔥🖤
Thanks 🔥
Wow. awesome.
amazing
Were you on Neso Academy too? Your voice seems familiar.
Nice work
Thank you!
Thank you, Tutor from future
Haha...Thanks!
I'm getting:
line 83, in generate_story
next_state = random.choices(list(markov_model[curr_state].keys()),
KeyError: 'dear holmes'
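For anyone else hitting this: the KeyError usually means the current state has no entry in the model, e.g. a bigram whose only occurrence is at the very end of the text, so it has no observed outgoing transitions. A toy reproduction with a guard that avoids the crash (a sketch, not the video's exact code):

import random

model = {"my dear": {"holmes i": 1.0}}  # toy model; "holmes i" is a dead end

curr_state, story = "my dear", "my dear "
for _ in range(5):
    if curr_state not in model:  # without this guard, model[curr_state]
        break                    # raises KeyError: 'holmes i'
    options = model[curr_state]
    next_state = random.choices(list(options.keys()),
                                weights=list(options.values()))[0]
    story += next_state + " "
    curr_state = next_state
print(story)  # my dear holmes i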
Imagine English not being your first language and having to figure out why your model keeps saying “I ejaculated”
Fellow Human,
The subtitles for this video are currently not available in English D:
The previous videos in this series had them.
I do not know how to help in this matter.
Well, thanks for the previous videos.
"Fellow Human"
talk like an alien
Wouldn't you want to include periods and commas? Or would that make it too complicated for this example?
Nice point. You could probably include periods, but commas would be too complicated, I feel. Let me know how the results look if you include them.
I'm actually a person from the future
This was uploaded two years ago
Why are the automatically added subtitles in Vietnamese ~~~~
You can always turn off the closed captions.
Shouldn't you include the punctuation as part of the word? E.g., "found." would be different from "found".
That's an interesting idea. I encourage you to try and share the results.
Why is the default subtitle language Vietnamese? Can you please change it to English, NormNerd?
where the code at brother
markovbajs
People from the future ... Here😂
🍻🍻
nah this is a toy ai.
Hi man, I tried your code and I always get a KeyError on "random.choices". I tried the code on Colab, Jupyter Notebook, VS Code, and your Kaggle notebook, and it only works on that last one. Have you tried running your code in other notebooks?
same here
Really don't understand your explanations
Very nice project! I played with the code and changed it a little, e.g. n_gram=3; the stories seemed more readable.
If you run this locally, import nltk; nltk.download('all') solves the lookup error from word_tokenize.
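If downloading everything is too heavy, grabbing just the tokenizer data is usually enough for word_tokenize (newer NLTK versions may ask for 'punkt_tab' instead of 'punkt'):

import nltk
nltk.download('punkt')  # just the tokenizer data, instead of 'all'
from nltk.tokenize import word_tokenize
print(word_tokenize("The game is afoot."))  # ['The', 'game', 'is', 'afoot', '.']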
Yeah.
Good work
Thank you so much 😀