Markov Chains: Generating Sherlock Holmes Stories | Part - 4

  • Published 16 Dec 2024

COMMENTS • 97

  • @darnell8897
    @darnell8897 4 years ago +108

    I never would have imagined that something so powerful could be implemented in so little code, or explained so succinctly. Fantastic video.

    • @NormalizedNerd
      @NormalizedNerd  4 years ago +4

      Thanks a lot!

    • @eliotduromuni419
      @eliotduromuni419 2 years ago +8

      If the code is short, it's because there is a lot of abstraction. Just imagine if you had to rewrite all the built-in functions...

  • @louis9116
    @louis9116 3 years ago +11

    Giving a hands-on practical example to complement the theory was just icing on the cake. Thank you so much

  • @jiangxu3895
    @jiangxu3895 8 months ago +3

    This Markov chain series gets fewer and fewer views as it goes on, but the like-to-dislike ratio holds up. Hope I can make it to the last one. Thumbs up!!
    Update after watching: I originally set out to learn Markov chains because my work on simulating thermodynamics needed them, but from this episode I also learned how a Markov chain model is built for natural language processing. Given how hot this topic is right now, thank you Sir for your work!!

  • @gameboardgames
    @gameboardgames 2 years ago +9

    The final random generations are really terrific! This one is quite Zen and appropriate to itself as being generated via a semi-random function: "I would call your cipher was not difficult for me to point that way Holmes shook his head.' Great video! You explain these complicated things at just the right pace: easy to follow along, without missing anything important in the summary or simplification. Thanks!

  • @JustinMasayda
    @JustinMasayda 2 years ago +9

    I'm not certain about this, but you might be able to simplify your dictionary logic by using a Counter data type instead of manually checking for existing keys and incrementing them. Excellent video!
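
    A minimal sketch of that Counter idea, assuming the model maps each state to counts of its next states (the names here are illustrative, not the video's):

      from collections import defaultdict, Counter

      # defaultdict(Counter) creates a fresh Counter the first time a state
      # is seen, so no "if key in dict" checks are needed.
      transition_counts = defaultdict(Counter)

      def add_transition(curr_state, next_state):
          transition_counts[curr_state][next_state] += 1

      def to_probabilities(counts):
          # Normalize raw counts into {state: {next_state: probability}}.
          model = {}
          for state, nexts in counts.items():
              total = sum(nexts.values())
              model[state] = {s: c / total for s, c in nexts.items()}
          return model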

  • @xenonchikmaxxx
    @xenonchikmaxxx 3 years ago +2

    The most amazing thing about this idea is that it is really simple and effective!

  • @wonseoklee80
    @wonseoklee80 2 years ago +2

    Everything is really well put in this video! I don't need a pencil, paper, keyboard or whatever to learn a useful example of a Markov model. Nice work.

  • @godekdominik2678
    @godekdominik2678 10 months ago +1

    Amazing! Now I understand the basics. I hope that when I try to implement this myself I will gain a full understanding of what is going on.

  • @utkusaglam4875
    @utkusaglam4875 2 years ago +1

    These Markov chain videos are amazing. Thanks mate.

  • @saeedmaisallah6113
    @saeedmaisallah6113 3 years ago +5

    Please do a video on the Forward-Backward algorithm, Viterbi, and Baum-Welch... you explain very well. Thanks in advance. You are a great teacher, honestly.

    • @NormalizedNerd
      @NormalizedNerd  3 years ago +2

      Nice suggestion. I'll keep that in mind!

  • @iglesiaszorro297
    @iglesiaszorro297 3 years ago +1

    You definitely have a unique and arresting style of representation! Very effective!

  • @ruchitkini6889
    @ruchitkini6889 2 years ago +1

    These videos are so amazing and easy to understand.

  • @theodoreshachtman9990
    @theodoreshachtman9990 8 months ago +1

    3 years later and this video is still amazing! Thank you

  • @angitd3766
    @angitd3766 3 years ago +2

    this channel will definitely grow soon

  • @Oromiss78
    @Oromiss78 3 years ago +9

    Man, this is SO interesting. I had to prepare for an ML interview and you saved my life, thank you!!!!
    But LMAO I wasn't ready for the "I ejaculated" part

    • @aleksszukovskis2074
      @aleksszukovskis2074 7 months ago

      for real, this model's down with maximally bad output XD

  • @shantanudash7
    @shantanudash7 3 years ago +2

    This was really amazing!! Keep making more of these math-based videos!

  • @karannchew2534
    @karannchew2534 2 years ago +1

    Notes for my future revision (see the sketch after these notes):
    10:25 "Depending on the starting state, the model will generate a new story"
    1. Get the data (text)
    2. Create the states (bigrams), count the next states, and compute the probability of each next state
    3. Create a Markov chain model
    4. Pick an initial state (bigram) and generate a sequence of the next N states (bigrams) to form a sentence
    It's a probabilistic model, so "depending on the starting state, the model will generate a new sentence".
    In fact, even with the same starting state, the model will generate a new sentence each time it "predicts" one.
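
    A minimal sketch of those four steps, assuming bigram states as in the video (the function names are illustrative; the video's own code differs in detail):

      import random

      def make_markov_model(words, n_gram=2):
          """Steps 2-3: build {state: {next_state: probability}} from a word list."""
          model = {}
          for i in range(len(words) - 2 * n_gram + 1):
              curr_state = " ".join(words[i : i + n_gram])
              next_state = " ".join(words[i + n_gram : i + 2 * n_gram])
              model.setdefault(curr_state, {}).setdefault(next_state, 0)
              model[curr_state][next_state] += 1
          for transitions in model.values():  # turn counts into probabilities
              total = sum(transitions.values())
              for next_state in transitions:
                  transitions[next_state] /= total
          return model

      def generate_story(model, start, limit=20):
          """Step 4: sample a chain of next states from an initial state."""
          curr_state, story = start, start + " "
          for _ in range(limit):
              next_state = random.choices(
                  list(model[curr_state].keys()),
                  weights=list(model[curr_state].values()),
              )[0]
              story += next_state + " "
              curr_state = next_state
          return story

      # Step 1 would be reading and tokenizing the corpus; a toy stand-in
      # (chosen to loop back on itself so the tiny chain never dead-ends):
      words = "the game is afoot my dear watson the game is afoot".split()
      model = make_markov_model(words)
      print(generate_story(model, "the game", limit=6))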

  • @Frazorr3363
    @Frazorr3363 1 year ago +1

    Great presentation!
    One modification is needed to allow more than 2 words per state, as mentioned at @6:50.
    Line 3 of the "make_markov_model" function should be replaced with:
    for i in range(len(words)-2*n_gram+1):
    otherwise a list-index-out-of-range error occurs when n_gram >= 3.
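
    The reasoning behind that bound, as a self-contained sketch: the next state reads n_gram words starting at index i + n_gram, so the last index touched is i + 2*n_gram - 1, which must stay inside the word list.

      words = "the game is afoot my dear watson".split()
      n_gram = 3

      # The current state spans words[i : i+n_gram]; its successor spans
      # words[i+n_gram : i+2*n_gram], whose last index is i + 2*n_gram - 1.
      # Requiring i + 2*n_gram - 1 <= len(words) - 1 gives
      # i <= len(words) - 2*n_gram, hence the corrected range below.
      # (The video's code indexes words one at a time, so the old bound
      # raises IndexError once n_gram >= 3; with slices it would instead
      # produce silently truncated states.)
      for i in range(len(words) - 2 * n_gram + 1):
          print(" ".join(words[i : i + n_gram]), "->",
                " ".join(words[i + n_gram : i + 2 * n_gram]))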

  • @lenaxdeng3202
    @lenaxdeng3202 1 year ago

    Thank you for making these videos and saving lives

  • @fadiibrahim1036
    @fadiibrahim1036 2 years ago

    That was very interesting. I've used NLP applications before to automate product categorization, but this is a whole new level.

  • @msagar91
    @msagar91 3 years ago +2

    On the same Markov chain topic... can you explain it using rainfall data (with dry and wet as the 2 states) to generate future rainfall?
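
    For what it's worth, the same machinery covers that dry/wet case; a minimal sketch with made-up transition probabilities (not fitted to any real rainfall data):

      import random

      # P(tomorrow | today) for a two-state {dry, wet} chain; the numbers
      # here are illustrative assumptions only.
      transitions = {
          "dry": {"dry": 0.8, "wet": 0.2},
          "wet": {"dry": 0.4, "wet": 0.6},
      }

      def simulate_weather(start="dry", days=10):
          state, sequence = start, [start]
          for _ in range(days - 1):
              state = random.choices(
                  list(transitions[state].keys()),
                  weights=list(transitions[state].values()),
              )[0]
              sequence.append(state)
          return sequence

      print(simulate_weather("dry", days=14))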

  • @popupexistence9253
    @popupexistence9253 4 years ago +3

    These are very good tutorials!

  • @justadude7455
    @justadude7455 6 months ago

    Excellent explanation! I wonder if it would be viable to count certain punctuation symbols (like "." and ",") as 'words' in the model, such that the "game is" state might yield something like "afoot." as one possible transition? That way the model could generate discrete sentences and clauses.
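
    One way to try that is to tokenize so sentence punctuation survives as its own token; a sketch using only the standard library (this regex is an assumption, not the video's preprocessing):

      import re

      def tokenize_keeping_punct(text):
          # Each word is one token; '.' and ',' become standalone tokens,
          # so the model can learn transitions into end-of-clause marks.
          return re.findall(r"[a-z']+|[.,]", text.lower())

      print(tokenize_keeping_punct("The game is afoot, Watson. Come at once."))
      # ['the', 'game', 'is', 'afoot', ',', 'watson', '.', 'come', 'at', 'once', '.']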

  • @rahultripathi2024
    @rahultripathi2024 3 years ago +1

    Great explanation bro.. Super cool content.. thnx for sharing

  • @Paranorman2049
    @Paranorman2049 4 months ago

    Yo! This is beautiful, thank you.

  • @eventhisidistaken
    @eventhisidistaken 3 years ago

    Awesome videos. You make things easy that should be easy!

  • @ThatNiceDutchGuy
    @ThatNiceDutchGuy 2 years ago

    Thank you for sharing the code!

  • @manishkumartailor9389
    @manishkumartailor9389 7 months ago

    amazing work, appreciate you

  • @rcrimi08
    @rcrimi08 3 years ago +2

    The videos are very good, thanks. Maybe consider Markov model and HMM videos if possible; it would be personally helpful :)

    • @NormalizedNerd
      @NormalizedNerd  3 years ago +2

      Guess what...it's already there! HMM: ua-cam.com/video/RWkHJnFj5rY/v-deo.html

  • @mhhmm__
    @mhhmm__ 1 year ago

    wow, it's truly amazing, thank you so much

  • @alif_ni
    @alif_ni 1 year ago

    Did the "story_path" change?
    I tried to play around with it, but it returns "number of lines = 0" when I run the 2nd cell.

  • @pr0fess0rop18
    @pr0fess0rop18 10 months ago

    Is there any open-source code for GANs generating text? And for GANs classifying text as well?

  • @mozilla012
    @mozilla012 3 months ago

    This was an amazing video. Thank you so much!
    A question: If we increased the n_gram value for the model, would the quality of the stories also improve significantly?

    • @NormalizedNerd
      @NormalizedNerd  3 months ago +1

      I think yes, but only to a certain point.

  • @Sebastian-ok5wr
    @Sebastian-ok5wr 2 years ago

    This is amazing, thanks for sharing

  • @SaurabhKumar-zu4wm
    @SaurabhKumar-zu4wm 1 year ago

    truly amazing 🥰🥰

  • @rezNezami
    @rezNezami 1 year ago

    great videos. Excellent job.

  • @djgyanzz
    @djgyanzz 2 years ago

    Wow this is so amazing!!!

  • @sunnychaturvedi9958
    @sunnychaturvedi9958 2 years ago

    11:42 interesting stuff 😂

  • @AndrewSego
    @AndrewSego 4 years ago

    This stuff is fascinating

  • @johnmandrake8829
    @johnmandrake8829 4 years ago

    It's pretty cool; I hope to understand the coding better.

    • @NormalizedNerd
      @NormalizedNerd  4 years ago

      Just follow the structure of the dictionaries...

  • @pawangunjan3535
    @pawangunjan3535 1 year ago

    Thanks a lot for making this so easy.

  • @vinaybhujbal4347
    @vinaybhujbal4347 1 year ago

    That was really interesting

  • @yxu2973
    @yxu2973 1 year ago

    excellent

  • @gameboardgames
    @gameboardgames 2 years ago

    Elementary, my dear Mr Nerd.

  • @jyoti9426
    @jyoti9426 4 years ago

    Amazing. Keep it up. 🔥🖤

  • @sedembuabassah1598
    @sedembuabassah1598 4 months ago

    Wow. awesome.

  • @shivkrishnajaiswal8394
    @shivkrishnajaiswal8394 2 years ago

    amazing

  • @NehaKariya-d1f
    @NehaKariya-d1f 2 months ago

    Were you on Neso Academy too? Your voice seems familiar.

  • @vaishaknarayanan7321
    @vaishaknarayanan7321 4 years ago

    Nice work

  • @arkadipbasu2348
    @arkadipbasu2348 3 years ago

    Thank you, Tutor from future

  • @WisdomShortvids
    @WisdomShortvids 1 year ago

    I'm getting:
    line 83, in generate_story
    next_state = random.choices(list(markov_model[curr_state].keys()),
    KeyError: 'dear holmes'
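
    That KeyError typically means the walk reached a state that never occurs as a key in the model, e.g. a bigram that only appears at the very end of the corpus. A sketch of one possible guard, assuming the {state: {next_state: probability}} dict shape used in the video:

      import random

      def generate_story(markov_model, start, limit=20):
          curr_state, story = start, start + " "
          for _ in range(limit):
              # A state with no recorded transitions ends the walk cleanly
              # instead of raising KeyError.
              if curr_state not in markov_model:
                  break
              next_state = random.choices(
                  list(markov_model[curr_state].keys()),
                  weights=list(markov_model[curr_state].values()),
              )[0]
              story += next_state + " "
              curr_state = next_state
          return story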

  • @ytjoemoore94
    @ytjoemoore94 1 month ago

    Imagine English not being your first language and having to figure out why your model keeps saying “I ejaculated”

  • @onenhere6458
    @onenhere6458 2 years ago

    Fellow Human,
    The subtitles for this video are currently not available in English D:
    The videos in this series so far had 'em.
    I do not know how to help in this matter.
    Well, thanks for the previous videos.

  • @SJ23982398
    @SJ23982398 3 years ago

    Wouldn't you want to include periods and commas? Or would that make it too complicated for this example?

    • @NormalizedNerd
      @NormalizedNerd  3 years ago +1

      Nice point. You could probably include periods, but commas would be too complicated, I feel. Let me know how the results turn out if you include them.

  • @easylemon6640
    @easylemon6640 1 year ago

    I'm actually a person from the future
    This was uploaded two years ago

  • @user-kf9lv6cg5b
    @user-kf9lv6cg5b 3 years ago

    Why are the automatically added subtitles in Vietnamese ~~~~

    • @NormalizedNerd
      @NormalizedNerd  3 years ago

      You can always turn off the closed captions.

  • @raresaturn
    @raresaturn 3 years ago

    Shouldn't you include the punctuation as part of the word? E.g. "found." would be different from "found".

    • @NormalizedNerd
      @NormalizedNerd  3 years ago

      That's an interesting idea. I encourage you to try and share the results.

  • @唐伟祚-j4v
    @唐伟祚-j4v 1 year ago

    Why is the default subtitle language Vietnamese? Can you please change it to English, NormNerd?

  • @Boxing_fiend
    @Boxing_fiend 1 year ago

    where the code at brother

  • @azztereris
    @azztereris 2 years ago

    markovbajs

  • @PsynideNeel
    @PsynideNeel 4 years ago

    People from the future ... Here😂

  • @weekipi5813
    @weekipi5813 4 months ago

    nah this is a toy ai.

  • @Inqwisitor
    @Inqwisitor 3 years ago +1

    Hi man, I tried your code and I get an error on "random.choices", always a KeyError. I tried the code on Colab, Jupyter notebook, VS Code and your Kaggle notebook, and it only works on that last one. Have you tried running your code in other notebooks?

  • @kshitijshekhar1144
    @kshitijshekhar1144 1 year ago

    Really don't understand your explanations

  • @hedyhi-in1xu
    @hedyhi-in1xu 1 year ago

    Very nice project! I played with the code and changed it a little, e.g. "n_gram=3", and the stories seemed more readable.

  • @davidfamilian139
    @davidfamilian139 2 years ago +1

    If you run this locally, "import nltk; nltk.download('all')" solves the LookupError from word_tokenize.
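
    If downloading everything feels heavy, a smaller sketch that fetches only the tokenizer data word_tokenize actually looks up ('punkt' is the relevant resource as far as I know; newer NLTK releases may also ask for 'punkt_tab'):

      import nltk

      nltk.download("punkt")        # tokenizer models only, instead of 'all'
      # nltk.download("punkt_tab")  # may also be needed on newer NLTK versions

      from nltk.tokenize import word_tokenize
      print(word_tokenize("The game is afoot."))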

  • @heisenberg_3995
    @heisenberg_3995 4 years ago +1

    Good work