Markov Chains and Text Generation

  • Published Dec 16, 2024

COMMENTS • 33

  • @sheikhakbar2067 · 4 years ago · +2

    This guy is a genius, combining theory with application. Worth subscribing to.

  • @JakobRobert00 · 7 years ago · +8

    Really nice explanation, great video!
    What I would like to add to 10:18 is that using longer word sequences also has disadvantages. The longer the sequences, the less likely they are to appear in the training text several times followed by different words. A word sequence that occurs only rarely ends up assigning 100% probability to one specific following word, so in the generated text that sequence is always followed by the same word. The longer the word sequences are, the more sense the generated text makes, but it also loses its variability, and in the extreme it will simply reproduce the training text exactly, as the sketch below illustrates.
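
    A minimal sketch of this effect in Go (my own illustration, not code from the video): for each context length k it counts how many k-word contexts in a tiny training text are followed by only one distinct word. As k grows, the share of such fully deterministic contexts rises, which is exactly the loss of variability described above.

        package main

        import (
            "fmt"
            "strings"
        )

        func main() {
            text := "the cat sat on the mat and the cat ran to the mat"
            words := strings.Fields(text)

            for k := 1; k <= 3; k++ {
                // Map each k-word context to the set of distinct words
                // observed immediately after it in the training text.
                successors := make(map[string]map[string]bool)
                for i := 0; i+k < len(words); i++ {
                    ctx := strings.Join(words[i:i+k], " ")
                    if successors[ctx] == nil {
                        successors[ctx] = make(map[string]bool)
                    }
                    successors[ctx][words[i+k]] = true
                }

                // A context with a single successor is fully deterministic.
                deterministic := 0
                for _, next := range successors {
                    if len(next) == 1 {
                        deterministic++
                    }
                }
                fmt.Printf("order %d: %d of %d contexts are deterministic\n",
                    k, deterministic, len(successors))
            }
        }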

  • @Djangokillen · 8 years ago · +11

    Amazing video and very well explained. I've been interested in Markov chains for a while now, and this was really good!

  • @mubashiriqbal07 · 2 years ago

    Wow, nice explanation. Thank you, I really learned about Markov chains from this video.

  • @codexhammered007 · 6 years ago

    You are so cool, mate. Great work. The world needs more creative people like you.

  • @vatsal_gamit · 4 years ago

    Really amazing and very well explained!! Cheers from India

  • @IamGretar · 8 years ago · +4

    Very smart! Keep the videos coming, I enjoy them a lot. Very interesting stuff!

  • @yeahorightbro · 8 years ago

    Loving the direction your videos are taking mate. Good job!

  • @giantneuralnetwork · 8 years ago · +1

    Cool explanation! Really appreciated the graphics that went along with it. An interesting addition to Markov chains comes from including 'actions' to get Markov decision processes, where performing an action changes the probability of going from one state to the next. Throw in some rewards for state transitions and you've nearly hit reinforcement learning territory! (See the toy sketch below.)
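
    A toy sketch of that idea in Go (purely illustrative; the states, actions, and numbers here are made up): in a Markov decision process the transition distribution depends on both the current state and a chosen action, and each transition carries a reward.

        package main

        import (
            "fmt"
            "math/rand"
        )

        // outcome is one possible result of taking an action in a state.
        type outcome struct {
            next   string
            prob   float64
            reward float64
        }

        // step samples a next state and reward for a (state, action) pair
        // from the MDP's transition table.
        func step(mdp map[string]map[string][]outcome, state, action string) (string, float64) {
            r := rand.Float64()
            outs := mdp[state][action]
            for _, o := range outs {
                if r < o.prob {
                    return o.next, o.reward
                }
                r -= o.prob
            }
            // Guard against floating-point rounding: fall back to the last outcome.
            last := outs[len(outs)-1]
            return last.next, last.reward
        }

        func main() {
            // In a plain Markov chain, "start" would have a single outgoing
            // distribution; here each action selects a different one.
            mdp := map[string]map[string][]outcome{
                "start": {
                    "risky": {{next: "win", prob: 0.3, reward: 10}, {next: "lose", prob: 0.7, reward: -1}},
                    "safe":  {{next: "win", prob: 0.9, reward: 1}, {next: "lose", prob: 0.1, reward: 0}},
                },
            }
            state, reward := step(mdp, "start", "risky")
            fmt.Println("moved to", state, "with reward", reward)
        }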

  • @Hazit90 · 7 years ago · +3

    Nice little ending

  • @sheikhakbar2067 · 4 years ago · +1

    I am wondering if you teach an online course related to this topic on Udemy or Coursera? I truly like your way of explaining things!

  • @noelofficiallyoffice8580 · 3 years ago · +1

    That's pretty cool! Is there a way this could work for musical patterns as well? How would you integrate the music into the program?

  • @tomdecock6027 · 8 years ago · +3

    Great video, as always!

  • @MengSunpure · 7 years ago · +1

    For the last example with pen strokes: if you had made it non-Markov with an LSTM, you could maybe generate really convincing characters. Very fun idea!!

    • @macheads101 · 7 years ago

      Alex Graves did this in a really interesting paper. The results are cool! arxiv.org/abs/1308.0850 (see page 25).

  • @lucav4045 · 6 years ago

    This is actually amazing! Take my like & subscription!

  • @vardhan254 · 8 months ago

    Great video!

  • @donellessame9194 · 1 year ago

    How do you generate the first word? Isn't there an initial distribution?!
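
    One common answer, sketched in Go below as an illustration rather than as the video's actual method: treat the words that begin sentences in the training text as the initial distribution, and sample the first word from it.

        package main

        import (
            "fmt"
            "math/rand"
            "strings"
        )

        func main() {
            sentences := []string{
                "the cat sat on the mat",
                "a dog chased the cat",
                "the mat was red",
            }

            // Collect the first word of every training sentence; picking one
            // uniformly at random from this multiset is the same as sampling
            // from the empirical initial distribution.
            var starters []string
            for _, s := range sentences {
                starters = append(starters, strings.Fields(s)[0])
            }

            fmt.Println("first word:", starters[rand.Intn(len(starters))])
        }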

  • @ganaraminukshuk0 · 6 years ago · +1

    Hold on, let me parse that phrase...
    "Pitchers can also refer to translation to machine code" => Pitchers (the vessels) or pitchers (the plants) are another way of saying that we're compiling source (assembly language, probably) into the machine code that goes into an executable file. In short, pitcher plants can also be used for compiling source code.
    "For correctly rounded when converted to Islam in the material" => But if you want proper rounding (like with floating-point numbers), the source code needs to be converted to Islam first, not MIDI or PDF. (Refer to the meme where searching "how do i convert to" leads to Christianity, Islam, and PDF as the next word in the sequence; I hear predictive text is based on some sort of Markov chain, so that's interesting.)
    If you want to pitcher your code so that it works best with floating point numbers, you need to convert your source files to Islam first.

  • @houkensjtu · 8 years ago · +2

    Would you consider doing some Go tutorials? Go is rather new, and there are still not many learning materials out there. It would also be great for people who want to understand your code.

    • @macheads101 · 8 years ago

      This is a great idea. Since I have been doing all my code in Go, I think it would be fitting for me to make some tutorials on it. :)

    • @Ti-wy1fg · 8 years ago

      What's your reason for choosing Go, especially considering you're using it for machine learning?

    • @macheads101 · 8 years ago

      For the past few years I have used Go and JavaScript for virtually everything. Thus, I wrote this ML stuff in Go so that, if I made some useful machine learning APIs, I could easily use them natively from other Go programs. I can also use my Go stuff on the web using GopherJS, which is a fairly reliable and fast way to compile Go to JavaScript.

    • @houkensjtu · 8 years ago

      For the time being, it may still be a matter of personal taste (especially when talking about building programs at a rather small scale).
      But that's also why I hope macheads101 could do it: still very few programmers would code these things in Go, but Alex did. It should be interesting to see why he likes Go and how he coded everything in Go.

  • @javiersospedra9639 · 5 years ago

    Wow, incredible! One question: how do you get all this Wikipedia stuff for the computer to learn from? Thank you!

    • @williammrs · 5 years ago

      I suppose downloading the HTML and using a script to strip out the tags, or using a bot to data-mine the sites. I guess there's a tool for it somewhere online. (Rough sketch below.)
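
      A rough sketch of that approach in Go (the URL and the crude tag-stripping regex are just for illustration; a real scraper should use a proper HTML parser): download a page's HTML and strip the tags to leave plain text for training.

        package main

        import (
            "fmt"
            "io"
            "net/http"
            "regexp"
        )

        func main() {
            // Fetch one page's HTML (illustrative URL).
            resp, err := http.Get("https://en.wikipedia.org/wiki/Markov_chain")
            if err != nil {
                panic(err)
            }
            defer resp.Body.Close()

            page, err := io.ReadAll(resp.Body)
            if err != nil {
                panic(err)
            }

            // Crude tag stripping; fine for a sketch, fragile in general.
            tags := regexp.MustCompile(`<[^>]*>`)
            text := tags.ReplaceAllString(string(page), " ")
            fmt.Printf("extracted %d characters of text\n", len(text))
        }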

  • @lonnybulldozer8426 · 3 years ago

    You still active?

  • @Schmuck · 7 years ago

    11:25 You say you "trained" a Markov chain. I'm not being critical, just asking for clarification: do you mean you just built a Markov chain? When I hear the word "train", I think of backprop and gradients.

    • @MengSunpure · 7 years ago · +1

      Training is used in a very loose sense. Any time a model with parameters gets those parameters set to values based on some dataset, it is training, loosely speaking. Making an n-gram model is also "training" (see the sketch below).
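
      A minimal sketch of that loose sense of "training" in Go (my illustration, not the video's code): the model's parameters are just transition counts tallied from the dataset and normalized into probabilities, with no gradients involved.

        package main

        import (
            "fmt"
            "strings"
        )

        func main() {
            text := "the cat sat on the mat the cat ran"
            words := strings.Fields(text)

            // "Training": count how often each word follows each other word.
            counts := make(map[string]map[string]int)
            totals := make(map[string]int)
            for i := 0; i+1 < len(words); i++ {
                cur, next := words[i], words[i+1]
                if counts[cur] == nil {
                    counts[cur] = make(map[string]int)
                }
                counts[cur][next]++
                totals[cur]++
            }

            // The resulting parameters: P(next | cur) as normalized counts.
            for next, c := range counts["the"] {
                fmt.Printf("P(%q | \"the\") = %.2f\n", next, float64(c)/float64(totals["the"]))
            }
        }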

  • @DragonGiveMeFire · 2 years ago

    I left the 500th like!

  • @emmadzahid1555 · 7 years ago

    +1 subscriber

  • @garryyegor9008 · 6 years ago

    converted to islam