Mod-01 Lec-38 Hidden Markov Model

  • Published 3 Oct 2024

COMMENTS • 44

  • @akino.3192 • 6 years ago

    Well done Professor Biswas for breaking down a fairly complicated concept into bite-sized chunks.

  • @hariprasadyalla • 6 years ago +3

    Excellent video. This is my first peek into HMM.

  • @12sandy345 • 7 years ago +2

    Brilliant Lecture. Blown away by the meticulousness. Thank you, Sir.

    • @nikhilbalwani5556 • 5 years ago

      Expect all IIT lectures to be like this one.

    • @nikhilbalwani5556 • 4 years ago

      @@nahid862 here you go: ua-cam.com/video/G-77ZgsGjvA/v-deo.html

  • @ghost70639 • 9 years ago +8

    Thanks a lot, Professor; it helps a lot. I think studying in India is a good idea.

    • @rteja764 • 7 years ago

      Only if you are interested in hard-core, theory-oriented subjects (like pure maths or theoretical physics).

    • @nikhiljahagirdar2724 • 5 years ago

      Too much competition bro.. Getting into a good university is almost impossible in India

    • @shashankesh • 4 years ago

      @@nikhiljahagirdar2724 depends on possibility you want to have

    • @anonymous9217w2 • 5 months ago

      @@nikhiljahagirdar2724 maybe you are dumb.

  • @Areeva2407 • 4 years ago +1

    BEST EXPLANATION ON THE ENTIRE WEB

  • @aruntakhur • 7 years ago +3

    Thanks Prof. This is superb explanation.

  • @bnglr • 4 years ago

    very chilling video to watch. thanks Professor

  • @thsusma5550 • 4 years ago

    Best lecture ever. Thank you Sir.

  • @karunakarallugunti2033 • 2 years ago

    Excellent explanation sir...

  • @sadafmirza559 • 6 years ago +1

    For the algorithm, he hasn't iterated through the values of 'j' at all. If we compute this algorithm for any state other than the final state, it will never return. So there should have been an inner loop within the for loop to iterate over all values of 'j'.
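The inner loop over j that this comment describes can be sketched as a minimal forward-algorithm implementation. This is an illustration of the standard recursion, not the professor's exact pseudocode; the transition matrix a, emission matrix b, and initial distribution pi are assumed inputs.

```python
import numpy as np

def forward(a, b, pi, obs):
    """Forward algorithm: alpha[t, i] = P(obs[0..t], state at t = i).

    a[i, j] : transition probability from state i to state j
    b[j, k] : probability that state j emits symbol k
    pi[i]   : initial state distribution
    obs     : list of observed symbol indices
    """
    n_states = a.shape[0]
    T = len(obs)
    alpha = np.zeros((T, n_states))
    alpha[0] = pi * b[:, obs[0]]              # initialisation
    for t in range(1, T):
        for i in range(n_states):
            # the inner sum over all predecessor states j
            # that the comment above points out:
            alpha[t, i] = b[i, obs[t]] * sum(
                alpha[t - 1, j] * a[j, i] for j in range(n_states)
            )
    return alpha[-1].sum()                    # P(whole observation sequence)
```

For a small two-state model this agrees with brute-force enumeration over all hidden-state paths, which is a quick way to sanity-check the recursion.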

  • @jofrasavi85 • 7 years ago +1

    thank you very much, very good explanation.

  • @PyMoondra • 5 years ago +1

    At 54 mins, near the end, when dealing with the forward algorithm, the professor doesn't iterate through all the j's. Shouldn't he be iterating through all the j within the middle states, since any of those states can emit our observation?
    Thank you.

  • @hariprasadyalla • 6 years ago +1

    At 50:33, to calculate the probability of being in invisible state j at time step t, i.e. alpha(j, t), the calculation should not multiply by b(j, k). Wouldn't that make the calculation of alpha(i, t+1) wrong? The probability of emitting the expected symbols at each step should be tracked separately.

    • @hariprasadyalla • 6 years ago +1

      No, I stand corrected. Thinking about it a little more deeply: alpha(j, t) is not just the probability of being in invisible state j. It is the probability of being in invisible state j and also producing the sequence of visible symbols up to time step t.
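In symbols, using the lecture's notation (a(i, j) for transitions, b(j, k) for emissions), the corrected reading of the forward variable is the standard recursion:

```latex
\alpha(j, t) \;=\; P\bigl(v_1, \dots, v_t,\ \text{state at } t = j\bigr)
\;=\; b(j, v_t)\,\sum_{i} \alpha(i, t-1)\, a(i, j)
```

The factor b(j, v_t) belongs in alpha(j, t) precisely because alpha is a joint probability over the state and the emitted symbols, which resolves the doubt raised above.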

  • @ahmedkowsher7657 • 4 years ago

    love this tutorial, thank you so much

  • @kabitadhara • 7 years ago

    Thanks a lot Prof Biswas.

  • @samujjaldas4104 • 7 years ago

    Too good explanation!

  • @imanbio • 9 years ago

    Great upload,
    Thank you very much

  • @d.s.parihar4792 • 7 years ago

    I have a seismic signal of 11,900 samples with a sampling rate of 1,000 samples per second. How do I apply this model to the signal, given that I have a complete understanding of HMM theory?

  • @Areeva2407 • 4 years ago

    VERY GOOD SIR

  • @vivekam101 • 5 years ago

    Excellent

  • @niharikaepuri3305 • 8 years ago +2

    What is an accepting state ? and why do we need it ?

    • @uditarpit • 7 years ago +2

      It is where the automaton (state machine) stops. It is not specific to HMMs, though it is a related concept, since an HMM is also a state machine.

    • @saiveeranki • 4 years ago

      Final state

  • @anupalone9886 • 6 years ago

    Sir, can you split this video into the decoding and learning problems separately?

  • @uditarpit • 2 years ago

    revising after 4 years. :)

  • @himanshukandwal5373 • 9 years ago

    excellent !

  • @reardelt • 9 years ago +1

    Mistake at 37:10: it should start from t=0, not t=1 as shown by the lecturer. If t=1, that is saying the starting node does not emit anything.

    • @anandsaha01 • 7 years ago

      There is no mistake. The state at t=0 does not emit any visible symbol. It transitions into a hidden state (only). So v(0)*w(0) doesn't make sense.

    • @sadafmirza559 • 6 years ago

      The visible symbols are emitted only after a transition, so the machine does transition to w(1) at t=0, but the first visible symbol v(1) is emitted only after that first transition, i.e. at t=1.
      At t=0, the machine hasn't made any transitions yet, so no visible symbols are emitted. In other words, the starting node will emit a symbol only when a transition is made to it from some hidden state (including the state itself).

  • @bhupalchidambar3940 • 6 years ago

    Brilliant

  • @shrutiv1471 • 4 years ago

    👌

  • @Dwig108 • 6 years ago

    Thanks

  • @ameymeher9880 • 5 years ago

    I had asked a doubt regarding the topic discussed; the question is posted at the link below:
    cs.stackexchange.com/questions/109368/prior-probability-in-hmm

    • @saikumarjoru5194 • 4 years ago

      I did not understand. Can you elaborate on the (0.5)^2 part?

  • @berkgur868 • 5 years ago

    Professor, may I kiss you, please?

  • @radhikamaheshwari1433 • 5 years ago

    First study it properly yourself.