This is incredible. First time seeing Markov Chains, and I can't believe I never took the time to check it out. Plugging in the Poisson blew my mind.
The lecture is marvelous. Thanks, Professor John Tsitsiklis.
Thanks. The MIT website provides PDFs of the prepared slides, which can be downloaded, so a student can view a clear PDF side by side with the video on their computer. The state ID numbers are absent when I view the PDFs for this lecture, although Professor John Tsitsiklis does refer to them. Some PDF readers, including Adobe products, can annotate the downloaded PDF with text boxes, so you can add the IDs to your own copy of the document. That annotation can make the lecture easier to follow.
I do think that calculating the eigenvectors of the transition matrix is an easier way to find the steady-state space and the rate at which convergence happens. Definitely worth examining as a comprehension supplement. Basically, the space spanned by the eigenvectors with eigenvalue 1 is the steady-state space, and the magnitude of the largest eigenvalue other than 1 governs the rate of convergence: the closer it is to 1, the slower the chain converges.
I agree
It's not easy when the transition matrix is large, but maybe it is easier for a computer.
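Since the computer angle came up: here is a minimal sketch (my own, not from the lecture, with a made-up 3-state transition matrix) of getting the steady-state distribution and the convergence-governing eigenvalue out of numpy.

import numpy as np

# Hypothetical transition matrix; row i holds P(next state | current state i), rows sum to 1.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.1, 0.4, 0.5]])

# The steady-state row vector pi satisfies pi P = pi, so it is a left eigenvector of P
# with eigenvalue 1, i.e. a right eigenvector of P transpose.
eigvals, eigvecs = np.linalg.eig(P.T)
idx = np.argmin(np.abs(eigvals - 1.0))    # eigenvalue closest to 1
pi = np.real(eigvecs[:, idx])
pi = pi / pi.sum()                        # normalize to a probability vector
print("steady-state distribution:", pi)

# The second-largest eigenvalue magnitude controls mixing: the error in the n-step
# distribution decays roughly like |lambda_2|**n, so a value close to 1 means slow convergence.
lam2 = sorted(np.abs(eigvals), reverse=True)[1]
print("second-largest |eigenvalue|:", lam2)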
All the other lectures were easier to grasp; this one was probably the toughest.
When we lump more than one state into a class and find the probability of entering the lump, or the time needed to reach that lump, how do we calculate the values for the individual states within the lump?
A lump is a collection of classes, and should be distinguished from a single recurrent/transient class. Lumping is useful in our 'expected time until absorption' example, but not in the 'probability of eventually entering a certain recurrent class' example: if we lumped all the recurrent classes together there, we'd always get probability 1, since the chain is certain to end up in some recurrent class eventually.
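For what it's worth, here is a tiny sketch of the 'expected time until absorption' computation with lumping, using made-up numbers (not the lecture's chain): states 3 and 4 are recurrent and get lumped into one absorbing state A; the transient states are 1 (to 2 w.p. 0.6, to A w.p. 0.4) and 2 (to 1 w.p. 0.8, to A w.p. 0.2).

import numpy as np

# mu_i = 1 + sum over transient j of p_ij * mu_j, with mu_A = 0,
# i.e. (I - Q) mu = 1 where Q is the transient-to-transient block.
Q = np.array([[0.0, 0.6],
              [0.8, 0.0]])
mu = np.linalg.solve(np.eye(2) - Q, np.ones(2))
print("expected steps to absorption from states 1 and 2:", mu)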
In the previous example, when we were at state 2, we directly took 0.2 as the probability to reach state 4. But here at 44:08, why are we using the u2 value?
I think you are mistaken; in the previous example a2 is equal to 0.2 + 0.8 * a1 (not just 0.2)
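To make that concrete, here is a small sketch of the a_i equations being solved together. The 0.2/0.8 split out of state 2 is from the comment above; the rest of the chain is my own made-up assumption (state 1 goes to 2 w.p. 0.6 and to the other absorbing state 3 w.p. 0.4, with a_4 = 1 and a_3 = 0).

import numpy as np

#   a_1 = 0.6 * a_2 + 0.4 * 0
#   a_2 = 0.2 * 1   + 0.8 * a_1
# Rearranged as a linear system in (a_1, a_2):
A = np.array([[ 1.0, -0.6],
              [-0.8,  1.0]])
b = np.array([0.0, 0.2])
a1, a2 = np.linalg.solve(A, b)
print("a_1 =", a1, "a_2 =", a2)   # a_2 comes out noticeably larger than the bare 0.2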
May I ask how the 106 phone lines figure was computed? I tried setting pi_b = 0.01 and substituting some numbers to calculate the i needed using the equation at 30:15, but it seems the RHS of the equation keeps getting larger with larger i...
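Not the official answer, but here is how I would reproduce that number. The individual term keeps growing with i because you still have to divide by the normalizing sum over all states; the blocking probability is the last steady-state probability after that normalization. A sketch, assuming the offered load is rho = lambda/mu = 90 (my recollection of the lecture's parameters; change rho if yours differ):

def blocking_probability(rho, B):
    # Numerically stable Erlang B recursion for
    # pi_B = (rho**B / B!) / sum_{i=0}^{B} rho**i / i!
    p = 1.0
    for n in range(1, B + 1):
        p = rho * p / (n + rho * p)
    return p

rho = 90.0
B = 1
while blocking_probability(rho, B) > 0.01:
    B += 1
print("smallest number of lines with blocking probability <= 1%:", B)

With rho = 90 this search should land right around the 106 quoted in the lecture.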
That's really helpful!! Thanks!!
Steve Jobs should've watched (23:25). At the dawn of a new phone era, it was destined to be the iPhone.
You inspired me.