The application to probability was rather interesting! This pattern seemed familiar when you introduced it. Another great video!
I just found your videos a few days ago and now I'm totally addicted
I'm glad you're enjoying them!
great video, i recently discovered your content and i love how you very simply explain everything.
I'm glad you're enjoying the videos!
Thank you for another enjoyable and edifying exposition of something I needed to know, and have often been curious about❕
So what you're saying is all arithmetic-geometric series converge to 6
Not really, the limit depends fully on the initial values and common ratios of both G(n) and A(n), in this case A(0)=G(0)=1, d=2, r=1/2.
@@void7366 wooosh
He derives the general formula, then shows that the particular given arithmetico-geometric series converges to 6.
Another lovely exposition and interesting application. Well done. Thank you.
A nice method is to see nr^(n-1) as the derivative of r^n
Also, out of general curiosity, is this recursive sum a general property of Markov processes with reachable terminal states?
I'm guessing the arithmetico-geometric progression can be reduced to a walk on a linear graph where after d deterministic steps it can either halt or restart from the beginning, and the sum corresponds to the expected number of steps.
Though it's overkill for the expected number of rolls. When you roll a die, you either get a 6 with probability 1/6, or you get something else with probability 5/6; you've made one unsuccessful attempt and are now in a situation with the same expected number of rolls. So you get a simple equation for the expected value:
E = 1/6*1 + 5/6*(1+E)
=> E = 1 + 5/6 E
=> 1/6 E = 1
=> E = 6
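The recursion above can be sanity-checked with a quick simulation (just a sketch; the seed and trial count are arbitrary choices):

```python
# Estimate the expected number of rolls until a 6 by simulation;
# the recursive argument predicts E = 6.
import random

random.seed(0)

def rolls_until_six():
    count = 0
    while True:
        count += 1
        if random.randint(1, 6) == 6:
            return count

trials = 100_000
average = sum(rolls_until_six() for _ in range(trials)) / trials
print(average)  # close to 6
```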
Essentially, you can calculate the original sum the same way as what you described because it's a recursive formulation
S = SUM(1, \infty) [a + (n - 1) * d] * r^{n - 1}
Pull the n = 1 term out of the sum
S = a + SUM(2, \infty) [a + (n - 1) * d] * r^{n - 1}
Now apply the transformation t = n - 1
S = a + SUM(1, \infty) [a + t * d] * r^t
Factor out the r term:
S = a + r * SUM(1, \infty) [a + t * d] * r^{t - 1}
Split the t term in the arithmetic progression into (t - 1) and 1
S = a + r * (SUM(1, \infty) [a + (t - 1)* d] * r^{t - 1}) + r * SUM(1, \infty) d * r^{t - 1}
The middle sum is the same as S and the right term is a GP with initial value d and factor r
S = a + r * S + [r * d / (1 - r)]
Rearranging the S term and dividing by (1 - r)
S = a / (1 - r) + r * d / (1 - r)^2
I like the recursive argument - it's nice how it gives some insight into the problem (after 1 unsuccessful roll, we're essentially back to the starting point, plus 1). The only issue is that there's an underlying assumption that the expectation is finite. It doesn't cause any problems for this example, but we can find an example where the recursive argument doesn't work:
E.g. flip a fair coin repeatedly until you get tails, and receive £3^n where n is the number of heads. The expected payoff would seem to satisfy E[X] = 1/2 + 1/2 * 3E[X], which gives E[X] = -1.
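A short numerical sketch of why E[X] = -1 is spurious here: each term of the expectation sum is (1/2)^(n+1) * 3^n = (3/2)^n / 2, so the partial sums grow without bound and the true expectation is infinite.

```python
# Partial sums of E = sum_{n>=0} (1/2)^(n+1) * 3^n for the coin game;
# each term is (3/2)^n / 2, so the partial sums diverge.
partials = []
total = 0.0
for n in range(30):
    total += (0.5 ** (n + 1)) * (3 ** n)
    partials.append(total)

print(partials[9], partials[19], partials[29])  # growing without bound
```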
@@DrBarker correct, for the specific problem it's defined as an arithmetico-geometric progression, which is why it converges.
Actually, the expected number of trials to get something of fixed probability P is 1/P. For many people this is "obvious", but one way to prove it is what you did.
Yes, it's nice to see how we can prove this sort of result by just working with the infinite sum.
Excellent teaching
Nicely done :)
That geometric distribution has a much easier expected value, which is just 1/p, where p is the probability of success; the expected value is 6 either way.
Thank you for your explanation.
This unfortunately doesn't help with the partial sums but for the limit one could use the trick of viewing the sum as a derivative.
First we split into ar^n and nd r^n, the ar^n part we know how to deal with but the nd r^n part can be seen as r * (d/dr d r^n). From there we can use linearity of derivatives to view the limit sum as r * d * (d/dr) (1/(1-r)) which gives us exactly what we saw in the video.
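The derivative trick can be checked numerically (a sketch; d = 2 and r = 1/2 are the values from the video's example, and the 200-term cutoff is arbitrary):

```python
# For |r| < 1, sum_{n>=1} n * d * r^n should equal
# r * d * d/dr(1/(1-r)) = r * d / (1 - r)^2.
d, r = 2, 0.5
direct = sum(n * d * r**n for n in range(1, 200))
closed_form = r * d / (1 - r) ** 2
print(direct, closed_form)  # both approximately 4
```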
It works for partial sums as well, but you'd need to calculate the derivative of r*(1-r^n)/(1-r) in the process, so it's messier.
Also, it should be r/(1-r) under the derivative for the infinite sum - the sum under the derivative starts from n=1, not n=0.
IIUC, you're pulling out the derivative from an infinite series, which isn't always allowed, so you'd also have to justify that
@@decare696 in this case since it's a power series it's always fine as long as the sum converges which it does for |r| < 1
What about the reverse...so 1/1, 2/3, 4/5, 8/7, 16/9,....?
This would diverge - the geometric part is like 2^n, and you can see that the terms get bigger and bigger, so the sum is infinite. You could use e.g. the ratio test to prove more formally that it diverges.
@@DrBarker Sure that will diverge if you go on until infinity but what would the sum be if you have only a finite number of terms?
The last problem: let k be the expected number of rolls to get a 6, which by symmetry is the same for 1, 2, 3, 4, 5. So 1/k is the average occurrence of a single number on one roll. Therefore 6*(1/k) = 1, so k = 6.
Another solution would be k = 1 + (5/6)*k.
He knows that if he smiles to the camera he loses the game.
very cool
when you wrote 6 for the last problem I face palmed.
ya. it is 6 all the way down.
@@surrealbits I meant as in it should be expected to be 6 times since there are 6 sides, so the answer is somewhat anticlimactic
4:01 I'm a grown up man I'm a grown up man I'M A GROWN UP MAN
S(x) = Σ (from k=0 to ∞) x^(2k+1) = x + x^3 + x^5 + ... = x/(1 - x^2), a geometric series with ratio x^2
The infinite sum you derived at 9:00 is very similar to the Z transform of the unit step and unit ramp functions 😉. In fact you could work backwards from the Z transform to figure out the infinite sum of all kinds of geometric type series.
You can solve it by identifying a recursive sum as follows:
S = SUM(1, \infty) [a + (n - 1) * d] * r^{n - 1}
Pull the n = 1 term out of the sum
S = a + SUM(2, \infty) [a + (n - 1) * d] * r^{n - 1}
Now apply the transformation t = n - 1
S = a + SUM(1, \infty) [a + t * d] * r^t
Factor out the r term:
S = a + r * SUM(1, \infty) [a + t * d] * r^{t - 1}
Split the t term in the arithmetic progression into (t - 1) and 1
S = a + r * (SUM(1, \infty) [a + (t - 1)* d] * r^{t - 1}) + r * SUM(1, \infty) d * r^{t - 1}
The middle sum is the same as S and the right term is a GP with initial value d and factor r
S = a + r * S + [r * d / (1 - r)]
Rearranging the S term and dividing by (1 - r)
S = a / (1 - r) + r * d / (1 - r)^2
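The final formula can be checked against partial sums (a sketch; a = 1, d = 2, r = 1/2 are assumed to be the video's values, and any |r| < 1 works):

```python
# Numerical check of S = a/(1-r) + r*d/(1-r)^2 against
# partial sums of SUM(1, infinity) [a + (n-1)*d] * r^(n-1).
a, d, r = 1, 2, 0.5
partial = sum((a + (n - 1) * d) * r ** (n - 1) for n in range(1, 200))
closed = a / (1 - r) + r * d / (1 - r) ** 2
print(partial, closed)  # both approximately 6
```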
Nice idea. You need to show that the series is convergent though to prove that the computations are valid (S=1+2+4+8+...+2^n+... satisfies S=1+2S, but obviously S is not equal to -1/2)
@@TaladrisKpop if |r| < 1, the sum always converges, which is the implicit assumption made in the video as well.
@Rishi Rajasekaran Not at all. The video computes the partial sums and then their limits. So the convergence is taken care of.
@Rishi Rajasekaran The convergence is easy to check (Ratio Test for example) so, as I said before, nice method