It would have been helpful if the actual problem were quoted, not this paraphrase. In the statement, it is made clear that "finite game" means "ends in a finite number of tosses with probability 1".
Yeah, there might be some misconception without a background in probability. Just want to point out that it is a standard fact that if the probability that the game has ended by toss t tends to 1 as t -> infinity, then the game will (almost surely) end in finite time.
Can we call this a finite game? There's no guarantee the game will ever end; there's always a chance we advance to the next round.
What you’re asking for is a bounded-length game, and there is no such game (all probabilities are rational with power-of-2 denominators). The game ends with probability 1, which is good enough for the problem, I guess
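One quick way to see the parenthetical claim (a sketch of the counting argument, not from the thread): if a game is guaranteed to be decided within n tosses, pad every play out to a full length-n toss sequence; all 2^n such sequences are equally likely, so

\[
P(\text{win})=\frac{\#\{\text{winning length-}n\text{ sequences}\}}{2^{n}},
\]

a rational with a power-of-2 denominator, which can never equal an irrational alpha.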
True, though it fails to end with probability 0.
@@noahtaul I've just found the original problem. It specifies that a finite game ends in a finite number of moves with probability 1. So the solution in the video indeed satisfies the condition.
@@afseraph Which to me seems like a non-obvious and esoteric way to define a finite game. I would define it as follows: "Given a game, G, we say that G is a finite game IFF there exists a natural number n such that G always ends in fewer than n moves."
@@chongli297 No, that's a bounded game. And as Noah pointed out, clearly a bounded game with finitely many options each move has only a finite number of outcomes, so a probability of one outcome cannot be irrational (since the coin is fair). But a game _can_ be unbounded yet finite. For example, imagine a version of Nim where the first player states a natural number, and then the second player starts playing the game with that many objects. That game will always end in finitely many moves, but there is no upper bound on how long it could go. Similarly, in the game of infinite chess (chess played on the integer lattice), there are positions where white can always mate black against perfect play, but black can delay mate by an unbounded number of moves (which is nevertheless always finite). The game of infinite chess itself is not usually finite, but restricted versions to certain positions can be, even though their length may be unbounded.
An example of a non-finite game is baseball. Not only is the length of a baseball game unbounded, in principle it could last for all of eternity. It isn't guaranteed to end at all. And this game described here is like that. So it's weird to call it a finite game. But apparently the problem statement didn't just say "finite game" but actually said that it just had to end with probability 1, which this one does (like baseball).
More interestingly, one can prove the *infinite product for sin(x)/x* using this. Note that this construction produces countably many independent random variables on the same probability space (without invoking Kolmogorov's extension theorem, which makes it interesting in its own right): the i-th digit (flip) in the binary expansion of the number you get, say Y_i, is ~Ber(1/2), and since the number you get is uniformly distributed on [0,1], we have
U_{(0,1)} = sum Y_i/2^i
in the sense that these two random variables have the _same_ distribution, so in particular the same *characteristic function*. Calculating characteristic functions on both sides, using independence of the Y_i (and some dominated convergence theorem justification), we get the infinite product formula for sin(z)/z for any complex number z!
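To spell out the calculation (filling in these steps is my own addition; this route lands on the Viète-style cosine product for sin rather than Euler's product over the zeros): with U = sum_{k>=1} Y_k/2^k,

\[
\varphi_U(z)=\int_0^1 e^{izu}\,du=\frac{e^{iz}-1}{iz},
\qquad
\mathbb{E}\,e^{izY_k/2^k}=\frac{1+e^{iz/2^k}}{2}=e^{iz/2^{k+1}}\cos\frac{z}{2^{k+1}}.
\]

Multiplying over k (independence, plus the dominated-convergence step mentioned above) and cancelling the common factor e^{iz/2} gives

\[
\frac{\sin(z/2)}{z/2}=\prod_{k\ge 1}\cos\frac{z}{2^{k+1}},
\qquad\text{i.e.}\qquad
\frac{\sin w}{w}=\prod_{k\ge 1}\cos\frac{w}{2^{k}}\quad(w=z/2),
\]

which extends from real to all complex w since both sides are entire functions agreeing on the real line.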
As an electrical engineer, I don't know if I am more offended by calling bar notation non-standard or by not even using it.
6:46 - bless u !!
A simpler formulation: flip the coin until you get heads; say that happens on the k-th flip. You win if p_k = 1, otherwise you lose. This produces the same sum of p_k/2^k for the overall win probability, but it's much more straightforward to calculate (you don't need any of the branches, etc.)
And here's a shorter way to phrase the video's answer: "Say heads = 1 and tails = 0, so your flips spell out a binary expansion. Write alpha in binary and flip the coin until the first place where your coin-flip sequence diverges from alpha. You win if, on that flip, it's your coin, not alpha, that said tails; this works regardless of what alpha is."
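For anyone who wants to check these two rules numerically, here is a minimal Monte-Carlo sketch (Python; the helper names and the choice alpha = 1/pi are my own assumptions, and the digits are only extracted to double precision, which is far more than the game ever realistically needs):

```python
import math
import random

def binary_digits(alpha, n):
    """First n binary digits a_1, ..., a_n of alpha in (0, 1)."""
    digits, x = [], alpha
    for _ in range(n):
        x *= 2
        d = int(x)
        digits.append(d)
        x -= d
    return digits

def first_heads_rule(digits):
    """Flip until the first heads, on toss k; win iff a_k = 1."""
    for a in digits:
        if random.random() < 0.5:   # heads
            return a == 1
    return False                    # ran out of digits (probability ~2^-50)

def first_divergence_rule(digits):
    """Heads = 1, tails = 0. Flip until your flip differs from alpha's
    digit; you win iff it was your coin that said tails (0) there."""
    for a in digits:
        flip = 1 if random.random() < 0.5 else 0
        if flip != a:
            return flip == 0
    return False

alpha = 1 / math.pi                 # any irrational in (0, 1)
digits = binary_digits(alpha, 50)
trials = 200_000
for rule in (first_heads_rule, first_divergence_rule):
    wins = sum(rule(digits) for _ in range(trials))
    print(f"{rule.__name__}: {wins / trials:.4f}  (target {alpha:.4f})")
```

Both estimates should land within Monte-Carlo noise of 1/pi ≈ 0.3183.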
The game is almost surely finite, not certainly finite.
For the game to go on indefinitely, it would need to reproduce exactly the same digits as alpha, which has probability 0.
Here is a properly finite game, taking only one toss:
Draw a straight arrow on both sides of the fair coin and toss it. Measure the angle between the direction the arrow points to and an arbitrarily predefined vertical direction. If the angle is smaller than 1 radian, you win.
how does this involve alpha?
Wouldn't this only provide the value for a single α? Actually, I guess we can just change the 1 radian to f(α) radians to make it more encompassing for some f.
I'm pretty sure "fair coin" is just math-speak for "Bernoulli process with probability 0.5."
@@xinpingdonohoe3978 It would: the rules say "if the angles are less than 1 radian apart, you win," so that was specifically for the thumbnail of alpha*pi=1.
Shocked this was an A4. Seems more like an A1. Flip until/unless you build a binary string less than alpha. 🤷♂
It *has* to be somewhere in TAoCP vol. II
Yeah, I thought the answer was immediate. To win with probability x, pick a number uniformly in (0,1) and win if your picked number is less than x. And a sequence of fair coin flips is just a uniformly random binary expansion. And if your sequence is not equal to x, you will always discover that fact after finitely many digits, so the game is finite in all but one case, thus with probability 1.
It seems to me that in order for the winning probability to be exactly alpha, it can't be a finite game...
6:47 Salud!
And that's a good place to sneeze
Take a disk of radius 1 and draw lines from the center to the circumference, one at angle 0 and one at angle 2*pi*alpha radians. Throw a coin on the disk; if it lands in the region between angle 0 and 2*pi*alpha radians, you win.
Most irrational numbers are not constructible so this won't work in general :)
It's actually quite obvious if you think about it. The setup basically tells us to choose a uniformly random number between 0 and 1 (represented in binary notation by the coin flips), and so the odds of choosing a number less than alpha are precisely alpha.
I always like your videos! Amazing!
There seems to be an assumption that you have access to an oracle that tells you whether a finite binary string is less than alpha or not.
No, unless the string you build from flips turns out to be exactly equal to α, you can always tell if it is greater than or less than α after finitely many flips. Because in order for them not to be equal, they must differ at some finite position, and once they differ, you can immediately tell which is greater.
But that does mean that 0% of the time, your flips will exactly match α, and the game will not have a winner. However, the problem apparently only requires the game to end with probability 1.
It seems to me that this is not a FINITE game - look at the infinity symbol at 15:00 in the sum. A FINITE game could only include a finite number of binary digits, which means truncating α to some particular length.
And that truncation is a rational probability
The actual statement of the A4 doesn't use the phrase "finite game", just "terminates with probability 1".
That's a fun idea for a game. I'm gonna try that with my math students :D
i love the title of the video
Anybody already seen the MindYourDecisions on Putnam 1989-A4? I did, so I already know this one!
Here's a more direct quote of the question, rephrased so winning probability is alpha and not p: "You have a fair coin, i.e. heads and tails are equally likely. You want to win a game with any probability alpha strictly between 0 and 1 where the chance of a tie is exactly 0 and it only takes a finite number of flips to decide. How do you write the rules to set this up?"
Here are 2 ways Presh says you can do it, both after writing alpha in binary.
1. First, flip the coin until your first heads on toss n. If and only if a_n=1 do you win.
2. (I think this is the one described in the video, but I might have swapped heads and tails.) For each flip, have heads = 1 and tails = 0. Keep flipping the coin until the first time flip k ≠ a_k. You win exactly if a_k = 1.
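A quick check (my own arithmetic, writing alpha = sum_k a_k/2^k as in the video) that both of Presh's rules win with probability exactly alpha:

\[
P(\text{win, rule 1})=\sum_{n\ge 1}P(\text{first heads on toss }n)\,a_n=\sum_{n\ge 1}\frac{a_n}{2^n}=\alpha,
\]

\[
P(\text{win, rule 2})=\sum_{k\ge 1}P(\text{flips match }a_1\cdots a_{k-1})\cdot P(\text{flip}_k\ne a_k)\cdot a_k=\sum_{k\ge 1}\frac{1}{2^{k-1}}\cdot\frac{1}{2}\cdot a_k=\alpha.
\]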
Question: why can't the game be: flip a fair coin, compute the fraction of heads (which is a random variable), then pass that variable through a nonlinear transformation that maps 1/2 to 1 onto an irrational number?
Then you would need to define the non-linear transformation to do anything useful with it. It is much more convenient to use binary, as it converts any sequence of heads and tails to a unique number between 0 and 1.
@@vamer423 thanks. Need to think about that after smoking a turkey. Happy Thanks-ful-day
so the function P stand define putnami
Bless you
a_i are in that set not between 0 and 1 ... LOL
at my age arithmetic can be challenging
Why, are you three?
Psych
He seems to be retiring your catchphrase 😢
Could you not just construct a non-repeating string of binary digits like 100111000011111... and say that if you overshoot this string at any point you win, and if you undershoot you lose?
But don't we need a proof of α being irrational? Maybe this sum always converges to a rational (I don't think it's like that, but it should be proved, imho).
@4:45 have the true value be in ascii , x(/->-x(-y->|-)//
Okay - knee jerk reaction and it is probably wrong. And what are the chances of that?
What we have done is create an algebra and a means of counting finitely many quotients that return an irrational number? (!)
In other words, summing up finitely many rationals to get an irrational number. (Hastily looking for a Maclaurin expansion ... Taylor expansion ...)
Yup - we have just summed up finitely many rationals to get an irrational - shrug -
Basis: alpha is irrational
The chance of carrying forward to a closing situation is finite
Summing finitely many winning options with probability a quotient in the range (0, 1) will always return a rational number, but alpha is irrational.
Unless, of course, the number of trials extends to infinity, in which case stranger things may happen.
EDIT: still scratching my head a little bit and sorta have to affirm that the only way an irrational can arise from a sum of rationals is in the limit at infinity.
The problem requires the game to end after a finite number of moves, and we have ensured exactly that: our number will differ from alpha after a finite number of moves. In fact, for our number to be exactly equal to alpha we would need an infinite number of moves, but the probability of that happening is zero: to match alpha up to n digits we need the specific sequence a1a2a3...an, so there is 1 favourable outcome out of 2^n total outcomes. The probability of matching the first n digits of alpha is therefore 1/2^n, which converges to 0 as n -> infinity. In other words, the probability of the game never ending is 0, and thus this is a finite game. I hope you understood my explanation.
Didn't understand a thing ...
So Mike, did you take the Putnam, and how did you do?