This guy started his channel exactly right. Have 5 videos already made when you launch. Good job following MatPat on this theory!
Describing the Monty Hall paradox in this context would have been amazing. I hope you do it in the future, especially drawing the distinction between the two premises.
Thanks for the suggestion, I have it on my list!
Just what I was thinking as I was watching. The Monty Hall problem isn't just relevant to Let's Make a Deal. It shows up in disguise as the important Principle of Restricted Choice in the card game of bridge (as well as other places). An example of it in action is the following. Say you are missing the queen and king of some suit. If a player wins a trick with, say, the queen, odds are reduced that the same player has the king. Like Monty Hall opening a curtain, the player's choice of which card to reveal (in order to win the trick) may have been restricted. Hence the name.
FYI, he did do this. Just in case you haven't seen
I think Dr Sean nails it by pointing out there's only a 50% chance you have x. The other 50% chance must surely be that you have 2x or x/2.
The confusion normally sets in as soon as a typical presenter starts off with something like "Let's say you have x in your envelope", as if by stipulating it they've put it beyond serious question, and it's now just a matter of 2 possible amounts in the other envelope, 2x or x/2. Of course, looked at like that, it makes sense to switch.
But we rarely get someone saying "Let's say you have 2x or x/2 in your envelope", though that would make it disadvantageous to switch to the one with x. (The same applies when numerical values are used instead of x, like starting off with "Let's say you have $20 in your envelope".)
The expected gain calculation really puzzled me. Would love your take on Monty hall as well!
I saw this paradox differently. The first assumption is that you've gained x, and x is a positive number greater than 0. The amount you stand to lose by halving x (namely x/2) must therefore be less than the amount you stand to gain by doubling it (namely x), because dividing x by 2 always changes it by less than multiplying x by 2 does.
So, I agree with guy 1: the implied odds of doubling x versus halving it make the risk worth taking.
The values in the envelopes are fixed (this is crucial!). If you change the envelope, the gain/loss is always equal to the lower of the values in the envelopes.
Let me demonstrate this. Let x equal the amount we've gained. Since the values in the envelopes are fixed, we can denote the lower value as y and the higher as 2y. If we change the envelope, there are two different cases:
1. We picked the lower value (x=y) and changing would net us 2y-y=y extra.
2. We picked the higher value (x=2y) and changing would lose us 2y-y=y.
Either way by changing we always gain/lose y. And assuming the choice in the beginning was random (50-50) the expected value of gain/loss by changing is
0.5*y + 0.5*(-y) = 0.
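The two-case argument above can be checked mechanically. Here's a minimal sketch in Python; the amount y = 10 and the function name are just my illustrative choices:

```python
# Envelopes hold y and 2y (fixed before you pick). Enumerate both cases.

def gain_from_switching(picked_low: bool, y: float) -> float:
    """Gain (or loss) from switching, given the envelopes hold y and 2y."""
    if picked_low:          # we hold y; the other envelope holds 2y
        return 2 * y - y    # gain of +y
    else:                   # we hold 2y; the other envelope holds y
        return y - 2 * y    # loss of -y

y = 10.0
expected_gain = (0.5 * gain_from_switching(True, y)
                 + 0.5 * gain_from_switching(False, y))
print(expected_gain)  # 0.0
```

Either case changes your total by exactly y in magnitude, so the 50-50 average is zero, matching the calculation above.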
The video just explained why that intuition is wrong
One nice thing about this paradox is that you can test your ideas about it experimentally. If you know how to program, it's a pretty easy one to write. If not, just ask a friend to randomly put some value x in one envelope (or hand), 2x in another, and play the game. Do it over and over and see how you do with a strategy of always switching.
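If it helps, here is one way that experiment might look in Python. The base amount ($10) and the trial count are arbitrary choices of mine:

```python
import random

def play(switch: bool, x: float = 10.0) -> float:
    """One round: envelopes hold x and 2x; pick one at random, optionally switch."""
    envelopes = [x, 2 * x]
    random.shuffle(envelopes)
    picked, other = envelopes
    return other if switch else picked

trials = 100_000
always_switch = sum(play(True) for _ in range(trials)) / trials
never_switch = sum(play(False) for _ in range(trials)) / trials
print(always_switch, never_switch)  # both hover around 15.0 = (10 + 20) / 2
```

Run it a few times: both strategies average out to the same amount, which is the experimental version of the "you always gain or lose y" argument.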
Bob: Hey Alice, want to play two envelopes?
Alice: OK. How's it go?
Bob: I got a $10 bill and a $20 bill. I put them each into separate identical envelopes like so, seal them and now shuffle them so we don't know which is which. Pick one and you get to keep it.
Alice: Cool. Uh..I'll have that one.
Bob: What's even cooler is you can switch to the other envelope.
Alice: But why? If I have the $10 then I gain by switching. But if I have the $20 I lose. But I don't know what I got, so how can I know what I get?
Bob: Look at it this way. Let's say you have x in your envelope. Then that means the other envelope has half x or double x. So you stand to gain double what you stand to lose if you switch. Makes sense.
Alice: So if I have the $10 then the other envelope has $20 or $5. But you never said nothing about $5, I never saw you put it in an envelope.
Bob: Don't get awkward on me Alice.
Alice: And if I have the $20 then the other envelope has $40 or $10. But you never put $40 in an envelope. So we're back to what I - and you - said before. It just comes down to either $10 or $20 and we don't know which envelope they're in.
Bob: I just thought it might help to think about it algebraically.
Alice: OK let's think about it algebraically then. If I have x then the other envelope has half x or double x. But I don't know I have x. It could be that I have the half x or double x envelope, in which case switching for x stands to lose me double what I stand to win. Once again, I don't know what I have so I don't know what I get. Thanks but no thanks Bob.
Bob: Want to play two envelopes again Alice?
Alice: Like I said last time it's a pointless exercise.
Bob: You complained you didn't know what you had, so you didn't know whether to switch or not. This time you can open the envelope you picked before switching. Here are the two envelopes. One has twice the other, like before.
Alice: Hm Ok. I'll have this one. I open it and .... $30. Cool.
Bob: Want to switch?
Alice: So the other has either $60 or $15 with equal probability. I stand to gain twice as much as I stand to lose, right?
Bob: Yep
Alice: But if I'd picked the $15 envelope, then the other would have had $30 or $7.50, right? And if I'd picked the $60 envelope, the other would have had $120 or $30. And if I'd picked the $7.50 envelope then the other would have had $15 or $3.75. And if I'd picked the $120 envelope ... well we could go on forever.
Bob: I guess.
Alice: What do you mean you guess? Don't you know what went into the envelopes?
Bob: This time all I know is what I told you, one has double the other, and now as we've seen, one has $30. But whichever the other has, double or half, it's worth taking a chance.
What do you think, is it worth Alice switching under these conditions? Do the conditions even make sense?
I encourage you to make a commentary or analysis on the TED-ED frog riddle. It is quite interesting problem that involves sample space and probabilities and is quite confusing to many people (including myself) :D
Thanks! I'll check it out.
Your solution is more insightful than most, but in my opinion falls a little short. Part of that is that I would explain the point you made a little differently.
It all boils down to the fact that the amount of money in the envelopes is a random variable, but it is only _one_ independent random variable. You can use two, but then the second is dependent on the first.
So, as you pick an envelope, you should use one random variable, T, to represent the _total_ amount in the two envelopes. The set of values t that T can take is part of the sample space. The problem is that we do not know what that space is, or what its probability distribution is.
This turns out not to matter, since there is another random variable, R, for whether the envelope you pick holds the low value or the high value. The set of values r that R can take is {1/3, 2/3}, with probability distribution (50%, 50%). Finally, the amount in your envelope is r*t. The expected value with respect to R is (1/2)*(1/3)*t + (1/2)*(2/3)*t = t/2. It is the same as the expected value of the other envelope. We don't need to know the probability distribution of T here, because t appears only as a single value in this calculation.
But that changes if we learn x, the value in our envelope. Then either t=3x or t=3x/2, and now we need to include these two values of T in our expectation. For that, we need to know, at least, the relative probabilities of T=3x and T=3x/2. This is something we have no way of knowing.
One conclusion where you are wrong is around 3:40. Hobbes is not right that your chances are the same whether or not you switch. They are actually unknowable; much like asking whether a random child's favorite color is more likely to be green or blue. You might have a good guess, based on what colors you think a child might like, but you don't know.
And in fact, there are distributions where it is (almost) always favorable to switch, _after_ you learn what is in your envelope. All you need is a probability distribution for T whose probability falls off slowly enough as t increases (roughly, each larger possible total keeping more than half the probability weight of the one below it). There are two problems with that:
1) For that to hold for every possible value of x, there can be no upper bound on the amount of money that could potentially be put in the envelopes; otherwise...
2) If your envelope has 2*t_max/3, the largest possible amount, you will lose money by switching. A lot of money. Enough that, if you go into the game already knowing you will switch, your overall expected gain is still zero.
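Both caveats can be seen in a simulation. This sketch assumes one concrete, truncated distribution of my own choosing (pair k holds 2^k and 2^(k+1), with weight (2/3)^k, cut off at k = K); neither the distribution nor the cutoff comes from the comment above, they're just one illustrative instance:

```python
import random
from collections import defaultdict

random.seed(1)
K = 10  # truncation point (caveat 1: a real version would need no bound)
weights = [(2 / 3) ** k for k in range(K + 1)]

def draw_pair():
    """Pick pair k with weight (2/3)**k; envelopes then hold 2**k and 2**(k+1)."""
    k = random.choices(range(K + 1), weights=weights)[0]
    return 2 ** k, 2 ** (k + 1)

gains = defaultdict(list)   # observed value -> list of gains from switching
total_gain = 0.0
trials = 200_000
for _ in range(trials):
    low, high = draw_pair()
    mine, other = random.sample((low, high), 2)  # pick one envelope at random
    gains[mine].append(other - mine)
    total_gain += other - mine

# For every observed value except the largest possible one, the average
# gain from switching comes out positive...
for v in sorted(gains):
    print(v, round(sum(gains[v]) / len(gains[v]), 2))
# ...but seeing the largest value (2**(K+1)) guarantees a big loss (caveat 2),
# and the unconditional expected gain of "always switch" is still about zero.
print(round(total_gain / trials, 2))
```

Conditioned on almost any observed amount, switching looks favorable, yet the rare worst case at the truncation point is costly enough that committing to switch in advance gains nothing overall.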
Good video, but I believe this comment is correct. There is indeed a problem in the video which is resolved by your comment.
Never heard this paradox before, may have to use it on someone else...
Can’t I just take both envelopes and call it good?