Your game theory 101 collection is absolutely insane.
These videos saved me in my econ class! Literally would have failed without you and our dear friend the stag game! Much love from Paris!
This video has the clearest explanation of the grim trigger on YouTube. Thank you!
Thank you so much. I don't usually post comments, but this is one of the best educational videos I've seen.
life saving! i'm having my final in an hour and this helps a lot
Good luck!
hahaha me too so life saving x
how was your final?
@@Omophagy666 this got me a B+ in micro econ
Yeah, how did your final go?
Hey, just wanted to tell you that I really appreciate you putting this stuff out here for free. I really enjoyed my game theory class at university this year, but our prof was pretty bad about putting up learning material for the exams. This is all explained so well and has helped me so much. Is there any way to compensate you for this other than continuing to watch your videos?
Just spread the word! (Orrrrr...you can also check out the books that I have on Amazon.)
Thank you, William Spaniel. I passed my microeconomic theory course with the help of your videos.
better than my economics lecturer in 18 minutes :)
🙏 I think it's been years since I first watched your videos. Glad I'm still able to return to them for a refresher in my Master's program.
How's your Master's going?
Grim trigger? More like "wisdom: bigger", now that I've been watching your series on game theory! I've had quite a fun time over the past several weeks making my way through all of these videos; thanks for making them.
It'd be better to play this in my Game theory classes rather than taking a stupid lecture with my unqualified professor. Thanks William Spaniel
How did your classes go?
Thank you so much, you don't have to do this but you do it anyway and it is free!
I find it completely contained in the way a Cocker Spaniel can't escape a Carpenter who can't get over a traditional drinking problem but still poops where it sleeps. Right next to a little Blue Book of measurements. It's a profound Equilibrium because you can't get over a mistake until it costs you everything. It's really about narcissistic realism.
Thank you so much William !
helped me a lot! thank you very much!
Thanks! Finally understood it! Lifesaver!
Well explained. Thanks for the help
Thanks a lot ! from a Cambridge University Economics Student
weird flex but ok
Thanks William, super helpful information once more!
It's kind of sad, but my conjecture here is that IRL, taking advantage of others when it suits you seems to have the highest payoff when people tend to forgive.
wow thx m8 u rly helped me! Greetings from Cologne University!
I’m curious why you’re adding the present value of the 4 on the right side but not the present value of the 3 on the left side. When I attempted the exercise before watching you do it, I got x = 1?
Hello! Very good job! But let me ask you: 1) Why doesn't it work for an arbitrarily long game? I mean, cooperation can work for the first stages and stop afterwards. 2) Why does defecting in the first period yield a profit of 4? If defecting is beneficial (i.e. a profitable deviation), then a rational opponent is expected to also defect, so it should be 2 instead of 4.
Thank you so much for sharing this! 1/2 is the result of the cooperation stage, but shouldn't we compare that with the delta in the defection stage? I don't understand how one should proceed from there. 1/2 only tells us that there is a 50% chance that the game continues into a new period, I guess, and I don't understand how we should use this information. And what about the delta in the defection stage? Delta ranges from 0 to 1; if this number were negative, would that suggest that they should deviate, as they would be assured that the game will end?
I would love to know how I should interpret delta, and how it solves the game.
...covertly applied this strategy to my divorce negotiations. It saved my ass/mitigated losses in the long game; thank you.
LOL
Did you really do that?
So is the subgame perfect Nash equilibrium (3, 3)?
I am not quite convinced that cooperation will be deterred in finite prisoner's dilemma games. It sounds sensible: all bets are off during the last game, and if the strategies in the last game are fixed then the second-to-last game can be considered the last game before the assured mutual defection, so we again have to make the optimal choice, etc. But look at this from a distance, and suppose we play 100 times. Is it truly unreasonable to play as if the game is infinite, up to the last game? Will I be worse off if I try this?
All in all this is very reminiscent of the unexpected hanging paradox, which uses backwards induction to get to an absurd conclusion. As far as I have understood, the impossibility of cooperation is suggested by backwards induction and nothing else. It makes me wonder whether backwards induction is really the right tool to analyze games generally. For the infinite game we cannot even use this method in the first place, but if that is the case how can we even compare finite and infinite and say one does not allow for cooperation while the other one does?
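The unraveling argument described above can be sketched mechanically. A minimal sketch in Python, assuming the stage payoffs used elsewhere in this thread (3 for mutual cooperation, 4 for the temptation payoff, 2 for mutual defection, 1 for cooperating against a defector); it doesn't settle the philosophical worry, it only shows why the induction never finds a period where cooperation can pay:

```python
# Stage payoffs for the row player (assumed from the thread's example):
# (C, C) = 3, (C, D) = 1, (D, C) = 4, (D, D) = 2.
PAYOFF = {("C", "C"): 3, ("C", "D"): 1, ("D", "C"): 4, ("D", "D"): 2}

def backward_induction(periods):
    """Solve the finitely repeated game from the last period backwards."""
    plan = []
    for _ in range(periods):
        # In the final period there is no future to protect, so each player
        # uses the stage-game dominant strategy; D strictly dominates C:
        assert all(PAYOFF[("D", b)] > PAYOFF[("C", b)] for b in ("C", "D"))
        # With the final period fixed at (D, D), the second-to-last period
        # cannot influence the future either, and the same logic repeats.
        plan.append("D")
    return plan

print(backward_induction(100).count("D"))  # → 100
```

Note that the loop never uses the period index: because the continuation play is fixed regardless of today's action, every period collapses to the one-shot game, which is exactly the induction step the comment describes.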
So excellent
Very helpful videos, thank you so much for posting them! Quick question. In the example where player 1 deviates in the first stage, why is his payoff 4 + 2d + ...? I understand why the 4 is there, but if he's only deviating in the first stage, would he not still be playing "cooperate" in stages 2 - infinity (and thus his payoff would be 4 + 1d + ...)? Or does he intentionally deviate in stages 2 - infinity because he knows that player 2 will play the grim trigger and he will be better off by defecting for the rest of the game?
Is it possible to calculate the equilibrium, if the payoffs are not the same?
I'm a little bit confused about the one shot deviation principle in this case. If the base strategy is cooperating forever, why isn't its one shot deviation strategy "...cooperate-defect-cooperate-cooperate-cooperate..." ?
Well, that is a superficially different deviation. But it turns out that the proof in the video covers it. If deviating in the second period is profitable, then it must be true that it pays more in total for the periods 2 to infinity than maintaining the grim trigger strategy. If you set up the calculation for this, you wind up with the exact same payoffs as in the video, except everything is multiplied by \delta.
@@Gametheory101 Thanks for your reply; here is my calculation. If I choose to cooperate forever, my total payoff is 3+3δ+3δ^2+... = 3/(1-δ). If I take the one-shot deviation strategy and, say, defect in the first stage and then return to cooperating, my total payoff will be 4+1δ+1δ^2+... = 4+δ/(1-δ), the per-stage payoff beyond the first stage being 1 because my opponent has been triggered to defect forever. Solving 3/(1-δ) >= 4+δ/(1-δ) we get δ >= 1/3, and this result is the same no matter at which stage I choose to defect (which covers the whole space of one-shot deviations), so δ >= 1/3 => no profitable one-shot deviation exists => SPE, which contradicts the conclusion at 12:29. I can't figure out where I made the mistake...
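One way to see the mistake (a sketch, not from the video): "defect once, then cooperate" is not the one-shot deviation from grim trigger. After your own defection, grim trigger itself prescribes defecting forever, and defecting is also the best reply to a triggered opponent (2 per period beats 1). Checking at δ = 0.4, which is between 1/3 and 1/2:

```python
def present_value(first, then, delta):
    """PV of `first` this period plus a constant `then` every later period."""
    return first + delta * then / (1 - delta)

delta = 0.4  # the comment's condition delta >= 1/3 holds here

cooperate      = present_value(3, 3, delta)  # follow grim trigger: 3 forever
defect_return  = present_value(4, 1, delta)  # defect once, then cooperate against D
defect_forever = present_value(4, 2, delta)  # defect, then follow grim trigger (D forever)

# The deviation checked in the comment above is indeed unprofitable here...
assert defect_return < cooperate
# ...but the one-shot deviation from grim trigger keeps defecting afterwards,
# and that one is strictly profitable:
assert defect_forever > cooperate
```

Solving 3/(1-δ) >= 4 + 2δ/(1-δ) instead gives δ >= 1/2, which matches the cutoff in the video.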
life saver!
that's called grudger, mister
It's funny that you have to go through all those calculations to prove that getting 3 every round is better than getting 4 one time, and then getting 2 every round for the rest of the game.
It depends on the discount rate... for small delta it's better to get the high payoff in the first round.
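With the payoffs assumed in this thread (3 for mutual cooperation, 4 for the one-time temptation, 2 for mutual defection), the cutoff works out to δ = 1/2:

```latex
\frac{3}{1-\delta} \;\ge\; 4 + \frac{2\delta}{1-\delta}
\;\iff\; 3 \ge 4(1-\delta) + 2\delta
\;\iff\; 2\delta \ge 1
\;\iff\; \delta \ge \tfrac{1}{2}
```

So for δ < 1/2 the one-time 4 followed by 2s wins, and for δ > 1/2 the stream of 3s wins.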
Can't cooperation work in a finite game with discount factors?
I never liked Bush's 'you are either with us or against us' ultimatum rhetoric, and it certainly didn't work for me LOL. This country has always been questionable at best, so I guess that proves the point that you can't have had any previous defections from cooperation for an ultimatum like that to hold over others LOL. Highly informative if applied correctly to how the psychology of a group thinks in real time and how they apply this to marketing.
what does delta represent?
It's the present value factor. We are in period one, so we know that the payoff is going to be 3, but for later periods we need the present value of the future payoffs. By using delta, which incorporates the interest rate, we convert future payoffs into their value in period 1.
This reply is probably too late, but I hope it helps.
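A concrete sketch of that conversion (the interest-rate reading δ = 1/(1+r) is one common interpretation, assumed here for illustration):

```python
r = 0.25             # hypothetical interest rate of 25% per period
delta = 1 / (1 + r)  # one unit next period is worth delta units today

# A payoff of 3 received in period t is worth 3 * delta**(t - 1) in
# period-1 terms, so the stream 3, 3, 3, ... is worth 3 / (1 - delta):
approx = sum(3 * delta ** (t - 1) for t in range(1, 500))
exact = 3 / (1 - delta)
assert abs(approx - exact) < 1e-9
print(round(exact, 6))  # → 15.0
```

Delta can equivalently be read as the probability the game continues another period, which is the interpretation the video's infinite-horizon setup leans on.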
What if it is known that the game will end, but it is not known after how many periods?
+supergreatsuper So there is some finite period by which the game MUST end, but the players don't know in which period the game will end before that? That's a pretty straightforward model to solve if you have worked through the section on repeated games.
+William Spaniel
Are you assuming that the players know that the game will end within x turns where x is finite and known? I am asking about where x is finite but not known to the players.
Then it depends on the players' beliefs about the probability distribution of x.
Case 1: X has an upper bound which is known to the players. Here, the same logic of finitely repeated games applies. In the prisoner's dilemma example, we know that everyone will defect in the very last possible period, if the game lasts that long. This means everyone will defect in the second to last possible period, because there is no incentive to cooperate. Etc. (You can still get some degree of cooperation in games with multiple equilibria, using "Nash Threats." I'm sure Will covers that somewhere, but I'm not sure where.)
Case 2: X does not have an upper bound. There is a constant probability p that the game will end in any given period. We can treat this as an infinitely repeated game, where the continuation probability 1 - p plays the role of the discount factor. (Or, if players are also impatient, 1 - p is an upper bound on the discount factor.)
Case 3: X does not have an upper bound, but the probability of the game ending in a given period depends on the period and/or on previous play. In this case, cooperation may still be possible, but depends on the specifics of the game and is well out of the scope of an introductory course.
I believe Will interpreted your question as asking about what I am calling Case 1.
-Helpful coauthor
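Case 2 can be sanity-checked numerically. A sketch (the payoff of 3 per period is assumed, matching the thread's cooperation stream; the continuation probability plays the role of the discount factor):

```python
import random

def average_total(p_end, payoff=3, trials=100_000, seed=0):
    """Average total payoff when, after each period's payoff is received,
    the game ends with constant probability p_end."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        while True:
            total += payoff
            if rng.random() < p_end:  # game ends this period
                break
    return total / trials

p_end = 0.5
# With discount factor delta = 1 - p_end, the infinite-game formula
# payoff / (1 - delta) = 3 / p_end predicts an expected total of 6:
print(average_total(p_end), 3 / p_end)
```

The simulated average should land very close to 3/p_end, which is why a game with a constant ending probability can be analyzed exactly like the infinitely repeated game in the video.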
Is there a simple explanation of what should happen if the players are only told "the game will end at some point" (but are given absolutely no information about when it will end)?
You forgot to put this video on the playlist! I was confused when I accidentally watched #59 right after #57...
+kikones34 Fixed, thanks.
📥💀
Does this require common knowledge?
video takes way too long