Wow! This is my favorite SoME2 video (and I have seen a lot of them by now). I just loved everything about it. At the start, like others mentioned, when I saw the 'state transition graph' my mind immediately went to stochastic matrices and how to solve the problem using some linear algebra. The solution presented is much more elegant and simple. The way you lead us to the solution makes me think that I could come up with it myself. Hope your video gets more recognition, because it definitely deserves it!
Thanks so much for the detailed response! Indeed, I was motivated to make the video because I think a lot of people know (or have seen) the state transition graph, which technically could be used to compute the answer (but is actually quite tedious!), and the ABRACADABRA theorem is hidden under many layers of technical martingale theory. There are all sorts of other cool calculations you can do with this technique, like the exit time for a random walk and so on too....it's definitely very under-known! (I'm hesitant to say underrated because it's a well-known trick for people who work on these problems!)
The most surprising thing I learned from this is that just because 2 events have the same probability of occurring does not mean they take the same expected amount of time to occur! Consider the expected number of die rolls to get 6,6,6 vs. 1,2,3. Great video!
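For the curious, the overlap rule from the video gives both of these numbers in one line each. Here is a tiny sketch of that arithmetic (Python, my own variable names, not code from the video):

```python
# Expected rolls via the overlap rule for a fair six-sided die:
# every suffix of the target that is also a prefix contributes 6**(overlap length).
E_666 = 6 + 6**2 + 6**3   # the suffixes 6, 66, 666 are all prefixes: 6 + 36 + 216
E_123 = 6**3              # 1,2,3 has no self-overlap, so only the full word counts
print(E_666, E_123)       # 258 216
```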
This is outrageously, mind-bogglingly awesome.
Great video! Another interesting phenomenon is called (funnily enough) Penney's game. Let's say players A and B each choose a sequence of heads/tails of length 3, and a coin is flipped until one of their sequences shows up; the player who chose that sequence wins. If B is allowed to base their choice on what A does, then B can win more than half the time no matter what A does. e.g. if A chooses HHT, then B can choose THH, which will appear before HHT with probability 3/4; if A chooses THH, then B chooses TTH and wins with probability 2/3; if A chooses TTH, then B chooses HTT and wins with probability 3/4; and if A chooses HTT, then B chooses HHT and wins with probability 2/3. It's non-transitive!
Ya this is a really cool one! I definitely wanted to avoid this in the current video because if you go "head to head" on sequences, then it's very different from counting the average number of flips. For example, HT vs HH both come first 50% of the time. (One way to see that HH takes more flips on avg than HT is to notice that they both come first 50% of the time, but HH takes a few extra flips to come second when HT wins)
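For anyone who wants to see both facts empirically (the Penney's game non-transitivity and the HH vs HT 50/50 tie), here is a rough Monte Carlo sketch; the function names are made up and it only assumes a fair coin:

```python
import random

def first_to_appear(seq_a, seq_b):
    """Flip a fair coin until seq_a or seq_b shows up; return which one won."""
    flips = ""
    while True:
        flips += random.choice("HT")
        if flips.endswith(seq_a):
            return "A"
        if flips.endswith(seq_b):
            return "B"

def b_win_rate(seq_a, seq_b, trials=100_000):
    return sum(first_to_appear(seq_a, seq_b) == "B" for _ in range(trials)) / trials

print(b_win_rate("HHT", "THH"))  # THH beats HHT about 75% of the time
print(b_win_rate("HH", "HT"))    # about 50/50, even though HH takes longer on average
```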
Most premium-quality, knowledge-filled video I have ever seen.
This is such a nice video! I remember a long time back I read about this fact online and I could not come to terms with the fact that THH is objectively easier to get than THT. This video does an amazing job of explaining such a counterintuitive concept. :D
Thanks for the kind words! I actually almost used THH vs THT as the example with more coins....you would think that, since they both start with T and HH takes longer than HT, it would be the other way around.
This was enlightening, I remember not getting martingales because I saw no application for something we prove can't be exploited, but the analogy with conservation of energy and the multiple examples were really helpful in seeing why one may care about them!
This is the best explanation of the problem I have seen. Similar solutions can be obtained iteratively for a smaller number of characters, but I always wondered how to generalize it. Finally I know how it's done, and also that this problem has such a cool name.
Commenting my intuition before watching the video:
An attempt at getting HH or HT starts when you get an H (it doesn't really matter how many Ts you got before that). For HH, you either get another H and stop, or get a T and have to start over again. For HT, you either get a T and stop, or get an H and then you're right back where you just were: with an initial H, hoping for a T.
Great intuition! Hopefully the rest of the video shows you the nice way to calculate the exact number of flips :)
This was completely new to me. I understood your explanation and it's really surprising. I will need to do some calculations and simulations to really believe it. Thanks a lot.
I'm glad you liked it! I actually have some code for running simulations you can check out here. It makes a nice plot at the end of how many flips it took for each target sequence and over what period of time. (I wrote this for an earlier version of the video, but it got cut from the current version)
github.com/mcnica89/manim/blob/main/Heads-Tails-vs-Head-Heads-Coinflip_Game_Sims.ipynb
@@MihaiNicaMath This saved some time. Really nice. Thx.
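In case the link ever breaks, here is a minimal sketch of that kind of simulation (my own code and names, not the notebook's):

```python
import random

def average_flips_until(target, trials=100_000):
    """Average number of fair-coin flips until `target` first appears, over many runs."""
    total = 0
    for _ in range(trials):
        recent, count = "", 0
        while not recent.endswith(target):
            recent = (recent + random.choice("HT"))[-len(target):]
            count += 1
        total += count
    return total / trials

print(average_flips_until("HT"))  # should land close to 4
print(average_flips_until("HH"))  # should land close to 6
```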
Definitely one of the best submissions for SoME2!!
Thank you so much for the high praise! The videos this year were really very good
Great video! Interesting topic and well presented. Keep up the good work!
Thanks for the kind words!
This was a joy to watch and think along with, thank you!
Glad you enjoyed it!
Excellent presentation
Brilliant video and such an elegant method! I wonder if there's any intuitive explanation of why the more a sequence self-overlaps, the more time it takes for it to show up?
Thanks! One intuitive way is to count the number of occurrences in a long sequence of flips, say n flips. On average there are n/4 occurrences for either pattern. But the occurrences of the self-overlapping pattern come in clumps (since they can overlap each other), so there are longer stretches with none at all, which means the wait for the first one is longer (compared to the non-self-overlapping pattern).
@@MihaiNicaMath Aha! This makes perfect sense! Thanks!
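To see the clumping concretely, you can count occurrences (allowing overlaps) in one long run and compare with the first-occurrence time. A rough sketch, with made-up names and a fair coin assumed:

```python
import random

random.seed(0)
n = 1_000_000
flips = "".join(random.choice("HT") for _ in range(n))

def count_overlapping(s, pattern):
    """Count occurrences of `pattern` in `s`, allowing overlaps."""
    return sum(s.startswith(pattern, i) for i in range(len(s) - len(pattern) + 1))

def first_completion(s, pattern):
    """Number of flips until `pattern` is first completed."""
    return s.index(pattern) + len(pattern)

for pattern in ["HH", "HT"]:
    print(pattern, count_overlapping(flips, pattern), first_completion(flips, pattern))
# Both counts come out near n/4 = 250000.  The single first-completion numbers are noisy,
# but averaged over many independent runs they'd be about 6 for HH and 4 for HT.
```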
Mind blowing quality video
Brilliant video and exposition
Thanks for the kind words!
Loved it
Your explanation is very good for getting the answer, but it never explains the intuition behind the method we are using. For example, HHHT has 1 checkpoint (T at the 4th spot) and takes 16 flips to get. However, HTTH has 2 checkpoints at spots 2 and 3 but takes 18! Can someone explain to me why?
It's because the amount of progress "saved" at each checkpoint is different in each case. This is basically why the "checkpoint system" is OK as intuition for why different strings should give different answers, but I think it can't easily be used to intuitively see the actual final number.
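For reference, the whole rule fits in a few lines of code: each suffix of the target that is also a prefix contributes (alphabet size)^(overlap length), and checkpoints only enter through those overlaps. A sketch with a made-up function name:

```python
def expected_waiting_time(pattern, alphabet_size=2):
    """ABRACADABRA-style expected waiting time for `pattern` with fair i.i.d. symbols."""
    return sum(
        alphabet_size ** k
        for k in range(1, len(pattern) + 1)
        if pattern[-k:] == pattern[:k]   # suffix of length k matches prefix of length k
    )

print(expected_waiting_time("HHHT"))  # 16: no self-overlap except the whole word
print(expected_waiting_time("HTTH"))  # 18: the trailing H overlaps the leading H, adding 2
print(expected_waiting_time("HH"), expected_waiting_time("HT"))  # 6 4
```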
If we have an unfair coin where the probability of heads is (sqrt(5)-1)/2 ≈ 61.8%, then we should expect to observe (H, H) and (H, T) after the same number of tosses on average: approx. 2 + sqrt(5) ≈ 4.24.
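A quick numerical check of that bias, using the 1/p-payout version of the formula mentioned in the video (a sketch, not a derivation):

```python
import math

p = (math.sqrt(5) - 1) / 2   # about 0.618: the bias where HH and HT tie
q = 1 - p
E_HH = 1 / p + 1 / p**2      # overlaps: the single H (payout 1/p) plus the whole word (1/p^2)
E_HT = 1 / (p * q)           # no self-overlap, only the whole word
print(E_HH, E_HT)            # both about 4.236 (= 2 + sqrt(5))
```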
Brings me back to the good old days when I got completely annihilated by Martingales and Stochastic Calculus 😂
Ya totally! The main reason I made the video is because it's a cool result that's (relatively) easy to understand, but every single proof I've seen online has started with "Let F_t be a filtration...."
Maybe this is just because the general theorem/approach was new to me, but this seemed even better than your Buffon videos
Haha thanks so much! This video was definitely way more work because it was all animated in Manim....I did a live version at one point but it's just waaaaay too slow flipping real coins. I'm glad you enjoyed it!
Easy intuition: when HH fails, it starts from scratch; when HT fails, it starts from halfway. So my guess is HT requires fewer flips. Another way of thinking about it: after an H, it takes on average 2 flips to get a T. It takes on average two flips to get the first H, so 4 to get HT. How many flips to get two H's in the sequence, not necessarily adjacent? Also 4. But there's the additional requirement that they be adjacent. This is another reason HH should take longer.
Also, this problem feels like Knuth-Morris-Pratt.
Great video!
Maybe someone can answer my question:
I am trying to use the intuition of partial progress on sequences longer than 2.
For example, when rolling for a sequence of 3, I can see why 1,2,3 takes 6^3 rolls, as we have the option to fail back to the 1 when rolling for the 2 or 3, but why does 1,1,3 take the same number when we cannot fail back like that when rolling for the second die? Why does it not take more rolls, from an intuitive perspective?
This is a great question! If you compare 1,2,3 vs 1,1,3, you see that 1,1,3 has both a disadvantage (namely that you have a 5/6 chance to go back to no progress when you have one progress out of 3) but also an advantage (namely that you can save your progress at two progress out of three). Evidently, the net result of the advantage and disadvantage together is, on average, no effect! I don't have a good intuition for why these exactly balance except for the calculation presented in the main video.
what do you mean by ' = 6^3'... solution of 6^3 is one in which you can not continue but have to start from 0 match..
for 1,2,3 solution would be
6^3 * (1- 1/6^2-1/6^3) = 6^3 * (216 - 36 - 1)/6^3 = 179
1/6^2 is chance for return on 2nd - 1 1
1/6^3 is chance for return on 3rd - 1 2 1
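For anyone who would rather check the 1,2,3 and 1,1,3 numbers empirically than argue from formulas, here is a quick Monte Carlo sketch (made-up names, assuming a fair six-sided die; it takes a few seconds to run):

```python
import random

def average_rolls_until(target, trials=20_000):
    """Average number of fair-die rolls until the tuple `target` first appears."""
    total = 0
    for _ in range(trials):
        recent, count = (), 0
        while recent != target:
            recent = (recent + (random.randint(1, 6),))[-len(target):]
            count += 1
        total += count
    return total / trials

print(average_rolls_until((1, 2, 3)))  # lands close to 216
print(average_rolls_until((1, 1, 3)))  # also lands close to 216
```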
I solved the HT case with recurrence relations and summation skills in 10 minutes, but when I dealt with the HH case, the recurrence relation has the solution
(4/sqrt(5))*(((1+sqrt(5))/4)^n-((1-sqrt(5))/4)^n)
and by calculating the infinite summation I finally got the correct result after 2 hours.😂
Amazing dedication!!! Makes you appreciate the slick martingale solution even more.
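If anyone wants the recurrence answer without a closed form, the same expectation drops out by just pushing the probabilities forward numerically. A rough sketch (state = how many flips of progress toward HH you currently have):

```python
# Numerically sum n * P(HH is first completed at flip n) using the progress-state recurrence.
p0, p1 = 1.0, 0.0   # probability of currently having 0 or 1 flips of progress toward HH
expected = 0.0
for n in range(1, 200):          # 200 flips is plenty; the leftover tail is negligible
    finished = 0.5 * p1          # from one flip of progress, an H completes HH at flip n
    expected += n * finished
    p0, p1 = 0.5 * p0 + 0.5 * p1, 0.5 * p0
    # new p0: a T from either state resets progress; new p1: an H from zero progress
print(expected)                  # converges to 6.0
```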
Hi
Great video. I have a quick question about how this would work for calculating the number of tosses it would take to get a 6 followed by a 5 on a fair 6-sided die.
I'm not sure if I'm using the ABRACADABRA theorem right, but for the sequence 6,5 it would be 6^2 + 0.
But when I solve this using a recurrence relation I get 30 as the answer. I don't think I'm using this theorem right; would you be able to clarify this please? Thank you.
Ya, the expected value should equal 36, I think, to roll 6,5 with an ordinary 6-sided die. 36 is the minimum it can take for a string of length 2. Is it possible you mixed up the 5 to have probability 1/5 by accident instead of probability 1/6?
Shit, that was it. Thank you very much.
Great stuff! But I'm curious: if the casino isn't fair, could you still apply it? Let's say E(Profit) = -1; would that just simply increase the average number of flips/rolls until the desired result by 1?
Unfortunately, for unfair casinos the expected profit depends on the betting strategy. For example, a strategy with many many bets would lose more on average than a strategy with just one bet. The miracle is that for fair casinos it's always 0 no matter what, which allows us to use complicated betting strategies to get what we want.
@@MihaiNicaMath wouldn't it be impossible to win in fair casinos tho? Because on average profit = 0? And for unfair casinos < 0. Maybe I just missed something fundamental here
Make a video about the Aviator game's mechanism
How does this work when the coin is biased? If p(H) = 2/3, would E[flips for HH] be 9/4 + 3/2 and E[flips for HT] be 9/2 flips?
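Those numbers match what the 1/p-payout version of the formula gives (1/p + 1/p^2 = 15/4 and 1/(p*q) = 9/2 when p(H) = 2/3). A quick simulation sketch to double-check, with made-up names:

```python
import random

def average_biased_flips_until(target, p_heads=2/3, trials=100_000):
    """Average flips of a biased coin (P(H) = p_heads) until `target` first appears."""
    total = 0
    for _ in range(trials):
        recent, count = "", 0
        while not recent.endswith(target):
            flip = "H" if random.random() < p_heads else "T"
            recent = (recent + flip)[-len(target):]
            count += 1
        total += count
    return total / trials

print(average_biased_flips_until("HH"))  # should land close to 3.75
print(average_biased_flips_until("HT"))  # should land close to 4.5
```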
I'm a bit confused because this analysis would result in E[HT] == E[TH] but if our first flip is a H, HT will always come first and if it is a T, TH will always come first so HT has a 2/3 chance of coming first
Oh I see some extra detail now. In separate coin-flip runs, the expected number of flips for getting HH is more than for HT. If we were playing a game with one run of coin flips, where you win and we stop if HT comes first, and I win when HH comes first, then we both have an equal chance of winning.
@@CS_n00b Yes that's exactly correct! PS to get the result for coins with any probability p there is a formula at 18:58 (which can generalize to the case where different letters have different p values...you just have to pay each gambler fairly for their bets)
This result is very interesting! But just wondering, can you use a similar technique for unfair random processes such as a loaded coin? The most general technique I have seen uses multiplication of the stochastic matrix, which works for unfair processes.
Yes, the result works in exactly the same way for "unfair" coins. The only difference is that to make a fair casino, you have to change what the payouts are. For a coin with a 50% chance to come up any given letter, the fair payout for a $1 bet is $2. (Some people think of this as returning your original $1 plus paying an extra $1). For an unfair coin which has probability p to come up, the fair payout is 1/p. This is why there is a 1/p in the most general formula at 18:44 in the general statement of the ABRACADABRA theorem.
I will also say that the method with matrix multiplication is simply the algebra/calculation version of the "partial progress" graph shown at 3:50 in the video...you can convert the weighted directed graph into a matrix in the standard way and then compute.
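For concreteness, here is what that matrix version looks like for the HH example, sketched with numpy (my own state labels; the standard fact used is that the expected steps to absorption solve (I - Q)t = 1, where Q is the transient part of the transition matrix):

```python
import numpy as np

# Transient states of the partial-progress chain for HH:
# state 0 = no progress, state 1 = last flip was H.  Completing HH is the absorbing state.
Q = np.array([
    [0.5, 0.5],   # from state 0: T stays at 0, H moves to 1
    [0.5, 0.0],   # from state 1: T falls back to 0, H leaves the transient part (HH done)
])
t = np.linalg.solve(np.eye(2) - Q, np.ones(2))
print(t[0])   # expected flips starting from no progress: 6.0
```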
On the other hand, if HH and HT play on the same coin flip sequence, they have an equal chance of winning (?)
@@riccardoplati5902 Yes! You can use the same method as in this video to figure it out by running a separate casino for each player and looking at the balance at the end
Why not include Tails-Tails?
Tails-Tails is mentioned at 2:28 . It's exactly the same as Heads-Heads by the symmetry between heads and tails (or you can redo the ABRACADABRA theorem argument to see it's going to be 4+2=6 flips on average)
The solution in the video is wrong...
HHTT takes more flips to appear, not HTHT like the video says.
Whichever is first, call it X; the sequence that takes longer is the one that gets the next X sooner, and if at the same time, then the next X, and so on.
So HTTTH against HTTHT: HTTHT takes more flips, as it got the next X (now H) at the 4th spot, while HTTTH got it at the 5th.
...
That 'stopping' rule tells nothing about the answer,
but about an internal repeat of the end at the beginning,
like HTHHTTTTTTTTTTTTTTTTTHTHH.. where HTHH repeats
Thanks for the comment: I am not 100% able to understand your solution, but I think you might be mixing up the direct competition (i.e. flipping coins until HHTT or HTHT comes up, and the winner is whoever is 1st) vs the indirect competition (i.e. two coin flippers each write down how many coin flips it took until HHTT or HTHT comes up and then compare the numbers). This video is only about the indirect competition and how many flips they take on average. I think your explanation here is about the direct competition problem (also called Penney's game)
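For what it's worth, the indirect competition is easy to check by simulation, where each pattern gets its own independent runs. A rough sketch with made-up names:

```python
import random

def average_flips_until(target, trials=50_000):
    """Average fair-coin flips until `target` first appears; each trial is an independent run."""
    total = 0
    for _ in range(trials):
        recent, count = "", 0
        while not recent.endswith(target):
            recent = (recent + random.choice("HT"))[-len(target):]
            count += 1
        total += count
    return total / trials

print(average_flips_until("HHTT"))  # lands close to 16 (no self-overlap)
print(average_flips_until("HTHT"))  # lands close to 20 (the HT suffix overlaps the HT prefix)
```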
@@MihaiNicaMath
The way you described those 2, there should be no difference, if both can continue from the previous sequence instead of a full reset from 0
... Penney's game doesn't seem right to me (I just looked at it for 2 mins), as it states that there is always a combination that wins against a certain other one (like rock paper scissors), but by my logic, the combination that is most likely to be first is X followed by 'non X'..
like HTTTTTTTTTTTTTT
Have you run this game in a program to see the results?
My logic is that, to continue from a middle point m, you would have to end with the pattern of m that has the last character opposite, and that matching is less common the more characters there are, as you have to match more in order to continue from m
For 2^n it works: 2^n is bigger than the sum of 2^i, where i = 0,...,n-1
so matching 'first for length 1' gives more speed than matching every single remaining one, if that were possible
Say you have
HHHT(mark1)HHH(mark2)HHHHHHH
In here you can match the pattern HHHT by failing anywhere from mark2 forward and then returning to mark1,
but if you fail before mark2 then you go back all the way to the beginning,
excluding a fail on T that goes to m=1, which I would not consider at all
It's 4 then 3 then 7 matches total... n = 14
So if you're at mark2, you will need a 7 match
You will get to mark2 every time you get the first 7 match
Every time you fail from mark2 you will go back to mark1
From mark1 you have a 1/8 chance to go back to mark2
So you essentially get 1 chance for a 7 match every 7 matches, and every 8th time you'll get another chance at mark2
1/8 is 12.5%.. like there is a 12.5% chance that you can attempt 2 times from mark2 when you come to it
So in totality you need 2^7 * 2^7*(1 - 1/9)... which is basically 2^14.. much slower than the 2^13 of my ideal
(that was for pattern size 4)
For a pattern of size k it's
2^(2k-1) * 2^(n-2k+1)*(1- 1/(2^(k-1)+1))
So for n=100, k=5:
2^(9) * 2^(91)*(1-1/17), which is an even worse improvement... even closer to 2^n
For k=1 it's
2^(1) * 2^(99)*(1/2), which is 2^(n-1).. my ideal (this excludes the very first guess of the very first character)
The very worst pattern is one where you always go to m=0..
which is all the same character... like HHHHHHHHHHHHHHHHHHH
(doesn't work in the formula for k=0)
Markov Chain
Is this theorem really legit? Wow, I'm impressed at how simple this is.
Ya it's legit! The only thing you have to add to the video to finish off the proof would be to verify the conditions of the optional stopping theorem. This is a bit technical, but if you google ABRACADABRA theorem there are some proofs online (or see the video description!)