"We just look at some of the sequence..." But we miss the sum of the series. 😊
Great video. Think deeply about simple things.
Excellent as always. Greetings from Salzburg, Jorge
I absolutely love this video! And I want to add something: there are "summable families," which allow you to group the terms!
They are commutative in every possible way! And they can be indexed by sets other than the natural numbers, like the real numbers, the complex numbers, or a Cartesian product of two or more sets (natural, real, ...).
I think it could be interesting to talk about it!
Oh, and for those who want an example: the family (1/n), where n ranges over the natural numbers whose decimal digits don't include 7, is summable.
You can compute the sum in any order you like and you'll always arrive at the same thing 😎.
You also have (1/(m+n)^4) for m, n in the set of positive natural numbers, which is summable as well. So you can compute the sum going with n=1 and m=1 to 42, then switch to n=64 to 333 and m=9, and so on. You'll always get the same number at the end.
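If you want to see the order-independence concretely, here's a quick Python sketch (my own illustration, not from the video) that adds a finite chunk of the digit-7-free harmonic family in two different orders, using exact fractions so there's no floating-point noise:

```python
from fractions import Fraction
import random

# Terms 1/n for n up to 200 whose decimal digits avoid '7' --
# a finite piece of the summable family described above.
terms = [Fraction(1, n) for n in range(1, 201) if '7' not in str(n)]

shuffled = terms[:]
random.shuffle(shuffled)  # a random reordering of the same terms

sum_in_order = sum(terms, Fraction(0))
sum_shuffled = sum(shuffled, Fraction(0))

print(sum_in_order == sum_shuffled)  # True: the order never matters
```

Of course, a finite sum is always order-independent; summability is precisely the statement that this remains true for the whole infinite family.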
With rearranging you can make this sum "equal" any whole number
I was thinking of a maybe-not-correct but useful term I made up: "multi-convergence" or "convergence range." Like {0, 1} would be the "range" it converges on. Not sure if it could or should include the f(n) sequence too, if that makes sense.
I find "bimodal" useful.
The term you are looking for is "accumulation point." In this case, the sequence of partial sums s_n has no limit, but it has two accumulation points, namely 0 and 1. For more info, check out the Wikipedia article on "limit point."
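To see the two accumulation points concretely, here's a small Python sketch (my own illustration) that lists the partial sums of 1 - 1 + 1 - 1 + ...:

```python
# Partial sums of Grandi's series 1 - 1 + 1 - 1 + ... :
# the sequence s_n oscillates, so it has no limit, but every
# partial sum lands on 0 or 1 -- the two accumulation points.
partial_sums = []
s = 0
for n in range(100):
    s += (-1) ** n  # terms are +1, -1, +1, -1, ...
    partial_sums.append(s)

print(set(partial_sums))  # {0, 1}
```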
You have a great channel and a gift for explaining things that would make you a great teacher. Keep up the good work. I sometimes explain why order matters this way:
(a) 1/2 + 1/4 + 1/8 + 1/16 + ... converges. All the terms are positive, so the order in which you add them does not matter.
So the "sibling" alternating series (b) 1/2 - 1/4 + 1/8 - 1/16 + ... will also converge; the sum must be smaller than (a). Again, the order in which you add them does not matter. We say that (b) converges absolutely, since the series of absolute values also converges.
(c) 1/1 + 1/2 + 1/3 + 1/4 + ... diverges. However, the alternating series
(d) 1/1 - 1/2 + 1/3 - 1/4 + ... does converge (to ln(2)), but not absolutely, since the series of absolute values diverges.
This means that the order in which you add up (d) does matter!! (d) does not converge absolutely.
Note that the two sub-series (e) 1/1 + 1/3 + 1/5 + ... and (f) -1/2 - 1/4 - 1/6 - ... both diverge! That means if you fiddle with the order in (d), all bets are off: you can actually make (d) converge to any real number you like, as long as you pick an appropriate order. As a good homework exercise, explain how you can fiddle with the order in (d) to make it converge to, oh I don't know, say 61 for example. So cool.
And yet the alternating "sibling" of (e), namely 1/1 - 1/3 + 1/5 - ..., does converge to pi/4, and again order matters.
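A hint for the homework, as a Python sketch (my own greedy implementation, with a small target instead of 61 so it settles quickly): take positive terms until you overshoot the target, then negative terms until you undershoot, and repeat.

```python
# Greedy rearrangement of the conditionally convergent series
# 1 - 1/2 + 1/3 - 1/4 + ... : add positive terms (1, 1/3, 1/5, ...)
# until the running sum exceeds the target, then negative terms
# (-1/2, -1/4, ...) until it drops below, and repeat. Because both
# sub-series diverge, this works for any real target.
def rearranged_sum(target, num_terms=100_000):
    pos, neg = 1, 2  # next odd (positive) and even (negative) denominators
    s = 0.0
    for _ in range(num_terms):
        if s <= target:
            s += 1.0 / pos
            pos += 2
        else:
            s -= 1.0 / neg
            neg += 2
    return s

# The running sum hovers ever closer to the target as terms shrink.
print(rearranged_sum(0.25))  # close to 0.25
```

The error is bounded by the size of the most recently added term, which shrinks to zero, so the rearranged series really does converge to the chosen target.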
you're the man, mu prime!
Elegantly explained Boss 💜
What's the sum 1 + 1 - 1 + 1 + 1 - 1 + ...... = ?? 🥺💖
It's ½, but only if you use a different, more expansive definition of an infinite sum. See Cesàro summation, Abel summation, and Borel summation for details. (Warning: advanced math required.) For extra fun, check out Ramanujan summation, wherein 1+2+3+4+... = -1/12.
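As a quick numeric illustration of Cesàro summation (a sketch of mine, not from the video): average the partial sums of 1 - 1 + 1 - 1 + ... and watch the averages settle at 1/2, even though the partial sums themselves never converge.

```python
# Cesàro summation of Grandi's series 1 - 1 + 1 - 1 + ... :
# the partial sums s_n bounce between 1 and 0, but their running
# averages (the Cesàro means) converge to 1/2.
s, running_total = 0, 0
cesaro_means = []
for n in range(1, 10_001):
    s += (-1) ** (n - 1)  # n-th partial sum: 1, 0, 1, 0, ...
    running_total += s
    cesaro_means.append(running_total / n)

print(cesaro_means[-1])  # 0.5
```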
@@tomkerruish2982 OP may have had a typo in their equation "1+1-1+1..." (they probably meant "1-1+1-1..."). But yeah, if we're talking about the video's summation, then with a little hand-waviness you can let s = 1 - 1 + 1 - 1..., then s = 1 - s, so s = 1/2. And the OP's post is asking for 1 + s, so it'd "be" 3/2. However, by the logic of convergent sequences, this sequence of partial sums has no limit, and therefore the series has no final value. This is interesting stuff.
@@tomkerruish2982 yep, these are results in what can only be described as different systems. The equal sign becomes a fickle mistress.
@@tbonbt8271 Thanks! I completely missed that distinction in the post.
@@eggtimer2 That is because the symbol a(0) + a(1) + •••, or even more generally, f[a(0)•g[a(1)•h[a(2)•...]]], where • is any binary operation and f, g, h, ... is a sequence of functions, has no inherent meaning of its own. It is what I like to call "gibberish." The string of symbols looks like it should have some meaning, but it does not. So when we go and try to give meaning to the symbol anyway, we can come up with mutually inconsistent interpretations.
For example, if you consider f(0) = 0, f(1) = (-1)^0, f(2) = (-1)^0 + (-1)^1, f(n) = (-1)^0 + ••• + (-1)^(n - 1), then we know f diverges. This much is a fact. No one disputes this. On the other hand, though, if you consider a(n + 1) = 1 - a(n), then a(n + 2) = 1 - [1 - a(n)] = 1 - 1 + a(n), a(n + 3) = 1 - 1 + 1 - a(n), and so on. If a(0) = 1/2, then a converges to 1/2, and a diverges if a(0) equals anything else. So strictly speaking, there is no reason to think of 1 - 1 + 1 - 1 + ••• as anything other than what a converges to, especially in light of the ••• at the end.
More importantly, though, the real issue at hand, aside from the semantics, is that we insist on thinking of symbols such as 1 - 1 + 1 - 1 + ••• as "summation," because of the + symbol that keeps showing up. That is just not the correct way to do it. It just does not work. Even if you try, the mathematics breaks down into paradoxes and contradictions. No, a correct understanding of this issue would recognize that (0) + is really a binary operation, and you need to get a good sense of what exactly it means when we write a + b + c (here, something about associativity comes to mind); and (1) at the end of the day, all we are trying to do is apply some kind of operation to a sequence: here, the sequence of interest is the sequence g defined by g(n) = (-1)^n.

All of these different interpretations of the gibberish that is the symbol 1 - 1 + 1 - 1 + ••• correspond, in some sense, to different sequence operations being applied to the sequence g. Some operations are only defined for sequences that converge, but other operations are extensions of those, and they work on a larger class of sequences. These operations work just fine for many of our purposes and are meaningful. As such, saying 1 - 1 + 1 - 1 + ••• = 1/2 is ultimately no different than saying that (-1/2)! = sqrt(π). It is technically an abuse of notation, but all notation in mathematics is arbitrary anyway, and as long as the notation effectively communicates the concept you are trying to work with, there is no problem at hand.
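For a concrete example of such an extended operation, here's a small Python sketch (my own illustration) of Abel summation applied to the sequence g(n) = (-1)^n: weight each term by x^n, sum the now absolutely convergent power series, and let x approach 1 from below.

```python
# Abel summation of g(n) = (-1)^n: the weighted sums equal
# 1/(1 + x) (a geometric series), which tends to 1/2 as x -> 1.
def abel_weighted_sum(x, num_terms=10_000):
    return sum((-1) ** n * x ** n for n in range(num_terms))

for x in (0.9, 0.99, 0.999):
    print(x, abel_weighted_sum(x))  # approaches 0.5 as x -> 1
```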
I think this is undefined because it will oscillate between 0 and 1
If I got 1.57432116721573… did I copy?
I may have misunderstood what you meant to say at the end of the video, but it definitely seems to me as though you made a mistake. You said that if the limit of the sequence of partial sums exists, then any grouping will give the same limit, but according to the Riemann rearrangement theorem, this is not true, unless the sequence converges absolutely.
Grouping is not the same as rearrangement. Like I showed in the video, I'm not referring to changing the order of terms in the series, but rather to evaluating groups of terms at a time, which is equivalent to considering a subsequence of the sequence of partial sums. If a sequence converges, then any subsequence also converges to the same limit.
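A quick Python sanity check of the grouping claim (my own illustration, using a simple geometric series): summing term by term, or in pairs, gives the same limit, because the grouped partial sums are a subsequence of the original partial sums.

```python
# Grouping vs. rearrangement on the convergent series
# 1/2 + 1/4 + 1/8 + ... : evaluating groups of terms at a time
# picks out a subsequence of the partial sums, so the limit is
# unchanged. (Rearrangement would change the ORDER of terms,
# which is a different operation entirely.)
terms = [0.5 ** n for n in range(1, 51)]

ungrouped = sum(terms)
paired = sum(terms[i] + terms[i + 1] for i in range(0, 50, 2))

print(ungrouped, paired)  # both essentially 1.0
```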
@@MuPrimeMath Thank you for clarifying that. I am much too accustomed to seeing people on YouTube use the word "grouping" rather liberally, where any rearrangement is considered a "grouping." I was under the impression that you were doing the same.
and still this infinite series manages to be equal to 1/2...
The series doesn't equal 1/2 under the definition of infinite sum described in this video, since it diverges. There are other definitions of infinite sum where the series equals 1/2, but they aren't the standard definition!
What does "equal" mean?
It's not 1/2 at all; rearranging a conditionally convergent alternating sum can produce a new value, according to the Riemann series theorem (check Wikipedia for more information). And yet, why do people say 1/2? Well, that's the result of another definition of infinite series, such as Cesàro summation, Abel summation, etc., but according to the more straightforward definition, the series diverges.
Eh. I think both sides in this discussion are talking past each other. As Epic Math Time asked, what does "equal" mean? But also, what even is a series? What is summation? I think both sides are taking "definitions" for granted and treating their own definitions as the standard, even though there is genuinely no such thing as "the standard definition," and at the end of the day, what a particular notation communicates depends on the context and the application of the work being done.