For anyone wondering about the thumbnail, which shows the sum of 1/(1 + sqrt(2) + sqrt(3) + . . . + sqrt(n)): This sum converges. This is because 1 + sqrt(2) + . . . + sqrt(n) is Θ(n^(3/2)) (this can be seen by using upper/lower integrals of sqrt(x)).
Or even simpler: 1 + sqrt(2) + . . . + sqrt(n) > (n/2) *sqrt(n/2).
or because it's basically = 1/sqrt(infinity) = 1/infinity?
@@12321dantheman that's not enough for the convergence of a sum. As an example, the harmonic series: its terms go to 0, but it still diverges.
@@12321dantheman you forgot that we add up all the fractions
@@yuvalid4156 for the harmonic series the terms a(n) shrink too slowly (like 1/n), while for 1/sum(sqrt(n)) it's the other way around: b(n+1) < b(n) fast, like 1/n^(3/2), fast enough to converge.
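A quick numeric sanity check of the claim in this thread (a Python sketch, not from the video): the partial sums of 1/(sqrt(1) + sqrt(2) + ... + sqrt(n)) flatten out instead of growing without bound.

```python
# Partial sums of the thumbnail series: 1/(sqrt(1) + sqrt(2) + ... + sqrt(n)).
# If the series converges, doubling n should barely move the partial sum.
import math

def partial_sum(n):
    denom = 0.0   # running denominator: sqrt(1) + ... + sqrt(k)
    total = 0.0   # running partial sum of the series
    for k in range(1, n + 1):
        denom += math.sqrt(k)
        total += 1.0 / denom
    return total

s1 = partial_sum(100_000)
s2 = partial_sum(200_000)
print(s1, s2)  # the two values are nearly identical
```

Doubling the number of terms changes the partial sum by only a few thousandths, consistent with convergence.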
I love it when your instruction is easy enough to understand for us mere mortals with high school math. It makes us feel smarter :). Tx for this excellent vid!
Why do you say mere mortals, like you're less than? Ramanujan was just a mere mortal too, as was Dirac, etc.
@@angelmendez-rivera351 uhhh yeah. It’s a symmetrical open curve formed by the intersection of a circular cone with a plane at a smaller angle with its axis than the side of the cone…..
No "mortal" with just high school math can understand it easily. If you truly can understand it with little to no difficulty, you definitely possess some degree of mathematical talent beyond just "normal mortals". You aren't the next Ramanujan, but you're noticeably above normal people, quite easily so in fact
@@musaratjahan7954 I doubt that very much, but thanks though :). It’s possible, perhaps, that the educational system pre-uni, in The Netherlands, where I’m from, goes beyond what high school in the US covers. In the last two years (we’re 17 or 18 then) we cover limits, (partial) differentiations, integrals and complex numbers. And a bunch of stuff I’ve forgotten about ;). We did do (infinite) series and sums, though I don’t remember that very well.
7:05 Ah, Doktor Penn, you always know how to leave the simplest of questions to your viewers. It is most definitely k=1.
Very cool, and a clear and sweet explanation. I am an Arab from Syria. I am preparing a PhD thesis in applied mathematics. I have been following you for a year.
The series in the thumbnail converges though
I used a similar strategy.
f(x) = x^(1/x) achieves its maximum at x = e and is decreasing after e. So for x >= 3, f(x) <= 3^(1/3) < 2.
The series in the thumbnail is the reciprocal of the sum of the square roots sqrt(1) + sqrt(2) + ... + sqrt(n). If anybody is wondering, that series is convergent and can be checked by using the fact that the series with general term 1/n^(1+epsilon) is convergent for any epsilon > 0.
how do we know that sqrt(1) + sqrt(2) + ... + sqrt(n) is asymptotic to n^1.5?
@@joeg579 For even n, n/2 of the summands are greater than or equal to sqrt(n/2). For odd n, (n-1)/2 of the summands are greater than or equal to sqrt((n-1)/2). In either case the sum is greater than (n-1)/2 times sqrt((n-1)/2), so after separating the n = 1 term and taking the reciprocal we see that the sum is smaller than 1 + 2sqrt(2)∑ 1/n^(3/2).
@@Jaeghead That's precisely the argument I also used.
@@joeg579 It's not necessary to be asymptotic, it suffices to get an upper bound.
Oh, I thought I had misread the thumbnail, as I was coming up with the same idea and conclusion. And then the video takes nth roots.
best way to prove the limit of n^(1/n) is 1: we know for n>1 n^(1/n)>1. so, we can define a new positive sequence e_n = n^(1/n) - 1. this means n^(1/n) = e_n + 1. take the nth power of both sides to get n = (e_n + 1)^n and use the binomial theorem: n = 1 + ne_n + (n(n-1)/2!) * e_n^2 + … + e_n^n, which gives us the inequality n > (n(n-1)/2!) * e_n^2 => 2 > (n-1)e_n^2 => 0 < e_n < sqrt(2/(n-1)) (recall e_n is positive). taking the limit, e_n is 0 using squeeze theorem and so n^(1/n) is 1.
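A numeric check of the bound derived in this comment (a Python sketch, not part of the original argument): the binomial-theorem step gives 0 < n^(1/n) - 1 < sqrt(2/(n-1)) for n > 1.

```python
# Verify the squeeze bound 0 < n^(1/n) - 1 < sqrt(2/(n-1)) over a range of n.
import math

worst = 0.0  # largest observed ratio e_n / bound; should stay below 1
for n in range(2, 10_000):
    e_n = n ** (1.0 / n) - 1.0
    bound = math.sqrt(2.0 / (n - 1))
    assert 0.0 < e_n < bound
    worst = max(worst, e_n / bound)
print(worst)  # always below 1
```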
An easier way to prove that the series diverge is using Stolz-Cesaro Theorem to prove that, calling a_n the denominator of the argument of the series, you have that a_n/n tends to 1 as n approaches infinity. So the series has the same behavior of the harmonic series, which diverges.
Is the Stolz-Cesaro theorem taught in an elementary calculus class? I ask because I'm wondering if the reason Prof. Penn used the approach he chose is that he wasn't sure he could sneak in another convergence theorem without making the video more complicated than it needed to be.
I believe he has already made several videos on the Cesaro-Stolz theorem and just wanted to argue this one more directly.
This would have been a lot easier to work out from the thumbnail, if the thumbnail were accurate. The sum in the thumbnail is roughly 2/3 Zeta(3/2) (where Zeta(x) is the Riemann zeta function). This comes from a first-order approximation of the sum of consecutive square roots, using the Euler-Maclaurin formula. I gave up on getting an accurate answer after I realized I'd already spent 90 minutes on a problem tackled in a 9 minute video. Imagine my surprise when I discovered I had been solving the wrong problem.
Correction: the sum in the thumbnail would be approximately 3/2 Zeta(3/2). This is because the sum of the first n square roots is approximately 2/3 n sqrt(n). The missing correction terms in that approximation have pulled the true ratio down to about 1.21, at the 2 millionth partial sum.
F
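A quick numeric look at the two estimates discussed above (a Python sketch, not from the comments): the sum of the first n square roots should be about (2/3) n^(3/2), which is what feeds the (3/2)·Zeta(3/2) estimate for the thumbnail series.

```python
# Check that sqrt(1) + sqrt(2) + ... + sqrt(n) is approximately (2/3) n^(3/2).
import math

n = 1_000_000
sqrt_sum = sum(math.sqrt(k) for k in range(1, n + 1))
ratio = sqrt_sum / n ** 1.5
print(ratio)  # close to 2/3
```

The ratio lands within a fraction of a percent of 2/3, matching the leading Euler-Maclaurin term.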
In the thumbnail problem, you can also just bound the denominators by the integral of sqrt(x), and then the thumbnail series is bounded by a p-series with p=3/2
Once you've proven that the nth root of n converges to 1, you can just name M the maximal element of the sequence "nth root of n" and lower-bound the kth term of the sum by 1/Mk, hence the sum diverges.
(you know that a maximal element of the sequence exists by taking a fixed value for epsilon (say 0.1), you know that after some index N, all the elements are below 1+eps, you can then define M as the maximum of the N-1 first elements of the sequence and 1+eps)
You know a maximum exists by the Weierstrass theorem: f is continuous, f(1) = 1, and lim f(x) = 1 as x goes to infinity, so the maximum is attained on some compact interval.
An extremely lightweight solution (no L'Hopital!):
For all n>=1, n < 1 + n < e^n (which is immediate from the Taylor series of e^x, or from the convexity of e^x), and hence n^(1/n) < e.
Thus S_n: = sum_{m=1}^n m^(1/m) < e * n, and thus 1/S_n > 1/(e*n) and so the series diverges by the comparison test and the divergence of the Harmonic series.
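A numeric sanity check of this lightweight bound (a Python sketch, assumptions as in the comment): each m^(1/m) < e, so the partial sums S_n of m^(1/m) stay below e·n.

```python
# Verify m^(1/m) < e for each m, and hence S_n < e*n for the partial sums.
import math

S = 0.0  # running partial sum of m^(1/m)
for m in range(1, 500):
    assert m ** (1.0 / m) < math.e
    S += m ** (1.0 / m)
    assert S < math.e * m
```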
To show that x^(1/x) < 2, note that the function is decreasing for x >= 3. (Its derivative is negative for x >= 3.) So n^(1/n) <= max(1, 2^(1/2), 3^(1/3)) < 2.
For any positive integer i: i < 2^i.
Why use induction instead of simply continuing to treat x^(1/x) as a continuous function and finding the global maximum?
Maximizing x^(1/x) is the same as maximizing ln(x)/x, since ln is strictly increasing. The derivative of ln(x)/x is (1 - ln(x))/x^2, which is zero or undefined when
ln(x) = 1 or x^2 = 0, i.e. x = e or x = 0.
We can rule out x = 0 since it is outside of the domain, so we know the maximum occurs at one of the endpoints 1 or infinity (where the function equals 1 or tends to 1) or at e.
e^(1/e) is clearly less than the square root of 3 (since e is between 2 and 3 and 1/e < 1/2), so it is clearly less than 2. Since this is the global maximum, n^(1/n) is always less than 2.
The induction proof seems nicer, to be honest.
right, n^(1/n) has its max at x = e, so all values lie between 1 and e^(1/e) < 2
You can also show the limit of n^(1/n) simply with the epsilon definition
Let ε>0 choose N =2/ε^2 then for all n>N:
2/ε^2 < n
2 < nε^2
1 < 1/2 nε^2
n -1 < n(n-1)/2 ε^2
n < 1 + n(n-1)/2 ε^2
n < (1 + ε)^n (by the binomial theorem, since (1 + ε)^n ≥ 1 + n(n-1)/2 ε^2)
|n^(1/n) - 1| < ε
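The epsilon argument above can be checked numerically (a Python sketch, not part of the proof): with N = 2/ε², every n > N should satisfy |n^(1/n) - 1| < ε.

```python
# For a few choices of eps, confirm that n > N = 2/eps^2 forces
# |n^(1/n) - 1| < eps.
for eps in (0.5, 0.2, 0.1):
    N = int(2.0 / eps ** 2) + 1
    for n in range(N + 1, N + 2000):
        assert abs(n ** (1.0 / n) - 1.0) < eps
```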
Instead of using 2 one could also take the maximal value of n^(1/n). Arguing that x^(1/x) is a continuous function, we deduce that the maximal value must be finite. Then we don't need any further proof, and we get one over that maximal value instead of 1/2 as a common factor in the end.
Nice problem.
These series problems are a nice review of calculus while also extending it to deal with discrete objects.
No limiting argument is necessary: we can immediately observe that the nth root of n can never be < 1 (or else the nth power of that root would be less than 1) and can never be greater than 2, because (2^n)/n > 1 for all naturals; take the nth root of both sides.
The only time when the left and right sides of the inequality k+1≤2k match is when k=1
The denominators in the general term behave for large n, by the criterion of comparing the sum to an integral from 1 to n, like n + (1/2)log^2 n + O(1) = n(1 + o(1)) as n goes to infinity, and so the series diverges like the harmonic series.
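The asymptotics claimed in this comment can be tested numerically (a Python sketch, not part of the comment): if ∑ k^(1/k) = n + (1/2) log² n + O(1), then subtracting the first two terms should leave something that settles toward a constant.

```python
# d(n) = sum_{k<=n} k^(1/k) - n - (1/2) log^2(n) should stabilize as n grows.
import math

def d(n):
    s = sum(k ** (1.0 / k) for k in range(1, n + 1))
    return s - n - 0.5 * math.log(n) ** 2

d1, d2 = d(100_000), d(200_000)
print(d1, d2)  # nearly equal
```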
This is a good series. It applies great use of a limit test with a comparison test to help test convergence.
Thanks a lot sir ... this was the coolest explanation I have ever seen .
Hey, Michael!! The thumbnail is wrong.
There are no root powers in the picture
On the board is the multiplicative inverse of the sum of n^(1/n).
The 'dot dot dot' notation is confusing, if not false.
What you end up solving is the sum of the inverses of the partial sums of n^(1/n).
The series on the thumbnail converges
Proved this in calc 20 years ago. But good to see it again.
0:55 Some sort of idea for a plan of attack... Beautiful...
Curious, but I'm an amateur: we know that the sum from 1 to inf of 1/n is divergent. And I could be very wrong about this, but wouldn't any power of n which makes the denominator smaller also be divergent?
isn't the fact that the nth root of n is >= 1 enough to say it diverges?
really love it! keep it up sir
hmm, what if you multiply by (-1)^n, does that sum converge?
The series in thumbnail is different, clickbait?
Can we not use the 1/n^p rule, which says the series converges for p > 1?
YouTube makes enough money without including ads before, within, and after math tutorials.
Take approximations of the lower terms: 1, 1.1, 1.3, 1.7, ..., up to 1.99; each term is below 2, so the denominator is at most 2n.
What happens if you only have square roots instead of nth roots in the denominator, though?
Since any positive integer power of any positive integer is greater than or equal to one (and only equal to one in the trivial case), why did you allow for the nth root of n to be less than 1?
It didn’t matter, just as long as it remains positive so the direct comparison test can still be used.
I really REALLY need to learn to not watch your videos at 1:30am. It's fascinating, but the logical part of my brain that actually understands what the hell you're talking about is already asleep.
You can also show n^(1/n) < 2 with the derivative
d/dx (x^(1/x)) = -x^(-2 + 1/x) (-1 + log(x)) := f(x)
lim of f(x) as x -> inf is 0 (easy to see)
lim of f(x) as x -> 0+ is also 0 (also easy to see)
now solve f(x)=0 => log(x) = 1 => x = e
f is positive before e and negative after, so x = e is the only maximum of x^(1/x)
e^(1/e) ≈ 1.4447 and 1.4447 < 2
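The derivative argument above can be double-checked numerically (a Python sketch, not from the comment): x^(1/x) peaks at x = e, and the peak value e^(1/e) is well below 2.

```python
# Confirm that x^(1/x) is largest at x = e and that the peak is below 2.
import math

f = lambda x: x ** (1.0 / x)
peak = f(math.e)
assert peak < 2
assert f(math.e - 0.5) < peak and f(math.e + 0.5) < peak
print(peak)  # about 1.4447
```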
But I figured that, since the sum in the denominator was spiraling up towards infinity, that meant the terms had to converge to zero, so the whole sum had to converge?
5:29 What prevents us from bounding the n-th root of n by 1 itself? Isn't this the limit we argued for?
Mainly the fact that the n-th root of something bigger than 1 (and thus the n-th root of n) is always bigger than 1 ;)
There is no use for it. It would only show that the series is at most the harmonic series, which tells us nothing, since the harmonic series diverges to infinity.
Actually, you can use the Stolz-Cesàro theorem for sequences instead of l'Hopital's rule and not switch n to x.
Still waiting for that sequel to the Virasoro algebra video, calculating why h was -1/12.
The fact that the 1 in the denominator doesn't have a root sign is disturbing me.
What if instead of the n-th root of n in the denominator, we had square roots of n, would it then converge?
the sum of sqrt(k) from k=1 to n is bounded by (2/3)n^(3/2) from below and (2/3)((n+1)^(3/2)-1) from above, and the series of 1/n^(3/2) converges, so it would converge as well
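The integral sandwich stated in this reply can be verified numerically (a Python sketch, not from the comment): (2/3) n^(3/2) <= sqrt(1) + ... + sqrt(n) <= (2/3)((n+1)^(3/2) - 1).

```python
# Check the lower/upper integral bounds on the partial sums of sqrt(k).
import math

s = 0.0  # running sum sqrt(1) + ... + sqrt(n)
for n in range(1, 5000):
    s += math.sqrt(n)
    lower = (2.0 / 3.0) * n ** 1.5
    upper = (2.0 / 3.0) * ((n + 1) ** 1.5 - 1.0)
    assert lower <= s <= upper
```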
You don't need induction to prove the lemma: if the nth root of n were equal to or bigger than 2, then n would be equal to or bigger than 2 to the n, which is absurd.
what about the series
1/1+1/√2+...+1/(n)^(1/n)
(instead of having everything on the denominator)
The terms of the series don't tend to zero so it trivially diverges.
1/(1+√2+√3+...+√n)
What about this series?
@@sambhusharma1436 0
@@brunojani7968 series will be convergent or divergent?
@@sambhusharma1436 convergent, terms are like n^(-3/2)
7:13 Only equal at k=1
It should be obvious that n is greater than 1 to the nth and less than (1+1) to the nth.
Too much time was spent on showing the nth root of n is between 1 and 2.
Is it just me or there's no audio?
Is there a series that is "less" than harmonic (i.e. a_n < 1/n after some n) but still diverges?
Yes, clearly. You can simply consider a_n = A/n for some 0 < A < 1.
Remembering the proof that harmonic series diverges I guess there is. But then a less rigorous question would be: what other interesting series diverge and are less than harmonic?
Or famously, the sum of the reciprocals of the primes diverges as well.
@@buchweiz if you still want to index over all of the naturals, then the sum of 1/(n log(n+1)) diverges.
en.wikipedia.org/wiki/Large_set_(combinatorics)
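A quick numeric illustration of the example above (a Python sketch, not from the thread): the series ∑ 1/(n·log(n+1)) has terms eventually below 1/n, yet its partial sums keep growing, roughly like log(log(n)).

```python
# Partial sums of 1/(n*log(n+1)) keep creeping upward (divergence, slowly).
import math

def partial(n):
    return sum(1.0 / (k * math.log(k + 1)) for k in range(1, n + 1))

p1, p2 = partial(100_000), partial(1_000_000)
print(p1, p2)  # still growing between 10^5 and 10^6 terms
```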
"And that's a good place to stop"
me: immediately press space to pause and ponder
*ads*
Me: SAD
If you're watching in a browser already, why not just install an ad blocker?
When an ad pops up and makes you forget what you just watched
I love induction
Very cool and simple!
That must be the simplest problem he has ever done...
What if you had (-1)^n in the numerator instead of 1?
1 over a number clearly bigger than 1. How can this diverge? Can't understand this at all.
9:03
I like the rigour of using 'x' instead of 'n', so as not to create the impression of differentiating a function that isn't even continuous.
I would argue that the sequence with n (as a function N -> R) is indeed continuous, yet it is not differentiable. Still, I agree that using 'x' instead of 'n' helps with understanding that a sequence might be a function, but not all functions are differentiable.
Just because something is larger than a diverging series doesn't necessarily mean it diverges, though, right? So I wouldn't say it like that, but other than that the proof is sound. Did anyone else think he could've worded that better, maybe?
1/(1+√2+√3+...+√n)
What about this series?
It's equal when k=1. Am I a real mathematician now, Professor?
You might be in the running for the Fields medal.
It's a pity that the thumbnail very often looks like, but is really very different from, the problem discussed in the video.
That’s not the same series as in the video thumbnail. Dislike for inaccuracy.
thanks
Thumbnail is off btw
Cool video!
PS : the URL contains "Joe". Joe who?
joe mama
For an infinite sum of a sequence to converge, the limit of the sequence must equal 0. So, when we have that the limit of the nth root of n equals 1, we have that the sequence that defines the series also has a limit of 1, and because the limit of the sequence is not 0, the series must diverge.
Can anyone tell me how to solve this if all the radicals are the same, that is, either all square roots or all cube roots and so on?
You just compare it with an integral. If you have square roots, then compare the reciprocal of the sum of square roots from 1 to n with the reciprocal of the integral from 0 to n of the square root of x. You have to show that the reciprocal of the sum is smaller than the reciprocal of the integral, and then, when you compute the integral, just use the p-series test to see that it converges; then the original converges by the direct comparison test.
@@ethanbottomley-mason8447 thank you very much sir
This seems like a parody to me. The nth root of a natural number is always greater than or (if n = 1) equal to 1.
Here's my proof of ∀n : ℕ, n > 0 . root n n < 2
root n n < 2 ⟺ n < 2^n (which is obviously true, so we're on the right track)
0 < 1, but let's not take the 0th root of 0, so 1 < 2 is our base case
2^(n + 1) = 2 ⋅ 2^n = 2^n + 2^n
1 < 2^n => n + 1 < n + 2^n
n < 2^n => n + 2^n < 2^n + 2^n
=> n + 1 < 2^n + 2^n = 2^(n + 1)
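The statement this induction establishes can also be machine-checked over a range of n (a Python sketch, separate from the proof): n < 2^n, hence the n-th root of n stays below 2.

```python
# Spot-check the induction's conclusion: n < 2^n and n^(1/n) < 2 for n >= 1.
for n in range(1, 1000):
    assert n < 2 ** n
    assert n ** (1.0 / n) < 2.0
```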
Well I have understood 100%, but I would never ever come up with this on my own ... :-(
No, it converges
thumbnail sum in a sum???
It seems pretty obvious that the nth root of n is between 1 and 2, since 1^n < n < 2^n
That is false. 1^1 is not less than 1. They are equal.
Isn't there a known limit that says the nth root of any fixed number tends to 1? Am I too advanced in analysis to take it for granted? 😇
basically, all of calc math is known, though students are still supposed to prove everything by hand
@@ВасилийДрагунов-н8т oh yeah I forgot,it's long though😅
L'Hopital
7:00
So dirty
By a very non-rigorous approach, here's what I thought looking at this.
n^1/n is strictly larger than 1 for n>1. But obviously it will get smaller as n grows, as the base increases linearly with n and the exponent decreases inversely. √2 < 2. So in all, the sum is between half the harmonic series and the harmonic series. Both of those diverge, so this diverges too.
That would be n^(1/n), because of the Order of Operations.
@@robertveith6383 that's true, but I think it's pretty clear what I mean.
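The sandwich intuition in this thread can be checked numerically (a Python sketch, not from the comments): each k^(1/k) lies in [1, 2), so the denominators are between n and 2n, and the partial sums sit between half the harmonic series and the harmonic series.

```python
# Verify: 0.5 * H_n <= sum_{k<=n} 1/(1^(1/1) + ... + k^(1/k)) <= H_n.
denom = 0.0  # running denominator: sum of k^(1/k)
S = 0.0      # partial sums of the series from the video
H = 0.0      # harmonic partial sums
for n in range(1, 10_000):
    denom += n ** (1.0 / n)
    S += 1.0 / denom
    H += 1.0 / n
    assert 0.5 * H <= S <= H
```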
"our goal object is larger", doesn't that mean it can potentially converge? You still need to prove it's not as large as the harmonic series. If it's larger than that, it's converging. It's easy to show, but I feel that last statement should be clarified. After all, you're upper bounding the denominator, while it's not immediately clear what the lower bound is.
Edit (few days hence): my statement makes no sense, I mixed/confused con- vs divergent here…
He says the answer in the end diverges, which means it doesn't converge, because it's bigger than a sum which diverges, and the only thing bigger than infinity is infinity.
No, since every term of the original series is greater (or equal) to the corresponding term in the harmonic series, summing all of them up must give a sum greater than that of the harmonic series (you would do this with the partial sums to be more rigorous). Since the harmonic series diverges, i.e. its sum "equals" ∞, we have that the sum of the original series is greater than "∞", and no real number satisfies such a property.

This is sometimes called the minorant criterion, and it works in the opposite way for convergence as well. In that case, you look for a series known to be convergent where every term is greater than the corresponding term of the series whose convergence you would like to investigate. Provided both series have only positive terms, the majorant then establishes the convergence of the series you are interested in (this can be generalised at least to the extent of working with the absolute value of the terms and hence establishing absolute convergence).
"Doesn't that mean it can potentially converge?"
No. If a series diverges to infinity then a series of larger numbers will also diverge to infinity.
@@alexanderbasler6259 thanks, that’s an excellent explanation. @all Rereading my own comment, I must’ve confused the terms con- vs divergent. Seeing it again, it doesn’t make any sense at all to myself anymore 😵💫
Hi 🤩
2^e > 2^2 > e. Thus, 2 > e^(1/e). Since (x^(1/x))' = (x^(1/x))(1 - ln(x))/x^2 for x > 0, it follows that x^(1/x) <= e^(1/e) < 2 for all x > 0.