A more sophisticated name for "Junk" would be Lagrange remainder.
"For the sake of brevity we will always refer to this remainder as junk" quoting Euler
The Lagrange remainder is just a way of expressing the difference between a function and its approximation. There are also the Cauchy remainder, the integral remainder, and maybe more of them. I think "junk" is a better term for this kind of junk.
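For reference, here are the standard formulas for the three remainder forms named above (a standard statement, not from the video): for f that is (n+1)-times differentiable on an interval containing a and x, the remainder R_n(x) = f(x) − T_n(x) can be written as

```latex
\begin{aligned}
R_n(x) &= \frac{f^{(n+1)}(\xi)}{(n+1)!}\,(x-a)^{n+1}
&&\text{(Lagrange, some $\xi$ between $a$ and $x$)}\\
R_n(x) &= \frac{f^{(n+1)}(\xi)}{n!}\,(x-\xi)^{n}\,(x-a)
&&\text{(Cauchy)}\\
R_n(x) &= \frac{1}{n!}\int_a^x f^{(n+1)}(t)\,(x-t)^{n}\,dt
&&\text{(integral form, $f^{(n+1)}$ continuous)}
\end{aligned}
```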
Probably something like O(f^(n)), since it might not have an (n+1)-th derivative.
Dude, I just wanted to thank you. You are saving people from wasting hours searching for a simple proof. ❤
I always wanted this proof
I swear I've been looking for this since the moment my eyes fell on the Taylor series in my calculus book in high school.
Could you make a part 2 that addresses convergence?
This is the best proof of Taylor's theorem, in my opinion. The others are okay, but none of them is so simple and yet powerful enough to delve into the question of which functions can and can't be represented by an infinite polynomial. Dr. Peyam, I would recommend that you make a video on Taylor polynomial uniqueness; it's a beautiful fact in math.
Love this proof, Dr. Peyam! And it's much easier than the one they usually teach at my uni.
What a swift Taylor Theorem proof!
Wow I've never seen this proof. Thanks for sharing! :)
This is one of the most ingenious proofs I have seen in my life. Thank you very much. Also, can we continue taking integrals infinitely many times?
Great! Multivariable Taylor proof now!
To get the multivariable version with f(x), apply this video to g(t) = f(tx), where t is real, and set t = 1
@@drpeyam thanks!
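A sketch of that reduction, expanding around the origin and assuming f is smooth enough (my notation, using multi-indices α):

```latex
\begin{aligned}
g(t) &= f(tx), \qquad
g^{(k)}(t) = \sum_{|\alpha| = k} \frac{k!}{\alpha!}\,
             \partial^{\alpha} f(tx)\, x^{\alpha},\\
f(x) &= g(1)
      = \sum_{k=0}^{n} \frac{g^{(k)}(0)}{k!} + R_n
      = \sum_{|\alpha| \le n} \frac{\partial^{\alpha} f(0)}{\alpha!}\,
        x^{\alpha} + R_n ,
\end{aligned}
```

so applying the one-variable theorem to g on [0, 1] yields the multivariable expansion.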
Wow you are so right about that proof. Soooooo nice. Sincere thanks.
Dr. Peyam, is it assumed here that the function f is of class C^k, in connection with the JUNK?
Do we need anything besides induction on n and the observation that f is at least C^(n-1)?
Excellent. Thank you very much.
Instead of a power series: f(x) = a0 + a1 x + a2 x² + a3 x³ + ...
you can develop many functions as a kind of Taylor product: f(x) = a0 · a1^x · a2^(x²) · a3^(x³) · ...
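One way to make this concrete (my own sketch, not from the comment): if log f(x) has a power series Σ c_k x^k, then f(x) = Π_k (e^{c_k})^(x^k), so a_k = e^{c_k}. A quick numerical check for f(x) = 1/(1 − x), whose logarithm has the series Σ_{k≥1} x^k/k:

```python
import math

# "Taylor product" sketch: if log f(x) = sum_k c_k x^k, then
# f(x) = prod_k (e^{c_k})^(x^k), i.e. a_k = e^{c_k}.
# Example: f(x) = 1/(1 - x), whose log has series sum_{k>=1} x^k / k.

def taylor_product(x, n_terms=60):
    """Partial Taylor product for f(x) = 1/(1 - x), valid for |x| < 1."""
    result = 1.0  # a_0 = f(0) = 1
    for k in range(1, n_terms + 1):
        a_k = math.exp(1.0 / k)      # a_k = e^{c_k} with c_k = 1/k
        result *= a_k ** (x ** k)    # multiply in the k-th factor
    return result

print(taylor_product(0.5))  # ≈ 1/(1 - 0.5) = 2.0
```

The product converges for the same x as the log series does, since the partial products are just exponentials of the partial sums.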
The historical proof was based on Newton–Lagrange interpolation done at nearly the same points.
Who's the first guy in the meme from the thumbnail?
Damn, Ross Taylor in the thumbnail 😂😂
Taylor ..... New-Zealand :)))
Awesome!!!!
I don't follow the step where $\int_a^x f'(a)\,dt$ becomes $f'(a)\,(x-a)$. Why can we do that?
f’(a) is a constant (with respect to x), so just pull it out
@@drpeyam Ah, of course. Thanks!
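Spelled out, the step is just the constant-multiple rule for integrals:

```latex
\int_a^x f'(a)\,dt \;=\; f'(a)\int_a^x dt \;=\; f'(a)\,(x-a),
```

since f'(a) is a fixed number once a is fixed; the integration variable is t.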
enjoyed!
Taylor Theorem never worked for me. My junk is probably just too big.
Simply delicious!
Sir, just a kind request: while writing on the board, try not to cover it up.
I like your videos AF.
Thank you : )
Love from India. 😘
That is important. Also please stop and stand aside for a moment before erasing the board so we can pause the video at that point if necessary and review anything we have not yet understood.
But if you do it like this, then the constant M depends on how close x is to a. If you wanted it to hold for all x, M could go to infinity, right?
The constant M is bounded below by the maximum value of |f'''(x)| between x and a, you're correct. Choosing a more positive or negative x value might require you to increase M. If f'''(x) is unbounded, then yes, M could have to go to infinity for the "junk" bound to be valid for all x. This is in line with the expectation that using a lower-order approximation for a function can result in an unbounded error as you move away from the point you use to make your approximation, e.g. my error will go to ∞ as x → ∞ if I approximate a parabola with a line. (As a counterexample, your error will not go to ∞ if you approximate, say, a sine function with a constant function like y = 0 or y = 122.)
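A quick numerical illustration of both claims (my own sketch: the tangent line to x² at 0 is y = 0, and sin x is compared against the constant y = 0):

```python
import math

# Error of a low-order approximation far from the expansion point.
# 1) Approximating f(x) = x^2 by its tangent line at a = 0 (which is y = 0):
#    the error |x^2 - 0| grows without bound as x -> infinity.
# 2) Approximating sin(x) by the constant y = 0: the error stays <= 1.

def parabola_error(x):
    return abs(x**2 - 0.0)          # tangent line at 0 is y = 0

def sine_error(x):
    return abs(math.sin(x) - 0.0)   # constant approximation y = 0

for x in [1.0, 10.0, 100.0]:
    print(x, parabola_error(x), sine_error(x))
```

The parabola's error blows up as x grows, while the sine's error never exceeds 1, exactly as described above.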
Someday, I'll understand. :)
cool
You made me laugh 😂 when you said the junk…..
mind = blown
This looks like wavelet analysis: sums and differences.
Finally
“Junk is very small…” hmmm…
You don't pre-assume anything; you just go step by step and do it.
7:20 do you get it XD
Ross Taylor is always the 4th term
Right now this video has 110 likes and *ZERO* dislikes
You have to be darn good to pull that off in 2019
Yay!!! 😄
taytay series
Please solve the infinite product from x = 1 to infinity of cos(pi/(x+2))
Show that the integral of 1/(x^x) from 0 to 1 = 1.2....
You are the don.