What Lies Between a Function and Its Derivative? | Fractional Calculus

  • Published 27 Apr 2024
  • Can you take a derivative only partway? Is there any meaning to a "half-derivative"? Does such a concept even make sense? And if so, what do these fractional derivatives look like?
    Previous video about Cauchy's Formula for Repeated Integration:
    • How to do two (or more...
    A really nice video that derives the gamma function from scratch:
    • How to Take the Factor...
    =Chapters=
    0:00 - Interpolating between polynomials
    1:16 - What should half derivatives mean?
    3:56 - Deriving fractional integrals
    8:22 - Playing with fractional integrals
    9:12 - Deriving fractional derivatives
    13:53 - Fractional derivatives in action
    16:19 - Nonlocality
    17:54 - Interpreting fractional derivatives
    18:51 - Visualizing fractional integrals
    22:10 - My thoughts on fractional calculus
    23:10 - Derivative zoo
    ===============================
    MAIN SOURCES USED FOR THIS VIDEO
    Podlubny, Igor. Fractional Differential Equations: An Introduction to Fractional Derivatives, Fractional Differential Equations, to Methods of Their Solution and Some of Their Applications. Academic Press, 1999.
    Podlubny, Igor. "Geometric and physical interpretation of fractional integration and fractional differentiation." Fractional Calculus and Applied Analysis, vol. 5, no. 4, 2002, pp. 367-386.
    - (for the visualization trick for fractional integrals)
    Edmundo Capelas de Oliveira, José António Tenreiro Machado, "A Review of Definitions for Fractional Derivatives and Integral", Mathematical Problems in Engineering, vol. 2014, Article ID 238459, 6 pages, 2014. doi.org/10.1155/2014/238459
    - (for the zoo of alternative fractional derivatives)
    ===============================
    Minor correction: The footnote at 7:34 should say the trig substitution produces another whole factor of pi (not a root pi) in the numerator which then cancels the two root(pi)'s that appear in the denominator from applying the half integral formula twice.
    ===============================
    CREDITS
    This video uses the song "Rubix Cube" coming courtesy of Audionautix.com
    ===============================
    Want to support future videos? Become a patron at / morphocular
    Thank you for your support!
    ===============================
    The animations in this video were mostly made with a homemade Python library called "Morpho". If you want to play with it, you can find it here:
    github.com/morpho-matters/mor...
    ===============================
    This video is part of the 3Blue1Brown Summer of Math Exposition 2 (#SoME2). You can find out more about it here:
    summerofmathexposition.substa...

COMMENTS • 1.1K

  • @SurfinScientist
    @SurfinScientist 1 year ago +2645

    When I was a high-school kid I tried to derive fractional derivatives and integrals, but I didn't have sufficient knowledge to succeed at the time. I never thought about it later, even though I studied math at university. Until I saw this video. What an excitement you gave me by making it! Thanks a lot!

    • @pa.l.2499
      @pa.l.2499 1 year ago +94

      I first was exposed to the Riemann-Liouville Fractional Integral at The University of S&T while the others at a Chiefs Superbowl party kept track of the score on the other side of the room.
      Meanwhile a good man explained the content in this video on a scrap piece of paper when the subject arose concerning fractional derivatives during our corner math party.
      We also had some gifted people there explaining the Navier Stokes boundary conditions and later QFT, so I missed the majority of the game.
      I miss those days.

    • @danielchin1259
      @danielchin1259 1 year ago +16

      Never forget your good ol days

    • @achtsekundenfurz7876
      @achtsekundenfurz7876 1 year ago +34

      About the last part where order matters, I suspect that the dreaded "+c" of integration is to blame for that one.
      In school, somebody said, "Why not omit the c and only use it when you have to match data, e.g. need the one integral whose value at a certain X is Y" - But it turns out to be essential. For example, integrating 3(x+1)^2 gives (x+1)^3 + c, and integrating 3x^2 + 6x + 3 gives x^3 + 3x^2 + 3x + c, and the only way to get those to match is to realize that the c's are different. Without c, you'd get (x+1)^3 = x^3 + 3x^2 + 3x, which would imply 1 = 0 for x = 0.

    • @thatguyalex2835
      @thatguyalex2835 1 year ago +10

      I thought fractional derivatives were the average of the exponents and the coefficients. I was completely wrong... :) That's the beauty of math.
      Like (x^2 + x)/2 or something... Calculus proves me wrong.

    • @peterfireflylund
      @peterfireflylund 1 year ago +10

      I spent a lot of time in high school trying to come up with a good definition for negative and fractional factorials - and got absolutely nowhere. I was shocked to later learn they were used for fractional integrals and differentials.

  • @minimath5882
    @minimath5882 1 year ago +1137

    Half the derivative, twice the fun

  • @sharpnova2
    @sharpnova2 1 year ago +969

    your analogy between comparing fractional calculus to integer calculus and interpreting e^ipi as repeated multiplications of e is perfect

    • @thatguyalex2835
      @thatguyalex2835 1 year ago +34

      Sadly, useful analogies, Numberphile, 3Blue1Brown and this video are things they never teach in a college math class, or the calc class I was in back in 2018-19. :(
      Also, it is pretty rad that half derivatives are a thing. Wonder what all the use cases of fractional derivatives in physics are. :)

    • @maxthexpfarmer3957
      @maxthexpfarmer3957 1 year ago +17

      there aren't many uses
      yet

    • @naginchand2687
      @naginchand2687 1 year ago +5

      Power functions are stated to be one of the solutions of fractional differentiation.

    • @Gr3nadgr3gory
      @Gr3nadgr3gory 1 year ago

      If only it helped at all.

    • @alanbanh
      @alanbanh 1 year ago

      *integral calculus

  • @karan_jain
    @karan_jain 1 year ago +2175

    Wonderful video! Is it possible to take complex-valued derivatives? What would they mean?

    • @alex_bor
      @alex_bor 1 year ago +470

      I just know that I'm now going to spend way too much time trying this

    • @mujtabaalam5907
      @mujtabaalam5907 1 year ago +624

      Well the ith derivative, applied twice, would need to be equal to the integral

    • @alex_bor
      @alex_bor 1 year ago +359

      @@mujtabaalam5907 I honestly haven't gotten around to it yet but wouldn't applying it twice be the same as 2i?
      Same way as 1 derivative twice would just be the double derivative

    • @Devesteter252101
      @Devesteter252101 1 year ago +285

      @@alex_bor I don't think this winds up being an interesting question. The reason the question of fractional derivatives is good is that the rationals "lie between" the integers, which means we can ascribe to them the meaning of interpolating between integer derivatives in a way that respects composition (i.e. three repetitions of the 1/3rd derivative === a derivative).
      However, when we consider the question of imaginary derivatives, there's only one derivative that is pre-defined, that being the trivial 0-derivative (the identity). So the only properties we have that could pin down the meaning of the imaginary derivative is that
      1. The i m-th derivative followed by i n-th derivative equals the i (m + n)th derivative
      2. The i m-th derivative followed by the -i n-th derivative equals the original function
      The problem is neither of these properties are sufficiently specific to come close to uniquely defining an imaginary derivative - there are infinitely many operators that would satisfy these properties, including the standard fractional derivative! This lack of specificity is because the complex numbers don't serve as an extension of the integers in the gap-filling way that the rationals do. Another ugly corollary of this is that a trivial imaginary derivative that leaves the function unchanged would also satisfy the desired properties for an imaginary derivative.
      tl;dr: It is meaningless

    • @MrAlRats
      @MrAlRats 1 year ago +64

      Yes. It's possible to extend the domain of the powers of the differintegral operator to the entire Complex plane. Each power of the differintegral operator should be thought of as a particular means of transforming one function into another function in such a way that the transformation is similar to that produced by nearby values of that power.

  • @rarebeeph1783
    @rarebeeph1783 1 year ago +386

    At 7:34, really the sqrt(pi) on the inside and the outside combine into a full pi in the denominator, which then would presumably cancel with a pi in the numerator generated by a trig substitution required to handle (t-x)^(-1/2). Trig subs love to happen when you have simple square roots of the integration variable in the denominator, and where there's trig, there's pi.

    • @morphocular
      @morphocular 1 year ago +137

      Yep, I think you're right. My bad! When I chose to include that footnote, I think I glanced a little too quickly thru the solution to that integral and also forgot that there was already another factor of 1/root(pi) outside the integral from the previous half integral step. Good catch!

    • @valinorean4816
      @valinorean4816 1 year ago +2

      @@morphocular What about the finite difference calculus? Is there a square root of the unit delta operator?

    • @toddmatteson183
      @toddmatteson183 1 year ago +18

      This got me thinking about 3b1b's observation that, whenever choose(tau, pi) shows up unexpectedly, there's a circle hiding in there somewhere. I wonder what a geometric interpretation of a half-derivative might look like, and where the circle comes into play.

    • @123_king_me9
      @123_king_me9 1 year ago +12

      @@toddmatteson183 the pi that comes from trig substitution is simple enough. The connection between trig and circles is well known.
      I’m not familiar with the circle intuition behind gamma(1/2) = sqrt(pi) but I’m sure there is a pretty explanation.
      What gets me though is that trig substitution has to be used in these integrals. It just seemed to me like all the methods of solving integrals were just a toolbox that you had to figure out what tools to use for every problem. The fact that this meaningful class of integrals all happen to use trig sub in a way that happens to bring a pi out is crazy to me.

    • @shoam2103
      @shoam2103 1 year ago +1

      @@toddmatteson183 alternatively, there are also different alternatives to the Gamma function, which can give different constants

  • @siquod
    @siquod 1 year ago +210

    Here's a signal processing perspective: The derivative operator is a linear filter with frequency response given by the identity function. To find the half derivative, simply use a filter whose frequency response is the square root function. Of course, the tricky part is to define Fourier transforms of arbitrary functions in a meaningful way so one can apply the frequency response. I guess one can use windowed versions of the functions, then let the window width go towards infinity.
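
    A minimal numpy sketch of this filtering idea (an added illustration, not the commenter's code; it implements the Fourier flavor of the fractional derivative for a periodic signal, and the name spectral_derivative is just illustrative):

        import numpy as np

        def spectral_derivative(samples, p, period=2 * np.pi):
            """Order-p derivative of a periodic, uniformly sampled function via the FFT."""
            n = len(samples)
            k = 2 * np.pi * np.fft.fftfreq(n, d=period / n)   # angular frequencies
            response = (1j * k) ** p                          # frequency response (i*omega)^p
            return np.fft.ifft(response * np.fft.fft(samples)).real

        x = np.linspace(0, 2 * np.pi, 512, endpoint=False)
        f = np.sin(3 * x)
        half = spectral_derivative(f, 0.5)
        # essentially zero: two half-derivatives reproduce the ordinary derivative 3*cos(3x)
        print(np.max(np.abs(spectral_derivative(half, 0.5) - 3 * np.cos(3 * x))))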

    • @yitzakIr
      @yitzakIr 1 year ago +3

      It looked like dampening to me too

    • @maxdominate2481
      @maxdominate2481 1 year ago

      Aren't there existing Fourier transforms "off the shelf" that could be used?

    • @f1uc1k1y1o1u
      @f1uc1k1y1o1u 1 year ago +3

      this possibly sounds like a use case for the fractional fourier transform...
      If there were some connection between non integer fourier transforms and non integer derivatives
      my current spectral analysis understanding isn't strong enough for this, but it can be after studying math some more, hopefully

    • @zokalyx
      @zokalyx 1 year ago +1

      the frequency response of a derivative is to multiply by iw (frequency times imaginary unit). It is a linear function of w.
      The square root thing seems correct. Using sqrt(iw) = 1/sqrt(2) (1 + i) sqrt(w) would work as a half derivative, at least in theory

    • @zokalyx
      @zokalyx 1 year ago +1

      @@maxdominate2481 not really when you are using functions like x, x^2, etc, since their transforms diverge

  • @sekrasoft
    @sekrasoft 1 year ago +12

    I'm happy the algorithm recommended this awesome video. It's like discovering another dimension. It's that moment you realize something absolutely new and your brain celebrates it like a new birthday.

  • @quintium1
    @quintium1 1 year ago +87

    2:54 Here, I found the formula c(a) = a!/(a-½)! for the coefficient, where a is the exponent and the factorial is expressed in terms of the Gamma function: a! = Г(a+1). Also it can be extended by replacing ½ with any fraction or number you want.
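
    A quick numerical check of this coefficient (an added sketch using scipy, not part of the original comment): with c(a) = Γ(a+1)/Γ(a+1/2), the half derivative of x^a is c(a)·x^(a-1/2), and applying it twice should give back the ordinary derivative a·x^(a-1).

        from scipy.special import gamma

        def half_derivative_power(a):
            """(coefficient, exponent) of the half derivative of x**a."""
            return gamma(a + 1) / gamma(a + 0.5), a - 0.5

        c1, e1 = half_derivative_power(2.0)   # first half step on x^2
        c2, e2 = half_derivative_power(e1)    # second half step
        print(c1 * c2, e2)                    # 2.0 1.0  ->  d/dx x^2 = 2x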

    • @anonymous_4276
      @anonymous_4276 1 year ago +1

      Yep

    • @CoconudHotpocket
      @CoconudHotpocket 1 year ago +5

      Sadly it doesn't yield the same results as the derivegral shown in the video for taking fractional derivatives close to order 1 of f(x)=x. I know this because I tried it (after deriving it myself before even seeing this video)

    • @SOTminecraft
      @SOTminecraft 1 year ago +6

      Exactly! I was wondering why he didn't mention it. The formula n!/(n-k)! fails for n=k, so there's the question of what to do with those cases. Because of linearity, one simply has to consider derivatives of monomials to differentiate analytic functions. From what I've seen it seems we obtain the same results as he does with the lower bound at 0, but I haven't checked much

    • @SOTminecraft
      @SOTminecraft 1 year ago +3

      @@CoconudHotpocket yes, the formula n!/(n-k)! does not work for n=k when k is an integer, so there is no solid ground to justify using it when k>n-1. I've obtained the same computations as him otherwise. It even works with negative values

    • @quintium1
      @quintium1 1 year ago +7

      @@SOTminecraft Are you sure? If you derive x^1 with k=1, the result is 1! / (1-1)! * x^0 = 1; which is indeed the derivative of x. Even with n=0 it kinda works, since 0! = 1 and (-1)! = infinity, so 0! / (-1)! = 0.
      The formula breaks with negative whole numbers though because the Gamma function returns infinities at those points and infinity/infinity is undefined. Weirdly enough, negative fractions actually work.

  • @inverse_of_zero
    @inverse_of_zero 1 year ago +142

    first time viewer here. this video is incredible. thank you so much! despite having done three mathematics degrees, i never learnt or used fractional calculus. this video is such a beautiful summary. i would donate if there was a "Thanks" button. my only feedback is when you make little aside notes in the corner, please just keep them on the screen a couple of seconds longer. i will be following your channel closely, i hope it gets captured by the youtube algorithm!

    • @dineshkumarv9493
      @dineshkumarv9493 1 year ago +7

      Me too 🙋🏼‍♂️ first time viewer and I loved his video

    • @XnoobSpeakable
      @XnoobSpeakable 11 months ago +1

      There IS a thanks button

    • @davidpugh2374
      @davidpugh2374 6 months ago

      25:27 Also a first time viewer to this channel and I can only agree with the previous comment. Amazing video!!! Please continue, you’re doing great work.

  • @kodfkdleepd2876
    @kodfkdleepd2876 1 year ago +81

    The derivative and integral operators can be seen as smoothing and non-linear frequency scaling. Take the FT of your derivegral and you will essentially get a spectrum modification. For an integer parameter the frequency "lines" up, and so constructive and destructive interference can properly take place, canceling all the non-linearity.
    That is, you have what is essentially a convolution and then a derivative, and when you take the FT of such a thing you end up with a power-scaling relationship w^(p-a) modifying the original spectrum.
    The point here is that the non-local behavior is due to the process actually working in the frequency domain, and it just simplifies for the integer case. The interpolation requires certain constraints at integer values so they line up with our traditional usage... hence the derivegral is a generalization that simplifies to our basic operators. It's just one form of interpolation, as there can be no absolute generalization since any generalization can work. Hence the "interpretation" of some fractional derivative is going to simply be the specific mechanism by which the transform was designed.

    • @a.osethkin55
      @a.osethkin55 1 year ago

      Cool

    • @kodfkdleepd2876
      @kodfkdleepd2876 1 year ago +8

      @@a.osethkin55 Hopefully it is clear enough. I didn't explain it well, but it is way more obvious if you just take the Fourier transform of the derivegral and realize it's nothing special: we just end up with a sort of frequency modulation that happens to simplify for integer values of the parameter because of their relationship to frequency (analogous to 2*pi*k for integer k, which simplifies (in the sinusoids) to 0, but for other k, especially irrational, it never simplifies).
      I'm too lazy to work it all out, but he could make a video doing it and showing what is really going on. By doing it one can then actually see why the resulting functions seem to have a lot of sinusoidal components (as these components don't vanish unless the parameter is an integer). It's sorta like having a component that goes to zero only if k is an integer (e.g., f(x) + sin(2*pi*k)*g(x)) and being "confused" when we generalize k to be a non-integer, because we end up with this extra stuff.

    • @bohanxu6125
      @bohanxu6125 1 year ago

      Is there any more general reason why the derivative's modulation in the frequency domain is local in real space?.... besides the specific reason that the derivative itself is local.
      By the way, this fractional derivative based on the FT has to be "hacky"... in the sense that it has probably swept some relevant details under the rug (details like a fractional power of a linear operator having a branch cut, or a different boundary condition on the Hilbert space). Otherwise, there would be a unique way of defining the fractional derivative, which is not true according to the video.

    • @kodfkdleepd2876
      @kodfkdleepd2876 1 year ago +3

      @@bohanxu6125 It's precisely due to the linearity required by the differential operator. For example, while any continuous curve can fit a given set of discrete points, there is only one curve that minimizes the distance and is differentiable (in L2; without differentiability a spline would be optimal).
      So while it might seem hacky in some sense because it seems somewhat arbitrary, the "optimal" situation is generally what gives us some type of "natural" solution.
      So first of all, it is clear that any generalization opens up multiple possibilities. That is sort of the point of a generalization. Second, the means by which we generalize will have a direct influence on the outcome. Both of these should be obvious but it should be kept in mind in problems like these. Third, and maybe less obvious, is that even though some generalizations may not exhibit certain behavior, that behavior none-the-less underlay and control the behavior.
      E.g., Forcing zeros of polynomials to exist in all cases gives us the "complex numbers". That is, without any conception of anything but real numbers and polynomials over them, requiring solutions to all such polynomials will result in the complex field(obviously that is precisely how it came about). So in this case the point is that the complex numbers underlay the real polynomials even if we don't realize it and these complex numbers are simply a certain type of structure that must exist for it all to work out(for it to be a generalization but also for the reals to be a specification).
      The method in the video of generalization the derivative is to use the integral and the machinery of the integral then imposes it's will on the outcome. In fact, the integral he used is explicitly a convolution and, as you know, that is directly related to Fourier integrals.
      Another approach is to use the Cauchy integral but if you notice this is pretty much the exact same method used in the video except it is explicitly in the complex numbers and it makes it more obvious about the integer dependence. 1/(z-a)^n, when n is an integer this factor is distinctly different from when it is not in the sense that it is a single term in a Laurent series and since the Laurent series is essentially a Fourier series we effectively have a one term series expansion. When n is not an integer we end up with an infinite Fourier series and so will have many sinusoidal components.
      Basically no matter how you end up looking at it, if you want the derivative to behave as expected there will be certain requirements on its generalization. That is, the generalization can't be arbitrary, in the same sense that generalizing the reals to support zeros cannot be arbitrary. This should make sense in that we can't just have a new free parameter available: e.g., x^2 + 1 = 0 forces whatever solution for x to have certain properties... hence the generalization of R cannot be arbitrary. In an analogous sense (actually it's very much connected) generalizing D cannot be arbitrary and we are forced to abide by certain constraints.
      Is there far more going on? Of course. Fourier transforms themselves have a lot more going on under the hood. The more one digs the more one will find. I wouldn't call it "hacky" though. It's more like chopping down weeds and tree's in a forest and discovering a lost path that is in ruin but clearly existed in the past.
      One could probably frame it in terms of branch cuts in that one has a Riemann surface like sqrt(z) where for R+ the surface sheets all intersect. e.g., basically for z^(e) as e ranges from 0 to 1 produces a surface in which for 0 and 1 it collapses to 1 sheet but for various others it has multiple sheets. This goes back to the Cauchy integral perspective.

    • @hopfenhelikopter4531
      @hopfenhelikopter4531 1 month ago

      ​@@kodfkdleepd2876wow you really have a great grasp of math! I would actually enjoy reading more paragraphs like those :O

  • @mohithraju2629
    @mohithraju2629 1 year ago +42

    Wow, this felt like my graduate level math class wherein we define stuff that we don't really understand and point out the different strange properties it satisfies.

  • @josephrissler9847
    @josephrissler9847 1 year ago +246

    The options for fractional derivatives remind me of Euclid's 5th axiom of geometry. Euclid hated that he had to explicitly state that parallel lines never intersect, but what he didn't realize is that this was required to differentiate flat-plane geometry from hyperbolic and elliptic geometries. Had mathematicians discovered a system isomorphic (is that the right word?) to his first 4 axioms but not in the context of geometry, then we would similarly find ourselves with multiple options for extending the theory.

    • @ga35am
      @ga35am 1 year ago +3

      Is there evidence about Euclid's hatred that you mentioned?

    • @josephrissler9847
      @josephrissler9847 1 year ago +31

      @@ga35am I read the anecdote some time ago, so sadly I have no source to cite. "Hated" might be the wrong word to use. I believe it was stated that he tried to derive his 5th axiom as a theorem implied by the first 4. Of course, he was unable to do so, because the 5th axiom is an example of an undecidable statement.

    • @createyourownfuture5410
      @createyourownfuture5410 1 year ago

      Can you please explain that in simple words?

    • @josephrissler9847
      @josephrissler9847 1 year ago +17

      @@createyourownfuture5410 Euclid was trying to write down the rules for geometry. One of his rules says that non-parallel lines intersect. He and others thought this rule should follow naturally from applying his other rules, but no one could prove it. It turns out this is only true for geometry on a flat plane. If you do this on something shaped like a saddle, for example, those lines might not intersect, and on a sphere, they will always intersect, even if parallel. Changing out that rule gives you a set of rules for different types of geometry.

    • @createyourownfuture5410
      @createyourownfuture5410 1 year ago

      @@josephrissler9847 buuuuut... The latitudes of the earth don't intersect?

  • @Mutual_Information
    @Mutual_Information 1 year ago +21

    Wow, this is beautiful. Really demonstrates how mathematicians are able to extend concepts beyond their original domain.. also shows how doing so can eliminate structure. Here we see the geometric meaning (kind of) disappears..

    • @VIue_
      @VIue_ 1 year ago +4

      I think this is because we like to define these things within their structures and then explore what those definitions mean. So when we start to define a concept past its original domain, the structure that was used to formulate the decision becomes vestigial. Like the multiplication example they gave.

    • @Mutual_Information
      @Mutual_Information 1 year ago +2

      @@VIue_ I agree - before I knew anything about mathematics.. it's not something I would have expected. But it seems to be a very common maneuver

    • @kindlin
      @kindlin 1 year ago +3

      @@Mutual_Information
      I think this is all a sign that, similar to what was talked about near the end of the video, we're missing some kind of higher-order understanding that would actually show us that the common integrals and derivatives we're used to are really just special cases of some larger symmetry that we have yet to discover.

  • @nerdsgalore5223
    @nerdsgalore5223 1 year ago +37

    This channel is super underrated! Your animations are beautiful and the narration is incredibly clear!

    • @justanotherguy469
      @justanotherguy469 1 year ago +3

      I just found this and I agree totally with what you are saying. Love finding stuff like this, it's like Christmas.

  • @magicianky
    @magicianky 1 year ago +47

    I remember asking my AP calculus teacher the same question, then my calc 2 professor as an undergrad. Three years later as a side project in my second semester of real analysis, I dug into it and even wrote a paper. Loved this topic.

    • @orang1921
      @orang1921 1 year ago

      If you took AP Calculus, why did you have Calculus II? I thought AP Calculus was equal to Calc I and Calc II

    • @magicianky
      @magicianky 1 year ago +1

      @@orang1921 there are two AP calculus courses. AB covers up through applications of integration and BC covers integration techniques through infinite series and vectors, at least that’s how it was in the 90’s. So basically AB = Calc 1 and BC = Calc 2 depending on university.

    • @orang1921
      @orang1921 1 year ago

      @@magicianky Yeah, I thought AP Calculus AB still counted as up to Calculus II and BC was a bit of III

  • @DavidRTribble
    @DavidRTribble 1 year ago +11

    It's been years since I actually did any integration (to help my son in high school calculus), but your explanation was very lucid and easy to follow. Makes me think that I've still got some pretty good math chops.

  • @cbbuntz
    @cbbuntz 1 year ago +31

    Fractional derivatives are a lot easier to deal with in the fourier transform since any nth derivative is just a multiplication by a (w*i)^n term and a constant. Something I stumbled on myself is you can get similar results if you (matrix) transform a power series to fit the power series (plural) of e^(x*n). It's much more simple to just work with a fourier series, but I could post the matlab code if anybody wants to see what I'm talking about.

  • @alessandrocattapan
    @alessandrocattapan 1 year ago +5

    I watched the whole video twice today. I just feel the need to say THANK YOU! This is at the same time beautiful, mysterious, fascinating and educational. Great value of my time. Please keep going!

  • @crimfan
    @crimfan 1 year ago +6

    What a fantastic video and new math channel.
    I taught fractional finite difference operators in a time series analysis class and one of the things that the text mentioned was exactly the "no interpretation" issue. They said "there's no good interpretation to this, but it models long memory processes well, so... there it is."

  • @ariaden
    @ariaden 1 year ago +62

    Two questions:
    1. What does an infinitesimal derivative look like (when the fraction is close to zero)?
    2. Which definition of fractional derivative plays nicely with the Fourier transform?

    • @a.osethkin55
      @a.osethkin55 1 year ago +4

      Think, QFT (quick Fourier transform) answers for it

    • @copperspike
      @copperspike 1 year ago +28

      oh fuck, are we going to integrate over infinitesimal derivatives to get the total change through the process or some shit?

    • @cmilkau
      @cmilkau 1 year ago +2

      About question two: the lower bound of negative infinity would likely play nice with the Fourier transform, because I think that makes the fractional derivative of sin x just sin(x+πp/2).

    • @alpers.2123
      @alpers.2123 1 year ago +1

      Take the limit of the differintegral formula over the order parameter p.

    • @alpers.2123
      @alpers.2123 1 year ago

      Another question is what the derivative of the derivative function is. You can ask this question recursively ad infinitum as well

  • @Noissimsarm
    @Noissimsarm 1 year ago +33

    I loved this video, I want to learn more math!

  • @carstenmeyer7786
    @carstenmeyer7786 1 year ago +13

    7:36 You can normalize the integral via *x := t/2 * (u + 1)* to get rid of *t* and obtain a symmetric integration domain. The remaining integral to solve is
    *\int_{-1}^1 (1 + u^2) / \sqrt{1 - u^2} du = 3𝛑 / 2*
    A second substitution *u := sin(v)* will yield the result.
    *Rem.:* Thank you very much for this introduction to fractional analysis! It's amazing how similar the concept is to the extension of the derivative to generalized functions (aka _Schwartz' Distributions_ ).
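
    A quick sympy verification of that value (added as a check, not part of the original comment):

        import sympy as sp

        u, v = sp.symbols('u v')
        print(sp.integrate((1 + u**2) / sp.sqrt(1 - u**2), (u, -1, 1)))     # 3*pi/2
        # after the substitution u = sin(v) the integrand becomes 1 + sin(v)**2:
        print(sp.integrate(1 + sp.sin(v)**2, (v, -sp.pi / 2, sp.pi / 2)))   # 3*pi/2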

  • @bastienmassion299
    @bastienmassion299 1 year ago +66

    Super cool video, I'd love to learn more about it! In one of my courses, I learned about fractional PDE's for diffusion, and it blew my mind. Can't wait for the next videos!

    • @valinorean4816
      @valinorean4816 1 year ago +4

      so you're saying there is an application? where can i read about it?

    • @bastienmassion299
      @bastienmassion299 1 year ago +12

      My teacher Emmanuel Hanert told me about his paper on pollen dispersion with Lévy flights, in order to predict the probability that transgenic organisms (like GMOs) contaminate some non-transgenic organisms via bee pollination, as a function of the distance between the two populations. In addition to the fractional diffusion PDE part, it contains stochastic terms with so-called Lévy flights, so it is quite heavy mathematically speaking. The paper is called: "A Lévy-flight diffusion model to predict transgenic pollen dispersal" by Vallaeys et al.

    • @romeomatei5781
      @romeomatei5781 1 year ago +2

      The "n" order derivative formula is calculated following the geometric definition procedure, for the first derivative as the limit of the ratio:
      lim (f(x)-f(xo))/(x-xo), for x-->xo.
      Then it derives "n" times after the same procedure!
      A recurrence formula is found depending on "n", a positive integer.
      Through Mathematical Induction, it is established whether the equality is true or not. If it is true for (n+1) => it is also true for "n", where n is no. entirely positive.
      QUESTION :
      Why is the value 1/2 assigned for n at the end [n=1/2]?
      Why is that right?
      What is the meaning of this "derivative" of order 1/2?
      Please argue. Provide a correct answer. Not just formal calculation. OK?
      Eng. Matei Romeo Marian,
      27.08.2022

    • @JaGWiREE
      @JaGWiREE 1 year ago +3

      Same here, from Eli Barkai's non-equilibrium stat physics course. Would love a video where someone explains the memory kernel and goes over the question of physical consistency for the various systems fractional models have been applied to.

  • @tobiasbergkvist4520
    @tobiasbergkvist4520 1 year ago +87

    Crazy idea that would be interesting to visualize: d^x f(x)/dx^x
    Essentially: Let the number of derivatives you take vary continuously with the input variable to the function.
    Could you use fractional derivatives to make something similar to taylor series? Would this converge more quickly or more slowly for the same number of terms?
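
    One way to actually picture d^x f(x)/dx^x numerically (an added sketch; it uses the Gruenwald-Letnikov finite-difference definition rather than the video's Riemann-Liouville integral, and the step sizes are only illustrative):

        import numpy as np
        from scipy.special import binom

        def gl_derivative(f, x, p, h=0.01, n_terms=400):
            """Rough Gruenwald-Letnikov estimate of the order-p derivative of f at x."""
            k = np.arange(n_terms)
            weights = (-1.0) ** k * binom(p, k)      # generalized binomial coefficients
            return np.sum(weights * f(x - k * h)) / h ** p

        xs = np.linspace(0.1, 1.5, 8)
        print([round(gl_derivative(np.sin, x, p=x), 3) for x in xs])  # order p varies with x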

    • @natediamond6312
      @natediamond6312 1 year ago +19

      I also considered it'd be really cool to see 3D contours where the Z dimension is the degree of the derivegral. I may try and do that.

    • @giansieger8687
      @giansieger8687 1 year ago +20

      based on the fact that these functions are non-local I‘m inclined to say that it wouldn‘t work. The whole point of the taylor series is that each power can be viewed somewhat isolated when you plug in the development-point (idk what to call it in english but the „normal“ case would be p=0) of any given derivative. Due to the fact that the half derivative would also include information of the whole domain in ways that I couldn‘t predict, I‘d say it‘s gonna spiral out of control. This could also mean that it converges much more quickly. I don‘t know at all which is correct, just my 2 cents.

    • @Double-Negative
      @Double-Negative 1 year ago +11

      Yes, it is possible to use fractional derivatives to create a taylor-esque series (I tried it a long time ago) but it’s incredibly unintuitive to actually solve for the coefficients

    • @udhavvarma7097
      @udhavvarma7097 1 year ago +3

      That is the most evil thing i have seen in my lifetime.

    • @thegamerfromjuipiter7545
      @thegamerfromjuipiter7545 1 year ago

      @@giansieger8687 I’m not very far along in mathematics (in fact I haven’t studied Taylor series yet) but I believe the term you’re looking for in English is the center of the series. That’s what we’ve called it for power series in my calculus II class

  • @juliogodel
    @juliogodel 1 year ago +4

    This is an awesome intro to fractional calculus! I would heartily recommend turning this into a series of videos (as you already hinted at future topics). Please continue.

  • @ACIDVENOM2501
    @ACIDVENOM2501 1 year ago +4

    by far the best video I saw regarding fractional derivatives, integrals and differintegrals, I mean derivagrals (I've seen only one before this) and I'd be eager to see the next episode (if it would come)
    thanks for the good work! 👍

  • @PeterBarnes2
    @PeterBarnes2 1 year ago +7

    I first learned about fractional derivatives watching Dr. Peyam's videos a few years ago. Since then I've been working on applying more than just square roots to differential operators. From what I've seen, they're just a really bad class of functions to do this with, because all the fractional monomials have these branches and asymptotes, it's all very messy.
    Exponential function of the derivative operator? Classic, that's your shift operator. Reciprocal of a linear function of the derivative? Generalized Laplace transform! It all works surprisingly well. Using just the shift operator definition and regular derivatives, then some fairly predictable rules for how to translate things into and out of integrals, you can work up a healthy repertoire of functions applied to derivatives.
    My crown jewel so far in all of this was uncovering a super secret identity! It's sort of like the mother of all generalizations of the product rule, way beyond the generalized Leibniz rule.
    As symbols,
    [f(D_x)] (g(x) * y(x)) =
    [ [g(D_z + s)]_{z=D_x} (f(z)) ]_{s=x} y(x)
    In words, for any function-of-derivative 'f' taken of a product of functions, one of those functions may be taken out, and applied as a function-of-derivative _of_ 'f.' There is the quirk in that, given away by the use of a dummy variable 's;' this is to ensure the operator is well-defined, because you really shouldn't mix the variable you differentiate with respect to in the operator itself, as it may become unclear with these highly non-linear function-of-derivative operators what's differentiated when. Technically there would be no notational problem, and I've over-notated the issue especially by using square brackets to denote these function-of-derivative operators, but alas.
    Within these studies, I've come closer and closer to difficult problems of little importance, as well as stumbled upon the triviality of finding an operator whose eigenvalues are the zeros of the Riemann Zeta function. Alas, I do not have the knowledge to determine or construct a space for such an operator to also be self-adjoint. It's not an easy topic to just dive into, especially without having taken a class beyond differential equations.
    One of those curious problems of little importance is finding a non-trivial differential (that is, a function-of-derivative) equation whose solutions include the gamma function. I've actually gotten really close! The equation:
    [e^e^-D_x]y(x) = 0
    should have the gamma function as a solution. I devised this using that above identity, specialized for a particular case (I've forgotten the original derivation of how to do this) of an operator where somehow you multiply by the independent variable.
    You see, this is actually impossible, because all of these 'function-of-derivative' operators (perhaps excluding peculiar cases of non-meromorphic functions, see what I said before?) are completely linear. They commute with each other. This is not conducive to having an operator where you put in a function of and get out x times that function. The problem is that this operator does not commute with the derivative:
    x * [D_x] f(x) = xf'(x)
    [D_x] (xf(x)) = xf'(x) + f(x)
    This is one of the reasons I don't like mixing the independent variable in the function-of-derivative operator, because it breaks commutativity.
    But I just said I did that. How? -Magic, basically.- I cheat by using a strange trick (which, as I mentioned before, I don't remember off-hand, but iirc it is based on that identity I presented at the beginning of this comment) where, given an operator whose 0-eigenfunction (eigenfunction of eigenvalue 0) is 'f,' I can find a new operator whose 0-eigenfunction is x*f.
    By arranging the terms right, the logical next step was to try and apply this to the functional equation for the Gamma function. I forget exactly, but iirc this should yield [e^e^-D_x]. This _almost_ works! There's a very non-rigorous but highly conclusive way of getting from this operator to the gamma function using a certain method (solving these kinds of DEs is often easy because it's just a sum of exponentials whose exponential coefficients are the zeroes of the operator (A method you may be familiar with for LDEs, which can be proven to generalize as such). The trouble is that e^e^-z has no (finite) zeroes, so you have to use a really tedious method that uses integration of a manipulation of what would normally be called the characteristic function, but is identically here the function that the derivative operator is taken of, and it's a hassle.) Except, it's not this operator, it's its evil twin that's out by a sign error. I don't remember exactly how it goes, but you can see it work flawlessly for that evil twin, and diverge for the one you arrive at correctly for that method I mentioned earlier. This frustrating issue has had me scour every step of what I've explained for a sign error, to no avail.
    If you don't want to work through all of this stuff that I have (inadequately) explained to see that the method really does _almost_ work, simply take the integral definition of the gamma function, and do a particular u-substitution, I think it might be u=e^-t, or maybe it was t=e^-u, and watch as you get this peculiar double exponential appearing. This is exactly what you'd expect for a function the solution to this function-of-derivative equation, and it is what you get by applying this integral of characteristic function method to that evil twin operator.
    On the other side of things, trying to evaluate these extremely strange function-of-derivative operators is hardly possible. Actually, it's fairly straightforward to do it for any function-of-derivative where the function has a definite integral representation, by which I actually mean where the independent variable isn't in the bounds. (The variable being in the bounds would obviously be pointless, because you replace the variable with the differential operator, and I have absolutely no definition for an "integral from 3 to the derivative operator"!) Many functions have such a representation, like the Gamma Function (a coincidence from earlier, that has no application as yet to the earlier problem) to take Gamma(D_x), the Riemann Zeta function to take Zeta(D_x) (this is related to but not sufficient for what I mentioned earlier about an operator whose eigen_values_ are the zeroes of this function), and indeed 1/(s-D_x) is equivalent to the laplace transform, but with the original independent variable still there.
    To give you a taste of how most of this works, I'll derive that last one, because it's a lot of fun:
    Notice that the integral from 0 to infinity of a negative exponential is the negative reciprocal of it's exponential coefficient, that is:
    int_{0, inf} e^-zt dt = 1/z
    We can use this as an integral definition for the function 1/z. If we alter this nifty function 1/z, we can get the rather versatile function 1/(s-z). Looking back this yields:
    int_{0, inf} e^(z-s)t dt = 1/(s-z)
    Replacing z with a differential operator D_x, we get
    [ int_{0, inf} e^(D_x-s)t dt ] = [ 1/(s-D_x) ]
    which is a perfectly typical construction. Notice the square brackets [ ] on the outside of the integral, which denotes that the integral is taken first, then function-of-differentiation is evaluated. We can relatively freely move those brackets inside for use as a definition for the simple reason that I've not bothered to make rigorous when you can't do that beyond just "whenever it would be okay for a perfectly linear operator." When we do this, we should separate the exponential terms out to get a better look:
    int_{0, inf} e^-st * [e^tD_x] dt = [ 1/(s-D_x) ]
    Now, as a function-of-derivative operator, I would leave it here, but to see why this is a generalized laplace transform, we should test it out on an arbitrary function 'f.'
    [1/(s-D_x)] (f(x))
    = int_{0, inf} e^-st * [e^tD_x]f(x) dt
    The coolest part of all this study is how commonplace the fact is that the exponential function of a derivative is actually the shift operator, which otherwise is relegated to the characteristic functions in the niche subject of Delay Differential Equations.
    = int_{0, inf} e^-st * f(x+t) dt
    This is our familiar Laplace Transform. Except... isn't it supposed to be just f(t), not f(x+t)? Hehehe, indeed. Quirky, eh?
    (You can show this works as a definition by applying [s-D_x] to one of these generalized Laplace Transforms on your favorite functions. It's cool! And has effective but disappointingly limited and tedious applications to solving typical constant-coefficient LDEs.
    I hope this small youtube comment made some maths enthusiasts a little more intrigued in the peculiar side of calculus.
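
    A small sympy check of the generalized-Laplace-transform identity sketched above (an added verification with f = sin, not part of the original comment): if L(x) = int_0^oo exp(-s*t) f(x+t) dt, then applying (s - d/dx) should return f(x).

        import sympy as sp

        x, t = sp.symbols('x t', real=True)
        s = sp.symbols('s', positive=True)

        L = sp.integrate(sp.exp(-s * t) * sp.sin(x + t), (t, 0, sp.oo))
        print(sp.simplify(L))                      # (s*sin(x) + cos(x))/(s**2 + 1)
        print(sp.simplify(s * L - sp.diff(L, x)))  # sin(x), i.e. the original function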

    • @alpers.2123
      @alpers.2123 1 year ago +1

      Too long, didn't read, but as far as I know the study of functions of operators, such as the square root of an operator, is called functional calculus. So the half derivative is a square root of the derivative operator. Other functions of differentiation can be defined as well. I think the exponential of the derivative you were talking about is something related to functional calculus

    • @PeterBarnes2
      @PeterBarnes2 1 year ago +2

      @@alpers.2123 Yes, I've had a cursory look at spectral calculus, though I hadn't heard of functional calculus. This would appear to be the holomorphic calculus on derivative operators.
      What I commented about and have been working on is essentially the applications of this idea to complex analysis, for example that identity that a function the reciprocal of a linear term evaluated at the derivative is a generalization of the Laplace transform, which I demonstrated at the end of my comment.
      I've recently found, by applying that identity I explain early in my original comment, that one can take arbitrary fractional derivatives of some functions by reversing the roles:
      [D_b^p] f(b) = lim{x->0} [f(D_x)] (x^p * e^bx)
      Using this, it's almost trivial to show, using the complex definition of sine, that its fractional derivative is as:
      [D_b^p] sin(b) = sin(b + p(pi/2))
      Which holds up on integers clearly, while also presenting a more pleasing form of the fractional derivative than that given in the video, as all the fractional derivatives are still sine functions: no funky business around the "lower bound," as there is no such bound!
      Unfortunately, this definition of the fractional derivative both: requires defining the function of a derivative, and simply diverges for fractional derivatives of monomials (and thus polynomials, as it is linear). The first requirement, though, is easily satisfied for sums of exponentials (as evident by the example of sine), and consequently definite integrals of exponentials ('definite' being more restricted as per outlined in my original comment).

  • @troppapolvere
    @troppapolvere 1 year ago +4

    This lesson was extremely interesting and the realization is beautiful! A big thank you for making it!

  • @newsgo1876
    @newsgo1876 1 year ago +3

    Though I don't fully understand the whole video yet, the last sentence is enlightening: "...a deeper appreciation for the clever techniques mathematicians use to extend concepts to domains where they at first don't seem applicable, and the fascinating things that can result ---- a process that is very much a part of the spirit of modern math."

  • @ericvosselmans5657
    @ericvosselmans5657 1 year ago +9

    What amazes me most of all about fractional derivatives is that they have actual real world applications in physics.

    • @JustaReadingguy
      @JustaReadingguy 1 year ago +2

      Good grief. Please cite some examples.

    • @Alexander-jh4ek
      @Alexander-jh4ek 1 year ago +1

      Such as?

    • @ericvosselmans5657
      @ericvosselmans5657 1 year ago +2

      @@Alexander-jh4ek @JustaReadingguy Citing from Wikipedia on fractional calculus:
      Fractional derivatives are used to model viscoelastic damping in certain types of materials like polymers.
      The propagation of acoustical waves in complex media, such as in biological tissue, commonly implies attenuation obeying a frequency power-law. This kind of phenomenon may be described using a causal wave equation which incorporates fractional time derivatives:

    • @ericvosselmans5657
      @ericvosselmans5657 1 year ago +3

      and a lot more. The wiki page on fractional calculus is a good starting point

  • @clearmath2359
    @clearmath2359 1 year ago +1

    I just LOVED this video! The animations, the topic and the way it's presented are just brilliant!

  • @NeilEnnis
    @NeilEnnis 1 year ago +1

    Congratulations on creating an entertaining video about a complex topic. I had never even thought of fractional calculus before watching this. Thank you!

  • @5c0ttyd
    @5c0ttyd 1 year ago +13

    One of the most well explained and interesting maths videos I've seen for a long time. I'd heard of fractional calculus before but have not done much calculus since I was an undergrad, 15 years ago. This was really enlightening. Thanks!

  • @Bjowolf2
    @Bjowolf2 1 year ago +30

    Great stuff - thank you 😊
    Personally I think it's easier to understand these half etc. derivatives in the spectral domain ( via Laplace or Fourier transforms ), where you differentiate by multiplying by a "damping" and / or "frequency"-variable ( S or omega ).
    The half derivative is then equivalent to multiplying by the squareroot of this variable ( S^(1/2) or Omega^(1/2) ) - i.e. amplifying the higher frequencies of the spectrum of the function by "half" of what the normal derivative does.
    And here applying this half derivative technique twice in a row will indeed produce the same result as the full derivative.

    • @DrunkenUFOPilot
      @DrunkenUFOPilot 1 year ago +4

      In audio, pink noise is the semi-integral of white noise. The filters are designed in the usual spectral space.
      I've enhanced images from Cassini using semi-derivatives, done in Fourier space, just as you describe. Most people who work in image processing, as far as I can tell, haven't heard of this technique.
      Fun stuff!

    • @jesusandrade1378
      @jesusandrade1378 3 months ago

      You should see Michael Penn's video on fractional derivatives that uses Laplace and Fourier transforms.

  • @rushildutta7830
    @rushildutta7830 1 year ago

    Such a great video! I was really amazed to see the results one can yield with fractional derivatives. Also, your animation skills are so on point. I love this video, so interesting!

  • @ivanshmarov2866
    @ivanshmarov2866 1 year ago +7

    Very neat video. Although I am not an expert on the subject myself, I would argue that the "non-locality" of this fractional derivative is precisely the byproduct of the Riemann-Liouville form, and therefore not a property of all fractional derivatives. For common functions, like x^n, exp and sin/cos, it is possible to construct fractional derivatives using analytic continuation, which are perfectly local since there is no dependence on the integration limit "a". In this case, for example, a fractional derivative of d^k / dx^k of cos(x) becomes cos(x+pi*k/2), which is exactly the shift along the x-axis.

    • @mr.gentlezombie8709
      @mr.gentlezombie8709 1 year ago +1

      Also for most other functions, you can write them as taylor series, which should still have the nice properties that polynomial fractional derivatives do.

  • @cyanmargh
    @cyanmargh 1 year ago +14

    I think a more beautiful way to find the half-derivative is using the Fourier transform: d^n g / dx^n = F^-1[F[d^n g / dx^n]] = F^-1[(iw)^n * F[g]]

    • @cyanmargh
      @cyanmargh 1 year ago +1

      @@DrDeuteron hm, not sure about the FT, but for the DFT it's possible, because the DFT is just matrix multiplication; all we need is a matrix raised to a fractional power. And this is possible with a Taylor series. Maybe there is something similar for the FT.
      Hope I wrote all that correctly.

    • @cyanmargh
      @cyanmargh 1 year ago +1

      @@angelmendez-rivera351 the matrix exponential and logarithm are defined, so there is no problem: M^x = exp(log(M)x)

    • @cyanmargh
      @cyanmargh 1 year ago

      @@angelmendez-rivera351 sorry, didn't notice that the formula ln M = sum of -(M-I)^k / k converges only for |M-I| < 1

  • @germaindesloges5862
    @germaindesloges5862 1 year ago +3

    Thanks so much! I wondered about this ever since I first learned about derivatives

  • @juchemz
    @juchemz 1 year ago +17

    Awesome video. I would have loved to see a graph of mu_t(x) at 19:40 to get an intuitive sense of how the scale factor depends on x and on the original curve, and how that gives rise to the asymptote.

    • @Pystro
      @Pystro 1 year ago +1

      Same. I would like to see a 3d plot with x and x~ on the horizontal axes and f(x) on the vertical one. Then applying that transform would mean projecting from the x,f(x) plane onto the "plane" that is defined by the curve x~(x), and looking at the area from the x~,f(x) direction.
      I suspect that could be related to a square root, something like sqrt(x), since picking up that factor twice would give x~=x, thus taking you back to the first order integral.

  • @apuji7555
    @apuji7555 1 year ago +2

    This is a really well-made video! Great job!

  • @bohanxu6125
    @bohanxu6125 1 year ago +6

    When I first heard of fractional derivatives, the first thing that came to my mind is that they can be defined after a Fourier transformation. A function becomes a vector of coefficients after the Fourier transformation. The differential operator simply becomes a linear operator. A fractional power of the operator can then be defined through L^n = U(D^n)U^-1, where D is L after diagonalization.
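
    A tiny numpy illustration of the L^p = U D^p U^-1 idea (an added sketch, not from the comment): build the derivative of periodic samples as an explicit matrix, diagonalize it, and take a fractional power of the eigenvalues.

        import numpy as np

        n = 64
        x = np.linspace(0, 2 * np.pi, n, endpoint=False)
        F = np.fft.fft(np.eye(n))                              # DFT matrix
        k = 2 * np.pi * np.fft.fftfreq(n, d=2 * np.pi / n)
        L = np.linalg.inv(F) @ np.diag(1j * k) @ F             # derivative operator as a matrix

        eigvals, U = np.linalg.eig(L)
        half = U @ np.diag(eigvals ** 0.5) @ np.linalg.inv(U)  # L^(1/2) via diagonalization

        f = np.sin(2 * x)
        # essentially zero: the half power applied twice reproduces the derivative 2*cos(2x)
        print(np.max(np.abs((half @ half @ f).real - 2 * np.cos(2 * x))))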

  • @rafaellisboa8493
    @rafaellisboa8493 1 year ago +3

    Just discovered your channel, you are very good at making these videos!! So interesting and well explained

  • @michaelzumpano7318
    @michaelzumpano7318 1 year ago

    Wow that was cool, and thought provoking. Your pace and detail were perfect for me. I’ll check out your other videos.

  • @rafaelortega1376
    @rafaelortega1376 1 year ago +2

    Amazing work. Perfect for IB students to write their internal assessment.

  • @bouipozz
    @bouipozz 1 year ago +3

    This makes the choice of notation behind fractional powers much clearer to me, thinking of square roots as half multiples makes a lot of sense.

  • @danielfarbowitz671
    @danielfarbowitz671 1 year ago +3

    Awesome! I've wondered about fractional derivatives, so it's nice seeing it laid out like this.

  • @omaral-absi5478
    @omaral-absi5478 1 year ago +1

    This channel has great potential! Keep it up

  • @SagiWriting
    @SagiWriting 1 year ago +1

    Excellent video! Liked it a lot.
    However, there is one mistake in the video that must be addressed: we DO know what fractional derivatives mean! It simply cannot be described as a "tangent line" or an "area under the curve".
    Think about it this way: 0-degree derivation means keeping the function as it is. 1-degree derivation means looking at CHANGES of the function. A fractional-degree derivation means how much MEMORY of the original function we want to keep.
    For example, a 1/2-degree derivative means we want to balance equally between remembering the original function and looking at its changes.
    A 1/3-degree derivative means we give 1/3 of the weight to the original function and 2/3 of the weight to changes.
    This is extremely helpful in many fields. Personally, I use it when researching the stock exchange. If I look at a stock price, 2 things interest me: the price of the stock and how it changes over time. In the end, predicting one of them will help me to predict the other. It has been shown that, many times, predicting a fractional derivative of the stock price is easier than predicting the full 1-degree derivative or the 0-degree derivative (the actual stock price)!
    So that's my point: we know what fractional derivatives mean - it's the balance between looking at the changes of the function and looking at the function as it is.

  • @mskellyrlv
    @mskellyrlv 1 year ago +2

    Wonderful exposition! I wish I had ever had a math professor who could explain things as well.

  • @jafetriosduran
    @jafetriosduran 1 year ago +8

    This is one of the greatest videos I've seen explaining fractional derivatives; maybe I'll do one approximating the operator using products of transfer functions. Your content is beautifully explained.

    • @julius4858
      @julius4858 1 year ago +1

      How many videos on fractional derivatives have you seen, lol

    • @jafetriosduran
      @jafetriosduran 1 year ago +1

      @@julius4858 in Spanish there are some like this

    • @jafetriosduran
      @jafetriosduran 1 year ago

      ua-cam.com/video/6zeF9bNalmI/v-deo.html from matesmike

  • @lenakuse6304
    @lenakuse6304 24 дні тому

    Wow! I just found this video and want to say thank you for helping me out with that stuff and making it fun at the same time! Great work

  • @tobylerone007
    @tobylerone007 Рік тому +1

    Mind blown! What a superb presentation :)

  • @pedroivog.s.6870
    @pedroivog.s.6870 4 місяці тому +10

    Just wait until they define irrational and complex derivatives

  • @atoolforcreativity1985
    @atoolforcreativity1985 Рік тому +4

    This is the best #SoME2 video I've seen so far!

  • @zhuolovesmath7483
    @zhuolovesmath7483 Рік тому

    Wonderful video. Concepts are very well explained, and the video ends with a deep reflection on modern mathematics. Truly well made!

  • @usernameisamyth
    @usernameisamyth Рік тому +1

    A lot to grasp but it was amazing
    Looking forward to your upcoming videos

  • @Gotonis
    @Gotonis Рік тому +12

    Alright, let's all agree that if one of us comes up with a novel and useful formulation of fractional derivatives it will be named the Voldemort Derivative

    • @morphocular
      @morphocular  Рік тому +5

      Well, it'd be better than naming yet another thing after Euler :)

    • @Kirillissimus
      @Kirillissimus Рік тому +2

      @@morphocular I strongly believe that the only proper fractional derivative should be referred to as the "Laplace derivative", since we get our derivatives by multiplying the Laplace image of a function by jω (or "s" or "p" or whatever else you like to call your Laplace operator). For fractional derivatives it makes no difference - for example, if you want a ½-derivative then you just multiply by the square root of jω. The meaning of the operation also becomes obvious if you look at it in the frequency domain.

  • @VanWarren
    @VanWarren Рік тому +6

    This is fantastic:
    the non-locality principle reminds me of attention in Transformers (Google's improvement over recurrent neural networks and LSTMs in machine learning) and of
    the convolution integral in Laplace transforms in control theory.
    The Transformer does use a periodic function (its sinusoidal positional encoding) to solve an internal sampling problem, and I am curious whether there is a relationship.

  • @vvvvaaaacccc
    @vvvvaaaacccc Рік тому +1

    It's interesting that there are multiple (infinitely many?) valid sequences of functions between a derivative/integral pair. Thanks for this thorough introduction!

  • @user-jr2fi7wq4z
    @user-jr2fi7wq4z 9 місяців тому

    Really amazing video!! Easily accessible and inspiring! Looking forward to more videos about this topic!

  • @sh4dow666
    @sh4dow666 Рік тому +11

    Very interesting video! I thought about this topic on my own a while ago, and when trying to derive the gamma function I had another interesting, somewhat related idea: instead of differentiating (taking the ratio of the y and x differences as dx approaches 0), you could "quotientate" a function by raising the quotient of two nearby y-values ("ry") to the power 1/dx, so the operator would be ry^(1/dx). For an exponential function, I'd expect the resulting function to be a constant, while for the gamma function, its "quotientation" should be a linear function. My school calculator always hung when I tried to derive gamma that way, but I'd find it really interesting to know whether there are any applications for this kind of operator, how well-defined it is, and whether that notion could be generalized further.
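
    A small numerical sketch of this "quotientation" idea as I read it (the names are made up, and I'm taking the operator to be (f(x+dx)/f(x))^(1/dx), by analogy with dy/dx):

        import math

        def quotientate(f, x, dx=1e-6):
            # "Quotientation" as described above: the (1/dx)-th power of the
            # ratio of two nearby y-values, for a small step dx.
            return (f(x + dx) / f(x)) ** (1.0 / dx)

        # For an exponential f(x) = e^(2x) the result is the constant e^2, about 7.389,
        # just as the ordinary derivative of a linear function is a constant.
        for x in (0.5, 1.0, 2.0):
            print(x, quotientate(lambda t: math.exp(2 * t), x))

    As far as I know this operator does exist in the literature as the geometric (multiplicative) derivative; for smooth positive f it equals exp(f'(x)/f(x)), and applied to the gamma function it gives exp(psi(x)), which does grow roughly linearly.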

    • @Bruno-el1jl
      @Bruno-el1jl Рік тому +3

      I love these kinds of ideas! It would be cool to see them explored in a video like this one.

  • @spinning_peridot3673
    @spinning_peridot3673 Рік тому +3

    I was at a conference a few months ago where, in a talk about plasma astrophysics, fractional derivatives were used in the formulas. I was surprised, since I (and apparently my colleagues) had never heard of this concept before. So this is a great explanation for me to see what's going on 🙂

    • @spinning_peridot3673
      @spinning_peridot3673 Рік тому +1

      @@sleepywatcher3 I tried to find the slides from the talk, and I actually did. Fractional derivatives were used in differential equations describing interstellar diffusion, more precisely in the terms that describe anomalous diffusion.

  • @Cyclone1335
    @Cyclone1335 Рік тому +1

    That was really fascinating. Thank you!

  • @shacharh5470
    @shacharh5470 Рік тому +1

    Excellent video, looking forward to more on the topic

  • @shahargov5014
    @shahargov5014 Рік тому +4

    Great video. I was wondering about fractional derivatives myself in the past. I came up with the following scheme: relying on the fact that the derivative operator translates to multiplication by frequency under the Fourier transform (FT), one can convert this complicated operation into a simple one in Fourier space. So, say, taking a "half" derivative of a function yields a new function that can be evaluated as follows: FT the function, so the function of time becomes a function of frequency f; multiply the result by the square root of 2*pi*f*i; then inverse-Fourier-transform the result.
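
    A quick numerical check of this recipe (a sketch assuming NumPy and a periodic, band-limited signal; the function name is made up). Note that this frequency-domain definition implicitly uses a lower bound of minus infinity rather than the 0 used in the video, which is why sine comes out as a clean phase shift here:

        import numpy as np

        def fourier_fractional_derivative(samples, alpha, dt):
            # Multiply the Fourier transform by (i * 2*pi*f)^alpha, then transform back.
            freqs = np.fft.fftfreq(len(samples), d=dt)
            multiplier = (2j * np.pi * freqs) ** alpha
            return np.fft.ifft(np.fft.fft(samples) * multiplier).real

        # Half-derivative of sin(t); applying it twice should recover cos(t).
        t = np.linspace(0, 2 * np.pi, 1024, endpoint=False)
        dt = t[1] - t[0]
        half = fourier_fractional_derivative(np.sin(t), 0.5, dt)
        once = fourier_fractional_derivative(half, 0.5, dt)
        print(np.max(np.abs(once - np.cos(t))))   # tiny, i.e. half + half = one full derivative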

    • @jackwilliams1468
      @jackwilliams1468 Рік тому +2

      This is what I was thinking. I tried using ladder operators from the quantum harmonic oscillator to get a sort of "half step" of a derivative; it's basically using derivatives as steps between Hermite polynomials instead of trig functions (this sounds ugly but it's just linear algebra). All of this felt a lot like finding half powers of operators to me.

    • @DrunkenUFOPilot
      @DrunkenUFOPilot Рік тому

      @@jackwilliams1468 If you're fooling around with Hermite polynomials and quantum harmonic oscillators, you have the tools for exploring the fractionally iterated Fourier transform! I hadn't thought of fractional derivatives in relation to the ladder operators, though.

  • @rohan.fernando
    @rohan.fernando Рік тому +11

    The property of ‘memory’ in fractional derivatives you’ve identified seems similar to the impact of initial/prior conditions on fractal iterative calculations.

    • @a.osethkin55
      @a.osethkin55 Рік тому

      Bingo

  • @user-iv9sz8dx1g
    @user-iv9sz8dx1g 8 місяців тому +1

    This is the best video about this topic ever... really, thank you.

  • @btf_flotsam478
    @btf_flotsam478 Рік тому

    I once took a series of lectures from a respected lecturer at a respectable institution on this topic, and this has explained it far better than the lecturer.

  • @yt2979a
    @yt2979a Рік тому +4

    Really great video!
    I wonder what you would get by applying the Fourier transform to the half integral?

  • @Quargos
    @Quargos Рік тому +19

    I'm a little disappointed that you didn't also show what happens if you tweak the bounds for fractional derivatives of sine / cosine.
    From what I've seen, including the weird behaviour, I'd expect that it should still just be a horizontal shift between the two, but it would have been nice to see that being the case.

    • @chiaracoetzee
      @chiaracoetzee Рік тому

      I mean, you can't move the bounds for sine to negative infinity since the integral becomes divergent, right? So it seems unavoidable. Unless maybe you take a limit of the whole thing as the lower bound approaches negative infinity?

    • @Theimtheimtheim
      @Theimtheimtheim Рік тому

      @@chiaracoetzee no it won't, as long as p

  • @chriskiwi9833
    @chriskiwi9833 Рік тому +1

    Unexpectedly interesting and rich with mystery and promise.

  • @dwipf1851
    @dwipf1851 Рік тому +2

    A -1/2-order differintegration ("semiintegration") is used somewhat often in electrochemical analysis of current-voltage curves (voltammograms). The idea is to use the semiintegral transform to remove the time dependence of diffusional mass transport to a planar electrode, thus allowing simplified analysis of the effects of adsorption, electrolyte resistance, electrode kinetics, etc. See "Semiintegral electroanalysis. Theory and verification", Morten Grenness and Keith B. Oldham, Analytical Chemistry 1972, 44 (7), 1121-1129, DOI: 10.1021/ac60315a037.
    I used the method myself during my PhD research.

  • @jonassattler4489
    @jonassattler4489 Рік тому +3

    Nice video. Interestingly, these fractional derivatives also have some important applications (not that they aren't interesting on their own).
    There is an important result in (mathematical) computed tomography which relies on them to show, roughly speaking, that the exact analytical solution for a certain computed tomography problem is actually not very good, because it amplifies errors in the source data. (The connection to the Fourier transform is very important there.)
    I think it is very interesting how these seemingly highly abstract concepts can actually be very important outside of mathematics itself.

  • @jeanjordaan9088
    @jeanjordaan9088 Рік тому +4

    Something about the visualization reminds me deeply of the Jacobian (for change-of-variables integration) combined with the convolution integral.
    Fascinating topic though. Good luck for SoME2!

  • @mathgeek420
    @mathgeek420 Рік тому +1

    Very nice visualizations and presentation! I liked it a lot.

  • @res0nanc320
    @res0nanc320 Рік тому +1

    This is excellent. You deserve to win!

  • @feynstein1004
    @feynstein1004 Рік тому +5

    Damn, I love math 😊 and I kind of had a brainstorm at 4:18. All continuous functions can be integrated but not all continuous functions can be differentiated. I think that's because integrals are just geometric/algebraic manifestations of the arithmetic process of multiplication whereas derivatives are so for division. And as we know, any number can be multiplied with any other number. However, the same is not true for division (since you can't divide by zero). This asymmetry between multiplication and division directly results in the asymmetry between differentiability and integrability imo.

  • @bramilan
    @bramilan Рік тому +5

    Good lord!
    As a non-mathematician, I've been asking myself this question for more than 15 years. Well, not every day, but still...
    I couldn't find anything more than the obvious polynomials with the gamma function.
    Now you've answered my question.
    I didn't get all the details because I was walking down the street while listening to it, but thank you!

  • @brendankane1879
    @brendankane1879 Рік тому

    So interesting - I know nothing about any of this but you make it fascinating - I bet you could save the world with this stuff.

  • @NoNTr1v1aL
    @NoNTr1v1aL Рік тому +1

    Absolutely amazing video! Subscribed.

  • @joellleoj
    @joellleoj Рік тому +3

    If you take an ordinary integral of a discontinuous function you get a continuous but not smooth function. What if you take the 1/2 integral of a discontinuous function? Is it discontinuous, continuous, or "half continuous"? "How much integral" do you need to make a discontinuous function continuous? 1 (whole) integral?, 0 + epsilon integral?, some "amount of integral" bounded away from 0 or 1? Similar questions for 3/2 integrals and smoothness.

  • @MiroslawHorbal
    @MiroslawHorbal Рік тому +5

    This summer of math has been excellent. Thank you for this, I think you explained the concept pretty clearly ☺️

  • @mobilephil244
    @mobilephil244 Рік тому

    Brilliant explanation. Thanks. Yes, please, more more more.

  • @richardnineteenfortyone7542
    @richardnineteenfortyone7542 Рік тому +2

    Like Sqrt(s) in Laplace transforms. Good stuff!

  • @gbeziuk
    @gbeziuk Рік тому +3

    It feels very natural to say that fractional derivatives have a huge quantum mechanics smell: they become simple and local when they _collapse_ into rare stable configurations where the nonlocality of the partials is cancelled out. I guess it's the key to making sense of 'em.

    • @Pystro
      @Pystro Рік тому +3

      I was thinking along the same lines. Maybe some quantum mechanical concepts can be formulated in terms of fractional derivagrals.
      I mean, half-spin particles need to be rotated by 360° *twice* to return to their original state. That kind of seems like half derivagrals could be involved.

    • @gbeziuk
      @gbeziuk Рік тому +2

      @@Pystro you might also like Geometric (Grassman/Clifford/Hestenes) Algebra. It kinda has the same quantum/relativity smell.

  • @WhiterockFTP
    @WhiterockFTP Рік тому +4

    Very cool - one thing I'd like to add: please make the footnotes that sometimes appear on the bottom right stay for longer - it's very hard to catch them, i.e. to press pause in time to read them :)

    • @morphocular
      @morphocular  Рік тому

      Ok, will do! Thanks for the feedback

    • @WhiterockFTP
      @WhiterockFTP Рік тому

      @@morphocular nice thank you! I think 1-2 seconds should already be enough to press pause.

  • @armanko95
    @armanko95 Рік тому

    Really nice video! What I did in order to get the coefficient right at 2:54 was to think of the more general formula d^m/dx^m (x^n), which is n!/(n-m)! x^(n-m). I love how looking at things in a more general way helps with defining new notions of familiar concepts.
    With the factorials and the halves and everything, it's no wonder that the gamma function pops out.
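
    For anyone who wants to check that generalization numerically, here is a tiny sketch (the function name is made up) of the power rule with the factorials swapped for the gamma function, n!/(n-m)! -> Gamma(n+1)/Gamma(n-m+1):

        from math import gamma, sqrt, pi

        def fractional_power_rule(n, m, x):
            # d^m/dx^m of x^n = Gamma(n+1) / Gamma(n-m+1) * x^(n-m), valid for fractional m too.
            return gamma(n + 1) / gamma(n - m + 1) * x ** (n - m)

        print(fractional_power_rule(2, 1, 3.0))     # ordinary derivative of x^2 at x = 3: 6.0
        print(fractional_power_rule(1, 0.5, 1.0))   # half-derivative of x at x = 1
        print(2 / sqrt(pi))                         # the known closed form, about 1.128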

  • @danielgrieser6668
    @danielgrieser6668 Рік тому +2

    Very nice video, thanks. Here is a slightly different perspective on this: The formula for the n-fold integral (say with a=0) is really a convolution. More precisely, let chi^p(t) = t^p / Gamma(p+1) for positive t, and = 0 for negative t. Then the n-fold integral of f is just chi^(n-1) * f (where * denotes convolution of two functions). Now the formula for chi^p makes sense not just for natural numbers p but for any real number p > -1 (and even for any complex number whose real part is > -1) -- this condition ensures that the Gamma function is defined and that the integral converges, as explained in the video. So this defines fractional integrals.
    To get fractional derivatives one can proceed as in the video, or take a perspective from distribution theory (here I mean the distributions = generalized functions from analysis, not those from probability theory or geometry!). Namely, the chi^p form a holomorphic family of distributions in the complex half plane where the real part of p is > -1, and this family can be extended analytically to the whole complex plane (see below).
    Then it turns out that, for example, chi^(-1) = delta and chi^(-2) = delta' (where delta is the Dirac delta distribution). Now delta * f = f and delta' * f = f'. The latter formula shows that the first derivative of f is given by convolution with chi^(-2). So it makes sense to call chi^(-1.5) * f the half-derivative of f. And similarly for any complex number q you could call chi^(-1-q) * f the q'th derivative of f.
    This sounds quite different from what is done in the video but is actually the same in disguise (or maybe without the cloak, since it makes the deeper underlying concepts -- like convolutions and distributions -- clear). This becomes clear if one takes a peek under the hood of how the analytic continuation of chi^p is done: first, observe that chi^p = (chi^(p+1))' for Re p > -1. Then use this formula to define chi^p for the larger set where Re p > -2 -- this works since then Re (p+1) > -1, so chi^(p+1) is defined, and any distribution can be differentiated! Now that you have defined chi^p for all p with Re p > -2, you can use the same trick to define it for Re p > -3, etc.
    (If this looks like magic to you -- it is; the same trick is actually used in other places, like the meromorphic continuation of the Gamma function.)
    If you do this for p = -1.5, you get precisely the definition of the half-derivative as given in the video.
    A reference for these things is the book by Hörmander: The Analysis of Linear Partial Differential Operators, I, Section 3.2 (not easy reading but quite ingenious if you get at it).
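
    A minimal numerical sketch of this convolution point of view (assuming NumPy; the function name, step size, and sanity checks are my own choices): convolving the samples with the kernel chi^(alpha-1)(t) = t^(alpha-1)/Gamma(alpha) gives the order-alpha fractional integral with lower bound at the start of the array.

        import numpy as np
        from math import gamma, sqrt, pi

        def fractional_integral(samples, alpha, dt):
            # Discrete convolution with chi^(alpha-1)(t) = t^(alpha-1) / Gamma(alpha);
            # the grid starts at dt to sidestep the t = 0 singularity when alpha < 1.
            t = np.arange(1, len(samples) + 1) * dt
            kernel = t ** (alpha - 1) / gamma(alpha)
            return np.convolve(samples, kernel)[: len(samples)] * dt

        f = np.ones(1000)
        dt = 1e-3
        print(fractional_integral(f, 1.0, dt)[-1])   # ordinary integral of 1 up to t = 1: about 1.0
        print(fractional_integral(f, 0.5, dt)[-1])   # half-integral of 1 at t = 1
        print(2 / sqrt(pi))                          # exact value, about 1.128 (up to discretization error)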

  • @lightningblender
    @lightningblender Рік тому +24

    Some thoughts:
    If gamma approaches infinity and the integral does too, then does there exist a limiting process such that those things „cancel“ and the derivegral appears…
    When fractional integrals are computed using convolution with a polynomial, is there an analogy in signal processing to make sense of it?

    • @alpers.2123
      @alpers.2123 Рік тому +1

      In the Laplace domain, fractional derivatives correspond to fractional powers of s in the s-plane. They can be approximated with an infinite sum of integer powers. But I'm not sure how that can be interpreted either.

  • @eliteteamkiller319
    @eliteteamkiller319 Рік тому +3

    Math gets truly interesting when you get to upper division and graduate stuff.

    • @navjotsingh2251
      @navjotsingh2251 Рік тому

      It really does; fractional calculus and Galois theory are two graduate-level topics I'm learning, and I love them both.

  • @roberttelarket4934
    @roberttelarket4934 11 місяців тому

    An absolutely ingenious idea, a fractional derivative, that I had never even conceived of!!!!!

  • @toumanisidibe3602
    @toumanisidibe3602 Рік тому

    Very comprehensive video. Loved it.

  • @p07a
    @p07a Рік тому +3

    Would the fractional derivative of sine truly be just a time shift if the lower bound were -inf instead of the 0 used in the video?

  • @MrRenanwill
    @MrRenanwill Рік тому +3

    Not being a local concept makes it a mess, at least for me.

  • @alizabethfoster7410
    @alizabethfoster7410 Рік тому

    Wow! I never even thought of this, but it’s so fascinating. I kind of wish there was a website that would let me play around with fractional calculus on different functions.

  • @benheideveld4617
    @benheideveld4617 Рік тому

    A triumph!! One of the very best math videos I have ever seen.

  • @xenontesla122
    @xenontesla122 Рік тому +11

    Very interesting! I wonder if this is equivalent to taking the inverse Laplace transform of (s^n)*F(s), where n is 1/2 for a "half derivative". It should obey the same rules, where multiplying by s^1/2 twice gives you the derivative… if the function starts at zero. I have no idea how the constant terms or n>1 would work. :/

  • @asnierkishcowboy
    @asnierkishcowboy Рік тому +6

    I remember attending an algebra seminar on local cohomology and a book by Lyubeznik, where some dude introduced the Weyl algebra (of some differential operators). I think that this thing can somehow be extended to, or falls into the class of, some algebraic structure which kinda admits "half derivatives", in the sense that you can consider, let's say, S as a ring extension of a ring R and simply allow an element d from R to ramify in S with order 2. Of course d is a differential operator.

  • @dassbah5316
    @dassbah5316 Рік тому

    Awesome video!
    I would love to see a follow-up regarding dimensional regularization:
    a tool often used in quantum field theory, where you replace the integral over d = 4 spacetime dimensions with one over a complex dimension d = 4 - epsilon and later take the limit of epsilon to 0.

  • @mikealche5778
    @mikealche5778 Рік тому +1

    This is incredibly amazing, thank you very very much!!!