Neural Ordinary Differential Equations

  • Published Jun 5, 2024
  • If you would like to see more videos like this please consider supporting me on Patreon - / andriydrozdyuk
    The PDF of the slides used in this presentation can be downloaded here: github.com/drozzy/Neural-Ordi...
    0:00 - Outline of the presentation
    0:38 - Some Cool Results
    2:12 - What is a Neural ODE? (Machine Learning Part)
    12:15 - Connection to Dynamical Systems
    14:26 - Dynamical Systems
    20:03 - Pendulum, Example of a Dynamical System
    23:22 - Adjoint Method
    28:45 - Adjoint Method Proof
    30:49 - Gradients w.r.t. theta
    32:40 - Complete Backprop Algorithm
    34:27 - Concluding Remarks
  • Science & Technology

COMMENTS • 19

  • @keb785
    @keb785 1 month ago +1

    This is very helpful; I appreciate it as it provides a comprehensive review with detailed explanations

  • @shorray
    @shorray 3 years ago +21

    Hell YEAH! I was fighting for like 2 weeks with the adjoint method and nobody really explained it in this much detail. Thanks a lot, keep going!

    • @kodfkdleepd2876
      @kodfkdleepd2876 1 year ago

      Or maybe it just took you 2 weeks to get it and you just happen to be watching this video when it "clicked"?

  • @tanmayahmed4622
    @tanmayahmed4622 3 years ago +9

    Thank you very much, Sir. This is by far the easiest explanation of neural ODEs.

  • @vishwajitkumarvishnu3878
    @vishwajitkumarvishnu3878 3 years ago +6

    Best video/blog so far on neural ODEs

  • @mohamedmusa7149
    @mohamedmusa7149 2 years ago

    Excellent exposition of the paper! Thank you.

  • @fbf3628
    @fbf3628 1 year ago

    This is a truly great explanation!

  • @jishnuak3000
    @jishnuak3000 1 year ago

    Thanks for explaining the proof, couldn't find it anywhere else

  • @siddharthshrivastava5823
    @siddharthshrivastava5823 2 years ago +1

    Awesome explanation!!

  • @mswification
    @mswification 1 year ago +1

    I agree with all the previous comments, this was a terrific explanation. I particularly appreciated that you included details of the proof of the adjoint method.

    • @Rjsipad
      @Rjsipad 1 year ago

      Could you explain the difference between lower-case f and theta? I'm a bit confused as to how they are different.
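A hypothetical sketch of the distinction (my own illustration, not taken from the video): in the dynamics dz/dt = f(z, t, theta), lower-case f is the *function* that computes the time derivative of the state z, while theta is the bundle of *learnable parameters* (weights) that f uses internally. Changing theta changes which dynamics the same f represents.

```python
import math

def f(z, t, theta):
    """A one-neuron 'network' as the dynamics: dz/dt = tanh(w*z + b)."""
    w, b = theta                 # theta bundles all the weights used inside f
    return math.tanh(w * z + b)  # the value of the derivative at state z, time t

theta = (0.5, 0.1)                       # one particular setting of the parameters
dzdt = f(z=1.0, t=0.0, theta=theta)      # evaluating f at state z = 1, time t = 0
```

So f stays fixed in form, and training a Neural ODE means adjusting theta so that the trajectories generated by f fit the data.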

  • @hannes7218
    @hannes7218 10 months ago

    great explanation! :)

  • @bwan03
    @bwan03 3 years ago +2

    Brilliant! You're really good at explaining I must say. Excellent job! May I please ask what you used for presentation and drawing equations Andriy?

    • @AndriyDrozdyuk
      @AndriyDrozdyuk  3 years ago +1

      Thanks! I think it was GoodNotes with ipad screen recording and apple pencil. (I just cut the surrounding window borders in the final recording)

  • @francoisgauthier-clerc6413
    @francoisgauthier-clerc6413 3 years ago

    Good job, very clear explanation!
    However, it's a pity that you didn't show an implementation of the function f. How can we design and implement such a continuous function?

    • @AndriyDrozdyuk
      @AndriyDrozdyuk  3 years ago +1

      Oh, that function doesn't exist - it's just for explanation purposes. That's basically what the ODE solver does.
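A minimal sketch of what that reply means (my own illustration, under the assumption of a fixed-step Euler solver): f itself is an ordinary discrete function, here a tiny one-neuron network, and it is the ODE *solver* that produces the continuous-looking trajectory by evaluating f repeatedly. Real Neural ODE code would typically hand f to an adaptive solver such as torchdiffeq's odeint rather than hand-rolled Euler.

```python
import math

def f(z, t, theta):
    """Dynamics dz/dt = tanh(w*z + b): just an ordinary function."""
    w, b = theta
    return math.tanh(w * z + b)

def euler_odeint(f, z0, t0, t1, theta, steps=100):
    """Fixed-step Euler integration of dz/dt = f(z, t, theta) from t0 to t1."""
    z, t = z0, t0
    h = (t1 - t0) / steps
    for _ in range(steps):
        z = z + h * f(z, t, theta)  # repeated discrete evaluations of f
        t = t + h
    return z

# The 'continuous function' z(t) is never written down explicitly;
# the solver approximates its value at t1 from many small steps.
z1 = euler_odeint(f, z0=0.0, t0=0.0, t1=1.0, theta=(1.0, 0.5))
```

The design point: shrinking the step size (or using an adaptive solver) makes the approximation of the continuous trajectory as accurate as desired, without f itself ever being "continuous" code.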

  • @tenkunvan60
    @tenkunvan60 16 days ago

    28:44, I think the backward equation of the adjoint method might be wrong and the integral term should be negative.
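For readers wanting to check the sign, these are the adjoint-method equations as stated in the original Neural ODE paper (Chen et al., 2018), quoted here as a reference point rather than as a transcription of the slide; note that both the adjoint dynamics and the parameter-gradient integral carry a minus sign (the integral runs backward, from t_1 to t_0):

```latex
a(t) = \frac{\partial L}{\partial z(t)}, \qquad
\frac{d a(t)}{dt} = -\, a(t)^{\top} \frac{\partial f(z(t), t, \theta)}{\partial z}, \qquad
\frac{dL}{d\theta} = -\int_{t_1}^{t_0} a(t)^{\top} \frac{\partial f(z(t), t, \theta)}{\partial \theta}\, dt
```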

  • @XupyachkaX
    @XupyachkaX 1 year ago

    Hello, Andriy!
    I still can't understand: why run neural networks for differential equations?
    Differential equations are built on physical laws, while neural networks are more like clever interpolations that only work in the narrow range of values they were trained on.
    Or can a neural network derive a compact differential equation from observations of a dynamical system, i.e. solve the inverse problem?

  • @danielschwegler5220
    @danielschwegler5220 1 year ago

    Thanks for the superb explanation!