Neural Ordinary Differential Equations
- Published Jun 5, 2024
- If you would like to see more videos like this, please consider supporting me on Patreon: / andriydrozdyuk
The PDF of the slides used in this presentation can be downloaded here: github.com/drozzy/Neural-Ordi...
0:00 - Outline of the presentation
0:38 - Some Cool Results
2:12 - What is a Neural ODE? (Machine Learning Part)
12:15 - Connection to Dynamical Systems
14:26 - Dynamical Systems
20:03 - Pendulum, Example of a Dynamical System
23:22 - Adjoint Method
28:45 - Adjoint Method Proof
30:49 - Gradients w.r.t. theta
32:40 - Complete Backprop Algorithm
34:27 - Concluding Remarks
Category: Science & Technology
This is very helpful; I appreciate it as it provides a comprehensive review with detailed explanations
Hell YEAH! I was fighting for like 2 weeks with the adjoint method and nobody really explained it in this much detail. Thanks a lot, keep going!
Or maybe it just took you 2 weeks to get it and you just happen to be watching this video when it "clicked"?
Thank you very much, Sir. This is by far the easiest explanation of neural ODEs.
Best video/blog so far on neural ODEs
Excellent exposition of the paper! Thank you.
This is a truly great explanation!
Thanks for explaining the proof, couldn't find it anywhere else
Awesome explanation!!
I agree with all the previous comments, this was a terrific explanation. I particularly appreciated that you included details of the proof of the adjoint method.
Could you explain the difference between lowercase f and theta? I'm a bit confused as to how they are different.
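For anyone puzzled by the same question: f is the function defining the dynamics (the network's forward pass), while θ is the set of learnable weights that f closes over. A minimal sketch, with a hypothetical one-layer f whose weights W and b stand in for θ:

```python
import numpy as np

# theta: the learnable parameters (here a hypothetical single layer).
theta = {"W": np.array([[0.0, 1.0], [-1.0, 0.0]]), "b": np.zeros(2)}

def f(z, t, theta):
    # f: the *function* giving dz/dt = f(z, t; theta). Training never
    # changes f's formula, only the numbers inside theta.
    return np.tanh(theta["W"] @ z + theta["b"])

dzdt = f(np.array([1.0, 0.0]), 0.0, theta)  # the vector field at one state
```

So "learning f" in a Neural ODE really means adjusting θ; the same θ defines the vector field at every time t.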
great explanation! :)
Brilliant! You're really good at explaining I must say. Excellent job! May I please ask what you used for presentation and drawing equations Andriy?
Thanks! I think it was GoodNotes with iPad screen recording and an Apple Pencil. (I just cut the surrounding window borders in the final recording.)
Good job, very clear explanation!
However, it's a shame that you didn't introduce some implementation of the function f. How can we design and implement such a continuous function?
Oh, that function doesn't exist; it's just for explanation purposes. This is basically what the ODE solver does.
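To make that reply concrete, here is a sketch of what a fixed-step ODE solver "basically does" with f: it queries the vector field repeatedly and combines the slopes. This uses classic RK4 in plain NumPy, with a hypothetical fixed tanh layer standing in for the trained network (real Neural ODE implementations use adaptive solvers instead):

```python
import numpy as np

W = np.array([[0.0, 1.0], [-1.0, 0.0]])  # stand-in "network" weights

def f(t, z):
    # The learned vector field dz/dt = f(z, t); here a fixed tanh layer.
    return np.tanh(W @ z)

def rk4_solve(f, z0, t0, t1, steps=100):
    """Integrate dz/dt = f(t, z) from t0 to t1 with fixed-step RK4:
    four queries of f per step, combined into one state update."""
    z, t = np.asarray(z0, dtype=float), t0
    h = (t1 - t0) / steps
    for _ in range(steps):
        k1 = f(t, z)
        k2 = f(t + h / 2, z + h / 2 * k1)
        k3 = f(t + h / 2, z + h / 2 * k2)
        k4 = f(t + h, z + h * k3)
        z = z + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t = t + h
    return z

z1 = rk4_solve(f, [1.0, 0.0], 0.0, 1.0)  # the ODE block's output state
```

The "continuous function" is never written down in closed form; the solver only ever needs pointwise evaluations of f, which is exactly what a network's forward pass provides.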
28:44, I think the backward equation of the adjoint method might be wrong; the integral term should be negative.
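For reference on that sign question, the adjoint equations as stated in the Chen et al. (2018) paper carry a minus sign on both the backward dynamics and the parameter-gradient integral (which runs from t_1 back to t_0):

```latex
a(t) = \frac{\partial L}{\partial z(t)}, \qquad
\frac{d a(t)}{dt} = -\, a(t)^{\top} \frac{\partial f(z(t), t, \theta)}{\partial z}
```

```latex
\frac{dL}{d\theta} = -\int_{t_1}^{t_0} a(t)^{\top}
\frac{\partial f(z(t), t, \theta)}{\partial \theta} \, dt
```

Flipping the integration limits to run forward from t_0 to t_1 absorbs the minus sign, which is a common source of sign confusion between write-ups.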
Hello, Andriy!
Still, I can't understand: why run neural networks for differential equations?
Differential equations are built on physical laws, while neural networks are more like clever interpolations that only work in a narrow, trained range of values.
Or can a neural network derive a compact differential equation from observations of a dynamical system, solving the inverse problem?
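A numeric sketch of the inverse-problem idea raised in this question, under a simplifying assumption: here the law is a known parametric form dz/dt = -k·z and we recover k from trajectory observations alone; a Neural ODE does the analogous thing with a network in place of the known right-hand side:

```python
import numpy as np

# Observations of a hypothetical system obeying dz/dt = -k*z.
k_true = 0.7
t = np.linspace(0.0, 5.0, 200)
z = np.exp(-k_true * t)          # measured trajectory z(t)

# Inverse problem: estimate k purely from the data.
dz_dt = np.gradient(z, t)        # numerical time derivative
k_est = -np.mean(dz_dt / z)      # average of -(dz/dt)/z, which equals k
```

The estimate tightens as the time grid is refined; with an unknown functional form, one replaces the -k·z term by a network and fits its weights the same way, by matching the observed dynamics.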
Thanks for the superb explanation!