The best way to start with this particular line of Nathan Kutz's work is to check out the 2017 paper "Data-driven discovery of partial differential equations" (with Steven Brunton).
Excellent presentation of a fascinating subject. A question motivated by old-fashioned econometrics: you mention the A matrix is "limited by imagination"; if you introduce two highly correlated variables into a linear regression, OLS will have trouble distinguishing one from the other. Won't that be a problem here too?
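A toy sketch of the concern raised above (my own illustration, not from the lecture): when two candidate regressors are nearly collinear, the individual least-squares coefficients become ill-determined even though the fit itself stays good. The data and coefficients here are made up for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + 1e-3 * rng.normal(size=n)      # x2 is almost identical to x1
y = 2 * x1 + 3 * x2 + 0.1 * rng.normal(size=n)

X = np.column_stack([x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# The design matrix is badly conditioned, so the split between the two
# coefficients is unstable; only their sum (which multiplies the shared
# direction x1 ≈ x2) is pinned down near 2 + 3 = 5.
print(np.linalg.cond(X))   # very large condition number
print(beta, beta.sum())
```

This is why sparsity-promoting regression, as used in this line of work, helps: it pushes the solver to pick one of the correlated candidate terms rather than spreading weight across both.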
What if I want to model non-periodic functions, e.g., the dynamics of a fermentation process (bioreactor)? How is this different from fitting a neural network?
It's an old question, but I'll answer it anyway: in this case Prof. Kutz used the central difference method, which is a better approximation of the derivative than the forward difference. Check the Taylor expansions of the terms to see this. ;)
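A quick numerical check of the claim above (a minimal sketch, not Prof. Kutz's actual code): the forward difference has error O(h), while the central difference has error O(h²), so for the same step size the central difference is far more accurate.

```python
import numpy as np

def forward_diff(f, x, h):
    # One-sided approximation: error is O(h)
    return (f(x + h) - f(x)) / h

def central_diff(f, x, h):
    # Symmetric approximation: odd-order Taylor terms cancel, error is O(h^2)
    return (f(x + h) - f(x - h)) / (2 * h)

x, h = 1.0, 1e-3
exact = np.cos(x)                        # derivative of sin at x
err_fwd = abs(forward_diff(np.sin, x, h) - exact)
err_cen = abs(central_diff(np.sin, x, h) - exact)
print(err_fwd, err_cen)  # central-difference error is orders of magnitude smaller
```

Halving h roughly halves the forward-difference error but cuts the central-difference error by a factor of four, consistent with the Taylor analysis.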
I am learning quite a bit from these videos
Interesting and Clear Explanation. Thanks for uploading and sharing.
Will you please categorize the subjects you have taught, so we can study them with more focus? Thanks.
You can see that on his website; he organizes the subjects well there.
This is amazing!
doesn't work with small parameters
In the code, shouldn't it be x1dot(j) instead of x1dot(j-1)?
Thank you!
After you tried hard to clean that board, I saw a mark in the middle, thought it was on my monitor, and started rubbing it.