In case it is helpful, I have all my 'Vector Differential Calculus' videos in a single playlist at ua-cam.com/play/PLxdnSsBqCrrF92Lr54CSJkSlpNWPTpkel.html. Please let me know what you think in the comments. Thanks for watching!
AA516: This is a good refresher on where some of these numerical schemes come from! I am taking a CFD class right now, so they are especially relevant
Same, let's hope we graduate heheh
AA516: This is what I really need, since I haven't learned calculus formally before. This video was helpful for organizing my thoughts again about what the derivative is.
[AE 512] 19:16
Always appreciate that you include the different terms or nomenclature that might be used to refer to equations or ideas.
AE 512: Wow, very interesting lecture! I was not aware of the symmetric derivative and have been using the Newton quotient for most of my work; glad to now know this method.
AE 501: Partial derivatives are a complicated concept when it comes to understanding them from a numerical point of view. Thank you so much for making this easier for us.
AE512: I like these math-heavy videos. They're a nice breather from strictly engineering problems and allow you to expand the mind a little differently.
I just realized the time you uploaded this video was when covid was rampaging across the globe. Hearing you cough while recording this indoors brings back so many memories. I hope you are doing well now. Your lecture was awesome, by the way.
AA516: This is interesting. I intuitively used the Newton quotient technique (without knowing it was even called the 'Newton quotient') to create my gradient function for the past homework problem: minimizing a two-dimensional cost function using really, really small and identical perturbations for each dimension. I watched your gradient descent "Optimization 04" video to help me use my gradient function to minimize the cost function, but I had to figure out how to build the gradient function intuitively from my past partial-derivative knowledge. Although my gradient descent optimization was good and I was able to make it converge to a local minimum, I feel I could have created a more efficient, more accurate, and overall much better gradient function had I watched this lecture before being presented with a problem like Homework 07 Problem 1 Part B. Because of that experience, though, I was able to skip most of this lecture and jump straight to key topics such as the symmetric difference quotient and the possibility of varying the perturbation size dX. Other than that, thanks for the lecture, Professor Lum.
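For anyone curious, here is a minimal sketch of the two quotients side by side (the test function, evaluation point, and step size are arbitrary illustrations, not from the video):

```python
import numpy as np

def f(x):
    # arbitrary test function, not the one used in the lecture
    return x**2 * np.sin(x)

def newton_quotient(f, x, dx=1e-6):
    # forward (one-sided) difference: [f(x + dx) - f(x)] / dx
    return (f(x + dx) - f(x)) / dx

def symmetric_quotient(f, x, dx=1e-6):
    # symmetric (central) difference: [f(x + dx) - f(x - dx)] / (2 dx)
    return (f(x + dx) - f(x - dx)) / (2 * dx)

x0 = 1.5
exact = 2 * x0 * np.sin(x0) + x0**2 * np.cos(x0)  # analytical derivative for comparison
print(newton_quotient(f, x0), symmetric_quotient(f, x0), exact)
```

For the same dx, the symmetric quotient is generally the more accurate of the two, since its truncation error scales with dx squared rather than dx.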
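For readers following along, here is a minimal sketch of the kind of forward-difference (Newton quotient) gradient described above, with the same perturbation used in every dimension (the cost function and step size are placeholders, not the actual homework problem):

```python
import numpy as np

def numerical_gradient(f, x, dx=1e-6):
    """Forward-difference (Newton quotient) gradient, using the same
    small perturbation dx for every dimension of x."""
    x = np.asarray(x, dtype=float)
    f0 = f(x)
    grad = np.zeros_like(x)
    for i in range(x.size):
        x_pert = x.copy()
        x_pert[i] += dx
        grad[i] = (f(x_pert) - f0) / dx
    return grad

# placeholder two-dimensional cost function (not the homework's)
cost = lambda x: (x[0] - 1.0)**2 + 3.0 * (x[1] + 2.0)**2
print(numerical_gradient(cost, [0.0, 0.0]))   # approximately [-2, 12]
```

A gradient descent loop can then call this in place of an analytical gradient, e.g. x = x - alpha * numerical_gradient(cost, x).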
AA516: A good refresher! I actually ended up using this in the previous homework haha
AE512: Numerical methods for solving PDEs have really augmented humanity's ability to predict and understand complex systems.
AA516: Seeing how perturbations can play into partial derivatives and the Newton Quotient was super exciting. I can see how this can tie into control inputs and when building a Simulink model like we've been doing.
Thank you for this video. I've been trying to find a numerical approach for the gradient of my error function, and I saw this. Nice explanation.
Great information! Thank you!
Hi Romeo,
Thanks for the kind words, I'm glad you enjoyed the video. If you find these videos helpful, I hope you'll consider supporting the channel via Patreon at www.patreon.com/christopherwlum or via the 'Thanks' button underneath the video. Given your interest in this topic, I'd love to have you as a Patron as I'm able to talk/interact personally with all Patrons. Thanks for watching!
-Chris
AE 512: Great refresher.
AA 516: I found it interesting that, when numerically calculating the gradient, the choice of perturbation distance depends on the behavior of the function and does not necessarily have to be the same for different variables.
Thank you for this clip; it makes me more interested in control theory.
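A small variation on the forward-difference sketch above, assuming (hypothetically) that you want a different step size for each variable:

```python
import numpy as np

def numerical_gradient_per_variable(f, x, dx):
    """Central-difference gradient where dx is a vector: one perturbation
    size per variable, chosen to match how the function behaves in that direction."""
    x = np.asarray(x, dtype=float)
    dx = np.asarray(dx, dtype=float)
    grad = np.zeros_like(x)
    for i in range(x.size):
        step = np.zeros_like(x)
        step[i] = dx[i]
        grad[i] = (f(x + step) - f(x - step)) / (2 * dx[i])
    return grad

# illustrative only: a tiny step for the rapidly varying variable,
# a larger step for the slowly varying one
f = lambda x: np.sin(100.0 * x[0]) + 0.01 * x[1]**2
print(numerical_gradient_per_variable(f, [0.2, 5.0], dx=[1e-7, 1e-3]))
```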
AE 512: Good refresher!
AE 512: The plots really helped with visualizing the step-size reasoning.
AA516 - Fantastic lecture, this was very straightforward
AA516: I liked the plots that described why the dx magnitude can make a big difference.
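For anyone who wants to see that effect directly, here is a minimal sketch (arbitrary test function and step sizes) showing the forward-difference error first shrinking as dx decreases and then growing again once floating-point round-off takes over:

```python
import numpy as np

f = lambda x: np.exp(x)           # arbitrary test function; its exact derivative is also exp(x)
x0, exact = 1.0, np.exp(1.0)

for dx in [1e-1, 1e-4, 1e-8, 1e-12, 1e-15]:
    approx = (f(x0 + dx) - f(x0)) / dx
    # large dx: truncation error dominates; tiny dx: round-off error dominates
    print(f"dx = {dx:.0e}   error = {abs(approx - exact):.2e}")
```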
Jason-AE512: The video helped me solve one problem in the homework. Thanks, professor.
Thank you!
Thank you, very nice explanation.
I'm glad it was helpful. There are other similar videos on the channel; please feel free to check them out and let me know what you think in the comments. Thanks for watching!
Your coughing is bad, dude. But very nice explanation; you've got a new subscriber in me ;) And by the way, you've got x1^2 + x2 in the function of two variables after taking the derivative; it should be x1.
Thanks for the video, very neatly explained. I have a question: what if I have experimental data for F(x,y) and don't have an analytical expression? Can you shed some light on this?
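One common answer (a sketch, not from the video, and assuming the measurements sit on a regular grid; the spacing and values below are made up) is to apply the same difference quotients directly to the tabulated data, for example with numpy.gradient:

```python
import numpy as np

# made-up measurement grid: F sampled at x spacing 0.5 and y spacing 0.25,
# standing in for experimental data with no analytical expression
x = np.arange(0.0, 5.0, 0.5)
y = np.arange(0.0, 2.0, 0.25)
X, Y = np.meshgrid(x, y, indexing="ij")
F = X**2 * np.sin(Y)              # pretend these are measured values

# numpy.gradient uses central differences in the interior and
# one-sided differences at the edges of the table
dF_dx, dF_dy = np.gradient(F, 0.5, 0.25)
print(dF_dx[3, 4], dF_dy[3, 4])
```

If the data are noisy or scattered rather than gridded, fitting a smooth surface first and differentiating the fit is usually the safer route.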
AA516: Allie S
AA516: Ojasvi Kamboj
The gradient with respect to x1 at x0 equals 0.75, but I really don't know where that is located?
This is good in theory, but I know the libraries I use only take steps in one direction to find the Jacobian. I use Mathcad and Python; Mathcad only has a few ways to minimize a function, while Python's scipy has more. I know that if I am trying to minimize a function of 5 variables, they will take a step in one direction for each variable to determine which way to go, then do a line search in that direction. Doing two function evaluations for each variable, one in each direction, doubles the evaluation time, of course.
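For reference, SciPy's built-in helper for that one-sided estimate is scipy.optimize.approx_fprime; the sketch below (the 5-variable cost function is just a placeholder) contrasts its n + 1 evaluations with the 2n that a symmetric version would need:

```python
import numpy as np
from scipy.optimize import approx_fprime

# placeholder cost function of 5 variables
def cost(x):
    return np.sum((x - np.arange(5))**2) + np.prod(np.cos(x))

x0 = np.zeros(5)
eps = np.sqrt(np.finfo(float).eps)

# forward (one-sided) difference: n + 1 = 6 cost evaluations
g_forward = approx_fprime(x0, cost, eps)

# symmetric (central) difference: 2n = 10 cost evaluations, but more accurate per step
g_central = np.array([(cost(x0 + eps * e) - cost(x0 - eps * e)) / (2 * eps)
                      for e in np.eye(5)])

print(g_forward)
print(g_central)
```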
AE512: I've been overlooking the fact that we can approximate a partial derivative just by finding the slope!
AA516: I didn't know that our gradient descent method was using the Newton quotient :)
AA516
AA516: Po
(x1^+x1)*sin(x2)? AE512