You are one of a kind, bro. The way you explain the intuition gets me excited every time.
I appreciate it!
My god, two weeks of lectures explained in one video. You're great, man.
Most inspiring video I've ever seen. I got two takeaways: transforming an unsolvable problem into an equivalent solvable one, and that the gradient is a good way to get there.
“You’re not going to be solving it by hand.”
*laughs then cries in graduate student*
😂😭
Wish this was the way it was explained in university. Liked and subbed
Thanks!
I really have to learn to try ideas and equations with simple examples. I was so afraid of Lagrange multipliers and the Lagrange equation and what they mean that I just dropped the topic. How lucky that I caught, out of the corner of my eye, that thumbnail on my recommendation list, with the characteristic Brianish drawing style and the word "Lagrangian" in the title. I knew before watching that you would help, as always. Gosh, you are a great educator, man.
Brian, can you do a summer school course for control engineers? I'll be the first one to attend if it's you talking about the intuition behind control!
Thanks Brian, I always look forward to new Tech Talks! Could you do a video on MPC? That would be awesome!
I appreciate it! MathWorks already has a Tech Talk series on MPC so I doubt I'll make one in the near future. ua-cam.com/play/PLn8PRpmsu08ozoeoXgxPSBKLyd4YEHww8.html. Perhaps one day when we revisit some of the older videos.
Great video. In the interest of being precise and thinking about what might trip up new learners, someone who's paying really close attention will find 2:45 confusing since you can't have " *thee* partial derivative with respect to both x_1 and x_2". Instead, the gradient is a vector of all of the partial derivativeS, plural, of f( *x* ), where the ith element of the gradient is the partial derivative of f with respect to the ith element of *x*.
Sorry for the pedantry, but from my own experience, the problem is that we often ask math students to pay close attention to exactly that kind of fine distinction in other contexts, so a description of the gradient that, taken literally, can't exist is likely to cause minor confusion for talented students.
That said, phenomenal video. This would be very useful for teaching someone who has only a knack for scalar calculus one of the most important ideas in multivariable calculus quite efficiently.
Thanks for the clarification. I appreciate hearing this type of feedback because it helps me change the way I present future videos. Cheers!
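For anyone who wants to see that definition concretely, here is a minimal Python sketch (the function f, the step size h, and the evaluation point are all made up for illustration, not taken from the video):

```python
import numpy as np

def grad(f, x, h=1e-6):
    """Numerical gradient: a vector of ALL the partial derivatives of f,
    one entry per element of x (central differences)."""
    x = np.asarray(x, dtype=float)
    g = np.empty_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)  # i-th partial derivative
    return g

# Made-up example: f(x) = x1^2 + 3*x2^2, so the exact gradient is [2*x1, 6*x2].
f = lambda x: x[0]**2 + 3 * x[1]**2
print(grad(f, [1.0, 2.0]))  # approximately [2., 12.]
```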
Had an undergrad professor so determined to stop cheaters that he only allowed scientific calculators, which didn't bother me until he expected us to do regression.
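For anyone wondering how that's even possible: simple linear regression reduces to five running sums a scientific calculator can tally. A quick Python check of the closed-form formulas, with made-up sample data:

```python
# Least-squares line y = a + b*x from the five running sums
# (n, Σx, Σy, Σxy, Σx²) -- exactly what a scientific calculator keeps.
xs = [1.0, 2.0, 3.0, 4.0]   # made-up sample data
ys = [2.1, 4.0, 6.2, 7.9]

n   = len(xs)
Sx  = sum(xs)
Sy  = sum(ys)
Sxy = sum(x * y for x, y in zip(xs, ys))
Sxx = sum(x * x for x in xs)

b = (n * Sxy - Sx * Sy) / (n * Sxx - Sx * Sx)  # slope
a = (Sy - b * Sx) / n                          # intercept
print(a, b)  # 0.15 and 1.96 for this data
```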
Nice video! Looking forward to the nonlinear constrained optimization part!
Hey, could you recommend any nonlinear constrained optimization videos?
Great as always! 🎉
Thanks!
Great teaching❤
Thanks!
Great video!
Thank you so much. This video was very helpful.
Glad it was helpful!
Super intuitive 😊❤
Glad you liked it.
Thanks for this great video! 6:56 - I am a bit confused about interpreting the gradient of the constraint, as it does not reflect the direction of maximum ascent of J(x) or c(x). So, how should I think about this?
Hello! It is pointing in the direction of maximum ascent of c(x). The black line is where c(x) = 0. Every combination of x1 and x2 below that black line gives a negative value, and every combination above it gives a positive value. Therefore, if you are standing on the black line and want to ascend the slope, you'd walk up and to the right to increase the value of c(x).
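A small sketch of the reply's point, assuming a made-up linear constraint c(x) = x1 + 2*x2 - 4 (not the one in the video): its gradient is perpendicular to the c(x) = 0 line and points toward the positive side.

```python
import numpy as np

# Made-up linear constraint: c(x) = x1 + 2*x2 - 4; c(x) = 0 is the "black line".
c = lambda x: x[0] + 2 * x[1] - 4
grad_c = np.array([1.0, 2.0])   # ∇c, constant because c is linear

p = np.array([2.0, 1.0])        # a point on the line: c(p) = 0
t = np.array([2.0, -1.0])       # a direction along the line

print(c(p))                     # 0.0  (on the constraint)
print(grad_c @ t)               # 0.0  (∇c is perpendicular to the line)
print(c(p + 0.1 * grad_c))      # 0.5  (positive side: walking along ∇c goes uphill)
print(c(p - 0.1 * grad_c))      # -0.5 (negative side, downhill)
```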
I am confused about the slopes obtained by differentiation. They are the slopes dz/dx_i, not their projections onto the x1-x2 plane, so I cannot understand how they can be parallel.
However, they would be parallel if the "projection" slope, i.e. dx_2/dx_1, were calculated and used. However, that is just 0 and was not used in the calculation.
This is very helpful
Glad you like it!
5:45 The visual illusion makes the dark line look curved.... XD
I love this
❤❤❤❤❤ 🎉
nice
Thanks for watching!
Can't see the video
It's working for me. What do you see?
@HansScharler I just see a black screen
Did you get it figured out?
how do you type with eyes closed? :O
The conclusion at 2:18 is not constructed correctly. Take x⁴ and you'll see that the second derivative at x = 0 also evaluates to 0, while we know the function has a minimum there. You ought to inspect an ε-neighborhood and conclude from that. In the video, f'' is an odd function, and that is what lets you conclude it's a saddle rather than an extremum.
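The x⁴ counterexample can be checked directly; a tiny Python sketch (the ε below is an arbitrary choice):

```python
# f(x) = x^4: the second derivative test is inconclusive at x = 0,
# yet x = 0 is a minimum, as the sign change of f' shows.
f_prime = lambda x: 4 * x**3          # f'(x)
f_double_prime = lambda x: 12 * x**2  # f''(x)

eps = 1e-3  # arbitrary small neighborhood
print(f_double_prime(0.0))                  # 0.0 -- the test says nothing
print(f_prime(-eps) < 0, f_prime(eps) > 0)  # True True -> f decreases then increases: minimum
# Contrast f(x) = x^3 from the video: f'(x) = 3x^2 >= 0 on both sides,
# so there is no sign change and no extremum -- a saddle.
```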