Numerically Calculating Partial Derivatives

  • Published 8 Feb 2025

COMMENTS • 38

  • @ChristopherLum
    @ChristopherLum  4 years ago +3

    In case it is helpful, I have all my 'Vector Differential Calculus' videos in a single playlist at ua-cam.com/play/PLxdnSsBqCrrF92Lr54CSJkSlpNWPTpkel.html. Please let me know what you think in the comments. Thanks for watching!

  • @milesrobertroane955
    @milesrobertroane955 11 months ago +1

    AA516: This is a good refresher on where some of these numerical schemes come from! I am taking a CFD class right now, so they are especially relevant

    • @ziongaming9093
      @ziongaming9093 9 months ago

      Same, let's hope we graduate heheh

  • @EomjiKim
    @EomjiKim 11 months ago +1

    AA516: This is what I really needed, since I haven't formally learned calculus before. This video was helpful for me to organize my thinking again about what the derivative is.

  • @timproby7624
    @timproby7624 8 months ago

    [AE 512] 19:16
    I always appreciate that you include the different terms or nomenclature that might be used to refer to equations or ideas.

  • @Gholdoian
    @Gholdoian 8 months ago +1

    AE 512: Wow, very interesting lecture! I was not aware of the symmetric derivative and have been using the Newton quotient for most of my work; I'm glad to now know this method.
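
    As a minimal sketch of the distinction mentioned above (an illustration in Python rather than code from the lecture), the following compares the one-sided Newton quotient with the symmetric (central) difference quotient on a test function with a known derivative; the symmetric version is typically far more accurate for the same step size.

        import math

        def f(x):
            # Test function with a known derivative: d/dx sin(x) = cos(x)
            return math.sin(x)

        def newton_quotient(f, x, dx):
            # One-sided (forward) difference quotient: error is O(dx)
            return (f(x + dx) - f(x)) / dx

        def symmetric_quotient(f, x, dx):
            # Symmetric (central) difference quotient: error is O(dx^2)
            return (f(x + dx) - f(x - dx)) / (2.0 * dx)

        x0, dx = 1.0, 1e-4
        exact = math.cos(x0)
        print("forward error:", abs(newton_quotient(f, x0, dx) - exact))
        print("central error:", abs(symmetric_quotient(f, x0, dx) - exact))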

  • @alviehaider4559
    @alviehaider4559 1 year ago

    AE 501: Partial derivatives are a complicated concept when it comes to understanding them from a numerical point of view. Thank you so much for making it easier for us.

  • @chayweaver.2995
    @chayweaver.2995 8 months ago

    AE512: I like these math-heavy videos. A nice breather from strictly engineering problems, and it lets you expand the mind a little differently.

  • @s.a.7950
    @s.a.7950 2 years ago

    I just realized that the time you uploaded this video was when COVID was rampaging across the globe. Hearing you cough while recording this indoors brings back so many memories. I hope you are doing well now. Your lecture was awesome, by the way.

  • @rowellcastro2683
    @rowellcastro2683 11 months ago

    AA516: This is interesting. I intuitively used the Newton quotient technique (without knowing it was even called the 'Newton quotient') to create my gradient function for the past homework problem to minimize a 2-dimensional cost function, using really, really small and identical perturbations for each dimension. I watched your gradient descent "Optimization 04" video to help me use my gradient function to minimize the cost function. However, I had to intuitively figure out how to create my gradient function using my past partial derivative knowledge. Although my gradient descent optimization was good and I was able to make it converge to a local minimum, I feel that I could have created a more efficient, more accurate, and overall much better gradient function had I watched this lecture before being presented with a problem similar to the past homework07 Problem 1 Part B. Because of that, though, I was able to skip most of this lecture video and go straight to key topics such as the symmetric difference quotient and the possible varying value of the perturbation sensitivity dX. Other than that, thanks for the lecture, Professor Lum.
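
    For readers working a similar minimization problem, here is a minimal sketch (my own illustration, not the homework solution or the professor's code) of a gradient built from the symmetric difference quotient and used inside a basic gradient-descent loop; the cost function, perturbation size dX, and step size alpha are arbitrary placeholders.

        import numpy as np

        def cost(x):
            # Placeholder 2-D cost function (not the homework problem)
            return (x[0] - 1.0) ** 2 + 2.0 * (x[1] + 0.5) ** 2

        def numerical_gradient(f, x, dx=1.0e-6):
            # Symmetric (central) difference quotient in each coordinate direction
            grad = np.zeros_like(x)
            for i in range(x.size):
                step = np.zeros_like(x)
                step[i] = dx
                grad[i] = (f(x + step) - f(x - step)) / (2.0 * dx)
            return grad

        x = np.array([3.0, 2.0])
        alpha = 0.1                                  # fixed step size for illustration
        for _ in range(200):
            x = x - alpha * numerical_gradient(cost, x)
        print("approximate minimizer:", x)           # should approach [1.0, -0.5]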

  • @petermay6090
    @petermay6090 11 months ago

    AA516: A good refresher! I actually ended up using this in the previous homework haha

  • @bsgove
    @bsgove 8 months ago

    AE512: Numerical methods for solving PDEs have really augmented humanity's ability to predict and understand complex systems.

  • @yaffetbedru6612
    @yaffetbedru6612 11 months ago

    AA516: Seeing how perturbations can play into partial derivatives and the Newton quotient was super exciting. I can see how this can tie into control inputs when building a Simulink model like we've been doing.

  • @lourdjohnjoaquin1559
    @lourdjohnjoaquin1559 4 years ago

    Thank you for this video. I've been trying to find a numerical approach for the gradient of my error function, and I saw this. Nice explanation.

  • @romeobarakeh4286
    @romeobarakeh4286 2 years ago +1

    Great information! Thank you!

    • @ChristopherLum
      @ChristopherLum  2 years ago +1

      Hi Romeo,
      Thanks for the kind words, I'm glad you enjoyed the video. If you find these videos helpful, I hope you'll consider supporting the channel via Patreon at www.patreon.com/christopherwlum or via the 'Thanks' button underneath the video. Given your interest in this topic, I'd love to have you as a Patron, as I'm able to talk/interact personally with all Patrons. Thanks for watching!
      -Chris

  • @edwardmau5877
    @edwardmau5877 8 months ago

    AE 512: Great refresher.

  • @milesbridges3547
    @milesbridges3547 1 year ago +2

    AA 516: I found it interesting that when numerically calculating the gradient, the selection of the perturbation distances depends on the behavior of the function, and the perturbations do not necessarily have to be the same for different variables.
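
    One common way to act on that observation (an assumption on my part, not something stated in the video) is to scale each perturbation to the magnitude of its own variable, as in the sketch below.

        import numpy as np

        def gradient_scaled_steps(f, x, rel_eps=1e-6):
            # Use a different perturbation for each variable, scaled to its
            # magnitude, so variables of very different size are treated consistently.
            grad = np.zeros_like(x)
            for i in range(x.size):
                dx_i = rel_eps * max(1.0, abs(x[i]))   # per-variable step size
                step = np.zeros_like(x)
                step[i] = dx_i
                grad[i] = (f(x + step) - f(x - step)) / (2.0 * dx_i)
            return grad

        # Example with variables of very different scales
        f = lambda x: x[0] ** 2 + np.sin(x[1] / 1.0e4)
        print(gradient_scaled_steps(f, np.array([2.0, 3.0e4])))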

  • @mizuniverseutube67
    @mizuniverseutube67 3 years ago

    Thank you for this clip; it makes me more interested in control theory.

  • @SayedTorak
    @SayedTorak 8 months ago

    AE 512: Good refresher!

  • @davidtelgen8114
    @davidtelgen8114 8 months ago +1

    AE 512: The plots really helped with visualizing the step-size reasoning.

  • @boeing797screamliner
    @boeing797screamliner 4 years ago +2

    AA516 - Fantastic lecture, this was very straightforward

  • @Colin_Baxter_UW
    @Colin_Baxter_UW 11 months ago

    AA516: I liked the plots that described why the dx magnitude can make a big difference.
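
    To reproduce the effect those plots illustrate (using a test function of my own choosing, not the one from the video), the sketch below sweeps dx over several decades and prints the error of the symmetric difference estimate; the error first shrinks as dx decreases (truncation error) and then grows again as floating-point round-off dominates.

        import math

        f = math.exp          # test function; d/dx exp(x) = exp(x)
        x0 = 1.0
        exact = math.exp(x0)

        for k in range(1, 13):
            dx = 10.0 ** (-k)
            approx = (f(x0 + dx) - f(x0 - dx)) / (2.0 * dx)
            print(f"dx = 1e-{k:02d}   error = {abs(approx - exact):.3e}")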

  • @WalkingDeaDJ
    @WalkingDeaDJ 8 months ago

    Jason-AE512: The video helped me solve one problem in the homework. Thanks, Professor.

  • @pvasi5281
    @pvasi5281 3 years ago

    Thank you!

  • @farisandjamilahalsubaie828
    @farisandjamilahalsubaie828 4 years ago +1

    Thank you, very nice explanation.

    • @ChristopherLum
      @ChristopherLum  4 years ago

      I'm glad it was helpful. There are other similar videos on the channel; please feel free to check them out and let me know what you think in the comments. Thanks for watching!

  • @RudolfKlusal
    @RudolfKlusal 4 years ago +1

    Your coughing sounds bad, dude. But very nice explanation; you've got a new subscriber in me ;) And btw, you've got x1^2 + x2 in the function of two variables after the differentiation; it should be x1.

  • @manjunathd9324
    @manjunathd9324 4 years ago

    Thanks for the video. Very neatly explained. I have a question: what if I have experimental data for F(x, y) and don't have an analytical expression? Can you shed some light on this?
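
    As a hedged pointer rather than an answer from the author: if F(x, y) is only available as samples on a regular grid, the same difference quotients can be applied directly to the tabulated values, for example with numpy.gradient as sketched below.

        import numpy as np

        # Stand-in for experimental samples of F(x, y) on a regular grid
        x = np.linspace(0.0, 2.0, 21)            # spacing 0.1
        y = np.linspace(0.0, 1.0, 11)            # spacing 0.1
        X, Y = np.meshgrid(x, y, indexing="ij")
        F = X ** 2 * np.sin(Y)                   # pretend these are measurements

        # Central differences in the interior, one-sided at the boundaries
        dF_dx, dF_dy = np.gradient(F, x, y)

        print(dF_dx[10, 5])                      # numerical dF/dx at x = 1.0, y = 0.5
        print(2.0 * 1.0 * np.sin(0.5))           # analytic value for comparison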

  • @AlexandraSurprise
    @AlexandraSurprise 11 months ago

    AA516: Allie S

  • @ojasvikamboj6083
    @ojasvikamboj6083 1 year ago

    A A 516: Ojasvi Kamboj

  • @farisandjamilahalsubaie828
    @farisandjamilahalsubaie828 4 years ago

    The gradient of x1 at x0 equals 0.75; I really don't know where the location of it is?

  • @pnachtwey
    @pnachtwey 2 years ago

    This is good in theory, but I know that the libraries I use only take steps in one direction to find the Jacobian. I use Mathcad and Python. Mathcad only has a few ways to minimize a function; Python's scipy has more. I know that if I am trying to minimize a function of 5 variables, they will take a step in one direction for each variable to determine which way to go, then do a line search in that direction. Doing two function evaluations for each variable, one in each direction, doubles the evaluation time, duh.
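
    To make that cost argument concrete (my own accounting, not tied to Mathcad or to any particular scipy routine): a one-sided gradient of an n-variable function needs roughly n + 1 function evaluations per point, while a symmetric gradient needs 2n, so the two-sided scheme roughly doubles the cost in exchange for better accuracy, as the sketch below counts.

        import numpy as np

        calls = 0
        def f(x):
            global calls
            calls += 1                     # count every function evaluation
            return np.sum(x ** 2) + np.prod(np.sin(x))

        def forward_gradient(f, x, dx=1e-6):
            f0 = f(x)                      # one evaluation shared by all components
            g = np.zeros_like(x)
            for i in range(x.size):
                step = np.zeros_like(x)
                step[i] = dx
                g[i] = (f(x + step) - f0) / dx    # one extra evaluation per variable
            return g

        def central_gradient(f, x, dx=1e-6):
            g = np.zeros_like(x)
            for i in range(x.size):
                step = np.zeros_like(x)
                step[i] = dx
                g[i] = (f(x + step) - f(x - step)) / (2.0 * dx)   # two per variable
            return g

        x = np.ones(5)
        calls = 0; forward_gradient(f, x); print("forward:", calls, "evaluations")  # n + 1 = 6
        calls = 0; central_gradient(f, x); print("central:", calls, "evaluations")  # 2n = 10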

  • @aimeepak717
    @aimeepak717 8 months ago

    AE512: I've been overlooking the fact that we can approximate a partial derivative by finding the slope!

  • @princekeoki4603
    @princekeoki4603 11 months ago

    AA516: I didn't know that our gradient descent method used the Newton quotient :)

  • @daniellerogers5959
    @daniellerogers5959 1 year ago

    AA516

  • @Po-ChihHuang
    @Po-ChihHuang 11 months ago

    AA516: Po

  • @donabien-aime2324
    @donabien-aime2324 2 years ago

    (x1^+x1)*sin(x2)? AE512