Applied Optimization - Steepest Descent

  • Published 16 Nov 2024

COMMENTS • 48

  • @psomolethra • 5 years ago +25

    He said it

  • @gerardo8av • 4 years ago +13

    Wow!
    I was never good at maths. I’m a physician, and quite old, 55 years of age. And I could understand everything.
    Oh, and yes: it’s the COVID-19 lockdown motivation...
    Thank you so much. You can explain anything, I bet!

  • @KeesJansma7689 • 5 years ago +11

    I didn't understand a thing in my textbook, but this is really clear! Thank you sir

  • @venumadhavrallapalli • 2 years ago +1

    Guess a starting point, look in the search direction (Jacobian vector), and search along a 1-D variable; repeat. Very clear explanation, thank you.
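
The loop summarized in the comment above (guess a start, step along the negative gradient, minimize along that one line, repeat) can be sketched as follows. This is a minimal illustration, not the video's code: the quadratic test function, the tolerances, and the use of golden-section search for the 1-D step are my own assumptions.

```python
import numpy as np

def golden_section(phi, a, b, tol=1e-8):
    """Golden-section search for a minimum of phi on the bracket [a, b]."""
    r = (np.sqrt(5.0) - 1.0) / 2.0          # inverse golden ratio, ~0.618
    c, d = b - r * (b - a), a + r * (b - a)
    while b - a > tol:
        if phi(c) < phi(d):
            b, d = d, c                     # keep the left sub-bracket
            c = b - r * (b - a)
        else:
            a, c = c, d                     # keep the right sub-bracket
            d = a + r * (b - a)
    return (a + b) / 2.0

def steepest_descent(f, grad, x0, tol=1e-6, max_iter=200):
    """Repeat: evaluate the gradient, then line-search along its negative."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:         # gradient ~ 0: at a minimum
            break
        step = golden_section(lambda t: f(x - t * g), 0.0, 1.0)
        x = x - step * g
    return x

# Illustrative objective (not the video's): f(x, y) = (x-1)^2 + 10(y+2)^2
f = lambda v: (v[0] - 1.0) ** 2 + 10.0 * (v[1] + 2.0) ** 2
grad = lambda v: np.array([2.0 * (v[0] - 1.0), 20.0 * (v[1] + 2.0)])
x_min = steepest_descent(f, grad, x0=[0.0, 0.0])   # converges near (1, -2)
```

The bracket [0, 1] happens to be wide enough for this particular function; for a general objective, the interval for the 1-D search would itself need to be chosen or found by a bracketing step first.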

  • @quynhngadau5482 • 3 months ago

    Brilliant explanation, thank you!

  • @letpieau1660 • 4 years ago

    Love your lecturing style! Ty so much

  • @omercix • 4 years ago

    I love you, finally finished my assignment with your help :)

  • @robertgawlik2674 • 3 years ago

    I'm so glad I found this video. Thank you very much.

  • @Furzgranate666 • 6 years ago +3

    This is quality Education!

  • @mwont • 2 years ago

    Amazing explanation. Thank you.

  • @sanjayksau • 2 years ago +1

    Beautiful explanation. Is there any video using conjugate direction as well?

  • @krishnadas6832 • 3 years ago

    That was beautiful. Thank you very much.

    • @purdueMET • 3 years ago +1

      Wow, thanks :-) When I originally made this video, I thought it might be too specialized to get many views. I'm very pleased to have been wrong.

  • @yihengliu34 • 4 years ago

    Brilliant professor, thank you!

  • @paolaalvarado5352 • 3 years ago

    Thank you so much, your explanation was very clear!!!

  • @beyzabutun565 • 4 years ago

    This was very helpful. Thank you so much!

  • @tiborcamargo5732 • 6 years ago

    Such a great video, congratulations.

  • @OsmanNal • 4 years ago

    This was really good. Thank you!

  • @123XTSK • 2 years ago

    Excellent!

  • @forinternet9079 • 1 year ago

    Thank you.

  • @rmttbj • 5 years ago +4

    24:21. Could you provide some guidance (a link to an example would also be fine) as to how to reach d = 0.179 if we are to calculate the value manually? Thank you very much :)

    • @dailyenglishphrases461 • 3 years ago +3

      If you write out the f(d) function, you'll see that it depends only on d, and you can find its minimum with any one-dimensional search algorithm (bisection, golden-section search, Newton-Raphson, etc.), or it can be solved analytically. (Maybe you are no longer interested, but some others could be :) )
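
To make the reply's "solved analytically" route concrete: for a quadratic objective, the line-search function phi(d) = f(x - d*g) is itself a quadratic in d, so setting phi'(d) = 0 gives a closed-form step. The matrix A, vector b, and iterate x below are illustrative assumptions; this does not reproduce the video's d = 0.179, which comes from its own example function.

```python
import numpy as np

# Quadratic objective f(x) = 0.5 x^T A x - b^T x (A symmetric positive definite).
# Along the steepest-descent line x - d*g, phi(d) = f(x - d*g) is quadratic in d,
# and phi'(d) = 0 yields the exact step  d = (g^T g) / (g^T A g).
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])         # illustrative SPD matrix
b = np.array([1.0, 2.0])
x = np.array([0.0, 0.0])           # current iterate

g = A @ x - b                      # gradient of f at x
d = (g @ g) / (g @ (A @ g))        # closed-form minimizer of phi

# Sanity check: phi(d) is no larger than phi at nearby step lengths.
f = lambda v: 0.5 * v @ A @ v - b @ v
phi = lambda t: f(x - t * g)
assert phi(d) <= phi(d - 1e-4) and phi(d) <= phi(d + 1e-4)
```

For a non-quadratic objective there is no such closed form, which is why the 1-D search methods the reply names (bisection, golden-section, Newton-Raphson) are the general-purpose tools.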

  • @collinsdon-pedro1085 • 1 year ago

    Wow!!!! Thank you

  • @一只小香瓜 • 5 years ago

    Very clear. Thanks

  • @monicabrasow5402 • 5 years ago

    Great! You are a Genius!

  • @ZinzinsIA • 2 years ago

    Very interesting, but there are many things I don't understand. When considering the gradient, we usually consider it at a particular point.
    When we draw the gradient vector, is it a vector in the same space and coordinate system but originating from the point where we calculated the gradient, rather than from the origin of the coordinate space?
    And if I understand the gradient, like the derivative, as a slope giving the direction of biggest change, I don't get the intuition for why the gradient lying on this slope is oriented toward the direction of steepest ascent. Does it have anything to do with basis orientation/direction?
    I mean, when we draw a slope and say it's the slope of the derivative at a particular point, that doesn't tell us whether it is going up or down; the rate of change could point toward the decreasing side of the slope, so why do we say the gradient always points toward steepest ascent?
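
For what it's worth, the standard answer to the steepest-ascent question in the comment above goes through the directional derivative. For a unit vector u, the rate of change of f at x in direction u is

```latex
D_{\mathbf{u}} f(\mathbf{x})
  \;=\; \nabla f(\mathbf{x}) \cdot \mathbf{u}
  \;=\; \lVert \nabla f(\mathbf{x}) \rVert \, \cos\theta ,
```

where theta is the angle between u and the gradient. Since cos(theta) is maximized at theta = 0, the rate of increase is largest when u points along the gradient and most negative when u points against it, which is why the negative gradient is the steepest-descent direction. And yes: the gradient vector is conventionally drawn with its tail at the point where it was evaluated, not at the origin.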

  • @velagasohith949 • 2 years ago

    What is the basic difference between the steepest descent method and the Marquardt method?

  • @elnursalmanov7054 • 5 years ago

    Thank you very much for the video.

  • @robothegreatful • 5 years ago

    Great! Thank you!

  • @DouglasHPlumb • 6 years ago +1

    That "d" is what brought me here - so what are the methods to find it, other than solving another single-dimensional minimization problem?

    • @DouglasHPlumb • 6 years ago

      There is finding the orthogonal vector along the line...

  • @brandonrobertson6586 • 3 years ago

    "Boats Boats Boats" - Laura Bell Bundy

  • @rowdyghatkar • 5 years ago

    The video was great... But what year is this? Did anyone else get some '90s vibes 😀...

  • @minma02262 • 3 months ago

    Why are there so many circles?

  • @chinmaypatil9386 • 5 years ago

    Great!

  • @BSplitt • 6 years ago

    For these videos, can you please disable the clock in the background?

    • @WytseZ • 6 years ago +5

      I didn't notice it until I read your comment; now I can't even watch this video...

    • @9-mananshah741 • 5 years ago

      @@WytseZ 😂😂😂😂😂😂😂😂

  • @a.m.4654 • 6 years ago

    Thank you, you helped me a lot :D

  • @judkilolo • 4 years ago

    What is the unit of d?

  • @1995a1995z • 5 years ago +18

    Lol, sorry to bring this up, but it sounded like you said the N-word at 10:53.
    Great tutorial though, much appreciated.

  • @Krautmaster86 • 4 years ago

    I suggest putting a mic on your t-shirt =)