Steepest Descent Method

  • Published 23 Oct 2024

COMMENTS • 102

  • @DrHarishGarg
    @DrHarishGarg  3 years ago

    See the MATLAB Code of Steepest Descent Method (This theory lecture)
    ua-cam.com/video/JfREfGtFTLA/v-deo.html
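
    For anyone who wants to try the method before opening the linked video, here is a minimal numeric sketch of steepest descent with exact line search for a quadratic objective. The function, starting point, and tolerance below are assumptions (chosen to be consistent with the S1 = [-1; 1] values discussed in other comments); it is not Dr. Garg's own code.

        % Steepest descent with exact line search for an assumed quadratic objective.
        f    = @(X) X(1) - X(2) + 2*X(1)^2 + 2*X(1)*X(2) + X(2)^2;  % assumed example
        grad = @(X) [1 + 4*X(1) + 2*X(2); -1 + 2*X(1) + 2*X(2)];    % its gradient
        H    = [4 2; 2 2];                                          % constant Hessian
        X    = [0; 0];                                              % assumed starting point
        for k = 1:50
            S = -grad(X);                        % steepest-descent direction
            if norm(S) < 1e-6, break; end        % stop when the gradient is small
            lambda = (S.'*S) / (S.'*H*S);        % exact step length for a quadratic f
            X = X + lambda*S;                    % move to the next point
        end
        disp(X); disp(f(X));                     % approximate minimizer and minimum value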

  • @nishantbindhani9434
    @nishantbindhani9434 1 year ago +29

    10:55 S1 is [-1; 1]

  • @mdtarifraihan3404
    @mdtarifraihan3404 1 year ago +12

    Dear Dr. Garg, first of all, thanks for the video. I am a PhD student in the USA. I solved the steepest descent method the way you showed and got an acceptable result. But my professor did not give me the marks and is asking me for some PDFs, links, or proofs which support this formula, especially the formula you showed for getting lambda. I could not find a PDF on the internet supporting exactly the same formula. Can you please help me with this?
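
    A supporting derivation (standard for any quadratic objective; the notation is generic rather than taken from a particular textbook): with f(X) = (1/2) X^T H X + b^T X + c and search direction S_k = -∇f(X_k), minimizing φ(λ) = f(X_k + λ S_k) over λ gives

        \[
        \frac{d\varphi}{d\lambda} = \nabla f(X_k)^T S_k + \lambda\, S_k^T H S_k = 0
        \quad\Longrightarrow\quad
        \lambda_k = -\frac{\nabla f(X_k)^T S_k}{S_k^T H S_k} = \frac{S_k^T S_k}{S_k^T H S_k},
        \]

    where the last equality uses S_k = -∇f(X_k). This is the step-length formula used in the lecture for quadratic functions; for a non-quadratic f, λ is instead found by solving dφ/dλ = 0 directly, as Dr. Garg explains in a reply further down.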

  • @dinusebastian9384
    @dinusebastian9384 5 months ago +4

    S1 = [-1; 1] is right, but you put [1; 1].
    How does that come about!?

  • @MohammedAhsan11
    @MohammedAhsan11 10 months ago +3

    Thanks, final exam in 4 hrs. Very helpful ❤

  • @StreetandLeaf
    @StreetandLeaf 2 years ago +1

    The way you are explaining is amazing. Voice is soft.

  • @zjschrage
    @zjschrage 4 years ago +4

    Very clear video, the method is excellently explained, the logic is good and the example is also good.

  • @mehdisiyahi9849
    @mehdisiyahi9849 2 years ago +1

    hi, dear professor. your teaching is very eloquent and instructive. Thanks a lot.

  • @hat3lif3
    @hat3lif3 2 years ago

    While running this code, why am I getting this error?
    Error in ==> gradient at 59
    g = zeros(size(f),class(f)); % case of singleton dimension
    Error in ==> Untitled at 5
    grad = gradient(func);s
    Help needed....
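
    A likely cause (an assumption, since the full script is not shown): base MATLAB's gradient computes a numerical gradient of an array, so calling it on a function handle or symbolic expression in the wrong way produces errors like the one above. A minimal sketch of the symbolic route, assuming the Symbolic Math Toolbox and an illustrative objective:

        % Symbolic gradient of an assumed objective (not the script from the video).
        syms x1 x2
        func   = x1 - x2 + 2*x1^2 + 2*x1*x2 + x2^2;      % assumed example function
        grad   = gradient(func, [x1; x2]);               % symbolic 2x1 gradient
        gradAt = double(subs(grad, [x1; x2], [0; 0]))    % numeric gradient at (0, 0)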

  • @dr.k.anitha3283
    @dr.k.anitha3283 3 months ago

    Sir, I need the secant method for optimization problems. Kindly provide it.

  • @nawaab9275
    @nawaab9275 3 years ago +4

    Sir, in the formula for lambda, shouldn't the value of S1 have been [-1 1]?

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      Watch the Matlab code of this steepest descent method ... It is uploaded now

    • @hat3lif3
      @hat3lif3 2 years ago +1

      Yes..... even if you take [-1 1], the answer comes out the same.

  • @BeingAplomb
    @BeingAplomb 10 months ago +1

    Lambda 1 is 1; there was some problem with the transpose.

  • @hat3lif3
    @hat3lif3 2 years ago +1

    Can we find lambda for such a function using the same method: f(x, y) = (x^2 + y - 11)^2 + (x + y^2 - 7)^2? Since the H matrix is not numeric in this case....

    • @DrHarishGarg
      @DrHarishGarg  2 years ago +1

      No.... as this function is not quadratic... For such a function, find X1 = X0 + lambda*S and hence find f(X1). Then differentiate this f(X1) with respect to lambda, i.e., set df/d(lambda) = 0, and find lambda.... I hope it is clear now.
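
      A minimal symbolic sketch of that procedure for the function asked about above (it assumes the Symbolic Math Toolbox, and the starting point X0 is an assumption for illustration):

          % Exact line search for a non-quadratic f by solving d f(X1)/d lambda = 0.
          syms x y lam
          f   = (x^2 + y - 11)^2 + (x + y^2 - 7)^2;
          X0  = [0; 0];                                    % assumed starting point
          g   = gradient(f, [x; y]);                       % symbolic gradient
          S   = -double(subs(g, [x; y], X0));              % steepest-descent direction at X0
          X1  = X0 + lam*S;                                % next point in terms of lambda
          phi = subs(f, [x; y], X1);                       % f(X1) as a function of lambda only
          lams = double(solve(diff(phi, lam) == 0, lam));  % stationary points of phi
          lams = real(lams(abs(imag(lams)) < 1e-9 & real(lams) > 0));  % real, positive roots
          [~, idx]   = min(double(subs(phi, lam, lams)));  % pick the root that minimizes phi
          lambdaStar = lams(idx);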

    • @hat3lif3
      @hat3lif3 2 years ago

      @@DrHarishGarg I was not expecting such a quick reply.... thanks a lot, sir.... You are really doing a great job and making students' lives easy....... really appreciated......

    • @DrHarishGarg
      @DrHarishGarg  2 years ago

      My pleasure, always... Keep watching and sharing the videos with other students too, so that they can also learn easily... Thanks

  • @strangercomrade8709
    @strangercomrade8709 1 year ago

    Sir, about the lambda value you found in iteration 1: if you solve S1 transpose times S1 with the matrix above, shouldn't it come out to 2? Please check once and tell me whether it is right or wrong; by my calculation it is wrong. Apart from that, your explanation of the concept is absolutely superb.

  • @zjschrage
    @zjschrage 4 years ago +1

    I have one question: what happens if you wanted to use the next term in the Taylor series at 7:42? The gradient represents the first-order derivative and the Hessian the second order, but how would you do the third-order one? And what about the f(delta X)^n terms where n is larger than 2? When n is 2, for example, we did the transpose of x times x (with the Hessian in the middle, because otherwise the matrix multiplication wouldn't work), but how would a third x be multiplied?

    • @DrHarishGarg
      @DrHarishGarg  4 years ago +1

      For a quadratic function, the third term (corresponding to the third derivative) is always zero... However, for a non-quadratic function, you can find the value of the next point in terms of lambda and then find f(new point). Then take the derivative of that function with respect to lambda and find lambda (according to the condition for a maximum or minimum).
      I hope that clears it up for you.
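
      For reference, the expansion being discussed, written out to one more order than the video uses, is

          \[
          f(X + \Delta X) \approx f(X) + \nabla f(X)^T \Delta X + \tfrac{1}{2}\,\Delta X^T H(X)\,\Delta X
          + \tfrac{1}{6}\sum_{i,j,k} \frac{\partial^3 f}{\partial x_i\,\partial x_j\,\partial x_k}\,\Delta x_i\,\Delta x_j\,\Delta x_k + \cdots
          \]

      The third-order term is a contraction of a third-derivative tensor with three copies of ΔX rather than a single matrix product, and for a quadratic f every third (and higher) derivative is zero, so the second-order expansion is exact.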

    • @DrHarishGarg
      @DrHarishGarg  3 years ago +1

      See the quadratic form lecture... New lecture uploaded
      ua-cam.com/video/6jjTLDX_JOk/v-deo.html

    • @zjschrage
      @zjschrage 3 years ago

      @@DrHarishGarg Thanks!

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      Watch the Matlab code of this steepest descent method ... It is uploaded now

  • @ashiqueali4775
    @ashiqueali4775 1 year ago

    If in this question the step size is given as 0.5... what does it mean? Is it the value of lambda?

  • @hamzaabuabed7584
    @hamzaabuabed7584 2 years ago

    Thank you for your help

  • @suyan3093
    @suyan3093 3 years ago +1

    Thank you, sir! Many of the lectures are super helpful!

    • @DrHarishGarg
      @DrHarishGarg  3 years ago +1

      Glad to hear that.... Keep watching

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      Watch the Matlab code of this steepest descent method ... It is uploaded now

  • @muhammadsaqibshah7891
    @muhammadsaqibshah7891 3 years ago +1

    wow!
    if you want to understand the topic, listen to the end.

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      See the quadratic form lecture... New lecture uploaded
      ua-cam.com/video/6jjTLDX_JOk/v-deo.html

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      Watch the Matlab code of this steepest descent method ... It is uploaded now

  • @spickeris1632
    @spickeris1632 3 years ago

    Hello sir!
    Can I find the lambda value by using this: λ = argmin of f(Xi - λ*∇f(Xi)) over λ >= 0?

    • @DrHarishGarg
      @DrHarishGarg  3 years ago +1

      Yes.. you can.. but it will take a lot of computational steps/time
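
      That formulation is the exact line search written as a one-dimensional minimization. A minimal numeric sketch using base MATLAB's fminbnd (the objective, the point Xi, and the bracket [0, 10] are assumptions for illustration):

          % lambda_i = argmin over lambda in [0, 10] of f(Xi - lambda*gradf(Xi)).
          f     = @(X) X(1) - X(2) + 2*X(1)^2 + 2*X(1)*X(2) + X(2)^2;  % assumed objective
          gradf = @(X) [1 + 4*X(1) + 2*X(2); -1 + 2*X(1) + 2*X(2)];
          Xi    = [0; 0];                               % assumed current point
          g     = gradf(Xi);
          phi   = @(lam) f(Xi - lam*g);                 % one-dimensional function of lambda
          lambda = fminbnd(phi, 0, 10);                 % bounded 1-D minimization
          Xnext  = Xi - lambda*g;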

  • @chaiturockstar9394
    @chaiturockstar9394 3 years ago

    Great, thanks for the lecture

  • @mahendranandi5622
    @mahendranandi5622 3 years ago +1

    Extremely helpful. Thanks a lot, sir. And sir, here you used the analytical method (to determine lambda) and didn't use the other methods mentioned (like Newton, secant, etc., which are perhaps only used to calculate the optimum lambda). Are these methods called exact or inexact line search? I mean, I am confused about the methods.

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      Newton and secant methods are used to find an approximate value of lambda... Since this is a quadratic function, you can easily find the value of lambda using the analytical method to get the exact answer....
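
      To illustrate what such an iterative line search looks like, here is a minimal secant iteration on phi'(lambda) = 0, where phi(lambda) = f(X0 + lambda*S). It is only a sketch: the objective, starting point, and direction are assumptions, and phi' is approximated numerically.

          % Secant-type line search (illustrative; not code from the lectures).
          f   = @(X) X(1) - X(2) + 2*X(1)^2 + 2*X(1)*X(2) + X(2)^2;   % assumed objective
          X0  = [0; 0];
          S   = [-1; 1];                                  % steepest-descent direction at X0
          phi  = @(lam) f(X0 + lam*S);
          dphi = @(lam) (phi(lam + 1e-6) - phi(lam - 1e-6)) / 2e-6;   % numeric phi'
          l0 = 0;  l1 = 1;                                % two starting guesses
          for k = 1:50
              l2 = l1 - dphi(l1)*(l1 - l0) / (dphi(l1) - dphi(l0));   % secant update
              if abs(l2 - l1) < 1e-8, break; end
              l0 = l1;  l1 = l2;
          end
          lambdaStar = l2;   % converges to 1 here, matching the exact quadratic formula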

    • @mahendranandi5622
      @mahendranandi5622 3 years ago +1

      @@DrHarishGarg OK. As in the example you took a quadratic function, that's why you went for the exact value of lambda (and if we go with Newton, secant, etc., the inexact ones, we will get an approximate value of lambda, so we may need more iterations than 6; here we got the optimal value within at most 6 iterations).
      And sir, are those methods (Newton, secant, quasi-Newton) present in your playlist? I can't find them, though.
      Thanks, sir. With respect ❤️.

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      Yes they are also available.... See playlist "MATLAB code Numerical Methods"

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      See the quadratic form lecture... New lecture uploaded
      ua-cam.com/video/6jjTLDX_JOk/v-deo.html

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      Watch the Matlab code of this steepest descent method ... It is uploaded now

  • @grinfacelaxu
    @grinfacelaxu 4 months ago

    Thank you!

  • @ricardosubiabres
    @ricardosubiabres 3 years ago +1

    hello, in which book and chapter can I find the equations shown in the video? Thank you so much

    • @DrHarishGarg
      @DrHarishGarg  3 years ago +1

      Exactly, I don't know the book... because I prepared it from my experience in teaching.. but you can see the book link given in the description of the video.
      Thanks for watching

    • @ricardosubiabres
      @ricardosubiabres 3 years ago

      @@DrHarishGarg Thank you so much :)

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      Watch the Matlab code of this steepest descent method ... It is uploaded now

  • @prajjalmukherjee1695
    @prajjalmukherjee1695 11 months ago

    If the Hessian matrix contains x and y terms, what should I do?

    • @DrHarishGarg
      @DrHarishGarg  11 months ago +1

      Then substitute the values of x and y (the critical point values) into the Hessian matrix... This is already explained in the Hessian matrix lecture... you may watch that lecture too
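
      A minimal sketch of that substitution (it assumes the Symbolic Math Toolbox; the function and the point are purely illustrative):

          % Symbolic Hessian, then substitute a point to get a numeric matrix.
          syms x1 x2
          f = x1^3 + 2*x1*x2 + x2^4;              % non-quadratic, so H depends on x1, x2
          H = hessian(f, [x1 x2]);                % symbolic 2x2 Hessian
          Hnum = double(subs(H, [x1 x2], [1 2]))  % numeric Hessian at the point (1, 2)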

  • @sssskhan3
    @sssskhan3 4 years ago

    Thank you, sir, for this clear and concise video.

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      See the quadratic form lecture... New lecture uploaded
      ua-cam.com/video/6jjTLDX_JOk/v-deo.html

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      Watch the Matlab code of this steepest descent method ... It is uploaded now

  • @maturdarah7243
    @maturdarah7243 3 years ago

    Very elaborate video

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      Glad you like it!.... Keep watching

  • @arindamchak
    @arindamchak 4 years ago +7

    In S1^T . S1 (Step 2), the S1 value is wrong. It should be [-1; 1], not [1; 1].

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      See the quadratic form lecture... New lecture uploaded
      ua-cam.com/video/6jjTLDX_JOk/v-deo.html

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      Watch the Matlab code of this steepest descent method ... It is uploaded now

  • @jiyandogan1685
    @jiyandogan1685 3 years ago +1

    Thank you sir.

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      Welcome

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      Watch the Matlab code of this steepest descent method ... It is uploaded now

  • @abhishekchoudhary2294
    @abhishekchoudhary2294 3 years ago

    Which book is the content taken from??

  • @kumarjarajapu1605
    @kumarjarajapu1605 2 years ago

    Sir, I can't find the "Univariate method" and "Powell's method". Could you please drop the link?

    • @DrHarishGarg
      @DrHarishGarg  2 years ago

      Univariate methods are Golden section and Fibonacci search... Both are available... See the playlist
      NonLinear Programming Techniques: ua-cam.com/play/PLO-6jspot8AKg6Pov9fDHd3ys5_JlyUXv.html

    • @kumarjarajapu1605
      @kumarjarajapu1605 2 years ago

      @@DrHarishGarg Sir, I can't find Powell's method

    • @DrHarishGarg
      @DrHarishGarg  2 years ago +1

      Powell's method has not been explained to date.

    • @kumarjarajapu1605
      @kumarjarajapu1605 2 years ago

      @@DrHarishGarg oh ok thanks. Btw your lectures are awesome.👌

    • @DrHarishGarg
      @DrHarishGarg  2 years ago +1

      Thanks... Keep watching and sharing with others too

  • @akanksha0143
    @akanksha0143 4 years ago

    Thank you Sir. This lecture is very helpful.

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      You are most welcome

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      Watch the Matlab code of this steepest descent method ... It is uploaded now

  • @SalihAbi1
    @SalihAbi1 2 years ago

    Thank you! Very clear lecture.

  • @CodeNinjaByte
    @CodeNinjaByte 1 year ago

    Thank you for this:)

  • @cvismenu
    @cvismenu 4 years ago +1

    Thank you

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      See the quadratic form lecture... New lecture uploaded
      ua-cam.com/video/6jjTLDX_JOk/v-deo.html

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      Watch the Matlab code of this steepest descent method ... It is uploaded now

  • @jigarapatel762
    @jigarapatel762 2 years ago

    Sir, I am in the 2nd semester of M.Tech Production. Can you please provide the solutions for classical optimization?

  • @shubhambhasin1568
    @shubhambhasin1568 4 years ago

    Please make a video on the quasi-Newton method also.

    • @DrHarishGarg
      @DrHarishGarg  3 years ago +1

      See the quadratic form lecture... New lecture uploaded
      ua-cam.com/video/6jjTLDX_JOk/v-deo.html

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      Watch the Matlab code of this steepest descent method ... It is uploaded now

  • @raginigupta2319
    @raginigupta2319 2 years ago

    Sir, please share the answers to the practice questions so that we can check our answers. Regards

    • @DrHarishGarg
      @DrHarishGarg  2 years ago

      Sure..... I will... In the meantime, you can watch the MATLAB code of the steepest descent method and run the problems to verify your answers step by step...

  • @deepaksahoo5188
    @deepaksahoo5188 3 years ago

    Thank you, sir

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      My pleasure.... Keep watching other videos too

  • @ManzoorKhan-kk6qk
    @ManzoorKhan-kk6qk 3 years ago

    Great! Appreciated

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      Thanks .... My pleasure. Keep watching other content too and share with others.

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      Watch the Matlab code of this steepest descent method ... It is uploaded now

  • @gabbyf2906
    @gabbyf2906 2 years ago

    How do you compute the Hessian matrix in iteration 2?

    • @merigonmeri
      @merigonmeri 4 months ago

      Basically, just double-differentiate it:
      for d^2f/dx1^2 you differentiate with respect to x1 two times,
      for d^2f/dx2^2 with respect to x2 two times,
      for d^2f/dx1dx2 you differentiate first with respect to x1 and then x2,
      and for d^2f/dx2dx1 the reverse.
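
      In symbols, the Hessian being described is

          \[
          H(X) = \begin{bmatrix}
          \dfrac{\partial^2 f}{\partial x_1^2} & \dfrac{\partial^2 f}{\partial x_1 \partial x_2} \\[2ex]
          \dfrac{\partial^2 f}{\partial x_2 \partial x_1} & \dfrac{\partial^2 f}{\partial x_2^2}
          \end{bmatrix}.
          \]

      When f is quadratic (as the worked example appears to be, given the constant H used in the lambda formula), every second derivative is a constant, so the same numeric H is reused unchanged in iteration 2 and all later iterations; only for a non-quadratic f do its entries change from point to point.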

  • @StreetandLeaf
    @StreetandLeaf 2 years ago

    I have prepared for my exam. Now I am ready for the exam.

  • @nawaab9275
    @nawaab9275 3 years ago

    How did you get the gradient?

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      Partial derivatives of the function with respect to the variables
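
      In symbols, for a two-variable function the gradient used in each iteration is

          \[
          \nabla f(X) = \begin{bmatrix} \partial f/\partial x_1 \\ \partial f/\partial x_2 \end{bmatrix},
          \qquad S_k = -\nabla f(X_k).
          \]

      For example, assuming the lecture's function is f(x1, x2) = x1 - x2 + 2x1^2 + 2x1*x2 + x2^2 (an assumption consistent with the S1 = [-1; 1] quoted in other comments), ∇f = [1 + 4x1 + 2x2; -1 + 2x1 + 2x2], which at X0 = (0, 0) gives S1 = -∇f(X0) = [-1; 1].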

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      Watch the Matlab code of this steepest descent method ... It is uploaded now

  • @tanmaygarg3757
    @tanmaygarg3757 7 months ago

    You should watch kk sir ❤

  • @roshankarki2456
    @roshankarki2456 2 years ago

    ❤️❤️

  • @kelzangmpee8496
    @kelzangmpee8496 3 years ago

    🙏🙏🙏

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      See the quadratic form lecture... New lecture uploaded
      ua-cam.com/video/6jjTLDX_JOk/v-deo.html

    • @DrHarishGarg
      @DrHarishGarg  3 years ago

      Watch the Matlab code of this steepest descent method ... It is uploaded now

  • @serpilbozdag7225
    @serpilbozdag7225 2 years ago

    Is there no Turkish language option?