MSE101 L7.2 Non-linear least squares minimisation

  • Published 5 Sep 2024

COMMENTS • 16

  • @SHONSL 3 years ago +1

    Good god, thank you for using proper mic and audio equipment. It makes following MUCH easier for us.

  • @zacharythatcher7328 4 years ago

    This is the first hint I've seen that the Hessian matrix (the matrix of second derivatives, representing the curvature of a multivariable cost function) is a way to judge the value of the step constant you should use when minimizing the cost function. Thanks for that.
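
A minimal numerical sketch of the idea in the comment above (mine, not from the lecture, using a made-up quadratic cost): plain gradient descent needs a hand-tuned step constant, whereas a Newton step uses the inverse Hessian to set the step size, and lands on the minimum of a quadratic cost in one move.

    import numpy as np

    # Assumed toy quadratic cost f(b) = 0.5 b^T H b - c^T b,
    # so the Hessian H and the gradient H b - c are known exactly.
    H = np.array([[3.0, 1.0],
                  [1.0, 2.0]])   # Hessian = curvature of the cost
    c = np.array([1.0, 1.0])
    grad = lambda b: H @ b - c

    b_gd = np.zeros(2)
    for _ in range(20):          # gradient descent with a hand-picked constant 0.1
        b_gd = b_gd - 0.1 * grad(b_gd)

    b_newton = np.zeros(2)
    b_newton = b_newton - np.linalg.solve(H, grad(b_newton))  # Hessian-scaled step

    print(b_gd)                  # close to the minimum after 20 steps
    print(b_newton)              # the exact minimum, H^{-1} c, in one step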

  • @UseLogicPlease123 8 years ago +1

    Many thanks for the video. Great presentation and all. It would be cool to see you build all the way up to the more complex, faster-converging solution methods.

  • @obewanjacobi 4 years ago

    Solid video, very helpful refresher for me

  • @absolute___zero 4 years ago +1

    What I can't understand in all these algorithms that turn the *error* into a *squared error* function is: why do that? By squaring the error you add another level of complexity, and then you have to differentiate the squared error to do the minimization. Why raise the degree and then immediately lower it again in the next step? Can't we just differentiate the error function in its original form? That seems like the obvious thing to do, and you would go straight to the minimum without any wiggling. And if you then took the second derivative of that, you would converge to the minimum even faster; it would be the equivalent of a third-order derivative on the squared-error function, so a second-order derivative on the error (not the squared error) would be a much more powerful method.

    • @pedroparamodelvalle6751 4 years ago +1

      The squared-error function is convex, hopefully positive definite, and also smoother. In simple terms, we want to penalize positive and negative errors equally. You could also use the absolute error, but that is definitely not a smooth function.

    • @zacharythatcher7328 4 years ago +1

      You have to start with the understanding that this calculation is actually computing the squared distance between the observations and the estimating function. The Pythagorean theorem gives you distance; this is just distance squared. And when you think about it, distance is really what you want to minimize. Simply adding non-squared errors no longer gives you an analogue of distance; it gives you some odd measure that may have an incorrect minimum, one that does not correspond to minimizing the distance, once you get into higher-dimensional space. Luckily, the distance formula generalizes to higher dimensions, so we know we are still minimizing distance even though we have no notion of what distance really looks like in 4-D.
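
A quick numerical sketch of the point debated in this thread (mine, not from the video; made-up data for a straight-line fit y = a*x): the signed-error sum is linear in the parameter, so it has no minimum at all, while the squared-error sum is convex and bottoms out at the least-squares slope.

    import numpy as np

    x = np.array([1.0, 2.0, 3.0, 4.0])
    y = np.array([2.1, 3.9, 6.2, 7.8])    # made-up data, roughly y = 2x

    def signed_error(a):                  # sum of raw residuals: linear in a
        return np.sum(y - a * x)

    def squared_error(a):                 # sum of squared residuals: convex in a
        return np.sum((y - a * x) ** 2)

    for a in [0.0, 2.0, 100.0]:
        print(a, signed_error(a), squared_error(a))
    # signed_error keeps decreasing as a grows, so "minimizing" it drives
    # a off to infinity; squared_error is smallest near the least-squares
    # slope a = sum(x*y)/sum(x*x) = 1.99.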

  • @apprentice2101 7 years ago +1

    Thank you so much!! You are really awesome at explaining math!

  • @skylogic 6 years ago +1

    Great lesson! Though it'd be nice if you didn't block the camera's view of the board while writing and explaining the equations :)

  • @jogomez1988a 3 years ago

    Could this same method be applied to a sigmoidal regression? Any book you could recommend?

  • @manushanmugam 8 years ago

    Very nice lecture. Thank you.

  • @salihuibrahim3853 6 years ago

    Hi, please, how can I fit my curve using the Levenberg-Marquardt method in the CurveExpert software?

  • @adrian-ng 7 years ago +1

    That is the nicest whiteboard I have ever seen

    • @DavidDyeIC 7 years ago

      Thanks Adrian! Looking forward to supporting learners with you on Coursera!

  • @miguelangeldiazsanchez5337 8 years ago

    Great!! Very useful!

  • @hsugoodman4223 5 years ago

    Really well explained.