Examples: Using Derivatives of β0 & β1 to Minimize MSE | Linear Regression #5 | Machine Learning

  • Published 22 Aug 2024
  • In this video I've discussed gradient descent.
    The Chain Rule | Nancy Pi : • The Chain Rule... How?...
    Other parts
    Part 1 : • Linear Regression for ...
    Part 2 : • Find the Best Fit Line...
    Part 3 : • Gradient Descent | Lin...
    Part 4 : • Using Derivatives of ß...
    Gradient descent is an iterative optimization algorithm for finding the minimum of a function.
    Talked about how to use the derivatives with respect to the betas (β0 and β1) to minimize the cost function.
    Using calculus to minimize the cost function via its derivatives.
    X : / anandrishv
    Join the Discord for any doubt and you'll find more enthusiast and amazing Like Minded people there
    Discord : / discord
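
    The idea described above can be sketched in code. This is a minimal illustration, not the video's own implementation; the learning rate, step count, and sample data are assumed for the example. It minimizes the MSE of a line y ≈ β0 + β1·x by repeatedly stepping both coefficients against the partial derivatives of the cost.

    ```python
    # Gradient descent for simple linear regression y ≈ b0 + b1 * x,
    # minimizing MSE = (1/n) * sum((y - (b0 + b1 * x))^2).
    def fit(xs, ys, lr=0.01, steps=10_000):
        n = len(xs)
        b0, b1 = 0.0, 0.0  # start from an arbitrary initial line
        for _ in range(steps):
            # Residuals of the current line.
            errs = [y - (b0 + b1 * x) for x, y in zip(xs, ys)]
            # Partial derivatives of MSE with respect to b0 and b1.
            grad_b0 = -2.0 / n * sum(errs)
            grad_b1 = -2.0 / n * sum(e * x for e, x in zip(errs, xs))
            # Step each coefficient against its gradient.
            b0 -= lr * grad_b0
            b1 -= lr * grad_b1
        return b0, b1

    xs = list(range(10))
    ys = [2 + 3 * x for x in xs]  # exact line: intercept 2, slope 3
    b0, b1 = fit(xs, ys)
    print(b0, b1)  # converges close to 2.0 and 3.0
    ```

    With a small enough learning rate, both coefficients settle near the values that make the residuals zero, which is exactly the minimum of the MSE surface.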
