Gradient Boosting Regression Part 2 | Mathematics of Gradient Boosting

  • Published 20 Jan 2025

COMMENTS • 60

  • @yashshrivastava1612
    @yashshrivastava1612 2 years ago +27

    After learning from you, I have realized how so-called experienced teachers are teaching wrong material in big institutions.
    You're a lifeline for the drowning (doobte huye ka sahara).
    Thanks Nitish

  • @akashsaha6671
    @akashsaha6671 2 years ago +47

    So-called big youtubers and institutes of India have explained this topic completely wrong. I will not name him or them. Thank you for this explanation. I was stuck after reading about boosting on Wikipedia, but the way you explained it, now everything is clear. Simply awesome.
    One request: please create a video on XGBoost and explain how regularization works in XGBoost.

    • @ankitbiswas8380
      @ankitbiswas8380 2 years ago +20

      Khal-nayak (the villain) 🤣?

    • @Tusharchitrakar
      @Tusharchitrakar 1 year ago +1

      @@ankitbiswas8380 Oh brother 🤣 I'm so glad I came across this channel, because I first visited the channel of 'he who should not be named' and got so confused. Hail Nitesh Singh, brother!!

  • @taslima5007
    @taslima5007 9 months ago +3

    I wonder why you are so underrated! You deserve more hype and subscribers.

  • @AltafAnsari-tf9nl
    @AltafAnsari-tf9nl 1 year ago +8

    No one has taught boosting more simply than you did. Hats off to you and the amount of effort you have put into explaining with graphs. This is the first time I understood that additive modelling is a sum of multiple smaller functions.

  • @mgopalakrishnan3439
    @mgopalakrishnan3439 2 months ago +2

    I’m a great fan of your content, sir, and I truly appreciate the value it brings to learners worldwide in the fields of Machine Learning and Deep Learning. I have a small request: as your videos are watched by people from various backgrounds, it would be incredibly helpful if more of the content were delivered in English. This would make it easier for a broader audience, like myself, to follow along and fully benefit from your teachings. I hope that, from your upcoming series on PyTorch and Generative AI, we might see more content in English. Thank you very much for considering this suggestion!

  • @talibdaryabi9434
    @talibdaryabi9434 1 year ago +10

    Additive modeling is a statistical technique for modeling complex relationships between variables by breaking them down into a sum of simpler relationships. The idea behind additive modeling is to add up simple functions of the predictors to model the response, rather than attempting to model the response as a complicated function of the predictors.
    An example of additive modeling is modeling the relationship between temperature, rainfall, and crop yield. Instead of trying to model the relationship between these variables as a single, complex equation, an additive model would break it down into three separate, simple relationships: the relationship between temperature and crop yield, the relationship between rainfall and crop yield, and the relationship between temperature and rainfall. These separate relationships can then be added together to give a final model of the relationship between temperature, rainfall, and crop yield.
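
The additive-modeling idea in the comment above can be sketched in code. This is a minimal sketch; the toy data and the hand-rolled one-split "stump" learner are illustrative assumptions, not from the video. The model starts from the mean prediction and repeatedly adds a small function fitted to the current residuals:

```python
import numpy as np

def fit_stump(x, r):
    """Fit a depth-1 regression tree: one threshold on x, predicting the
    mean residual on each side of the split."""
    best_sse, best = np.inf, None
    for t in np.quantile(x, np.linspace(0.02, 0.98, 49)):
        left, right = r[x <= t], r[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best_sse:
            best_sse, best = sse, (t, left.mean(), right.mean())
    return best

def predict_stump(stump, x):
    t, lo, hi = stump
    return np.where(x <= t, lo, hi)

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 300)
y = np.sin(x) + 0.1 * rng.normal(size=300)   # toy "complex" response

# Additive model: start from the mean, then keep adding simple functions
# (stumps), each one fitted to the residuals left by the current sum.
pred = np.full_like(y, y.mean())
for _ in range(200):
    stump = fit_stump(x, y - pred)
    pred += 0.1 * predict_stump(stump, x)    # 0.1 = shrinkage / learning rate

print(np.mean((y - pred) ** 2))  # training MSE, well below Var(y)
```

The sum of 200 simple step functions ends up tracking the smooth sine curve, which is the "sum of multiple smaller functions" picture from the video.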

  • @ParthivShah
    @ParthivShah 9 months ago +2

    Thank You Sir.

  • @zakircurrentaffairs858
    @zakircurrentaffairs858 1 year ago +2

    As you said, for boosting we take the pseudo-residuals, which are y minus y-hat, not their squares. Why, then, do we use the least-squares loss function, which is the square of the difference between the actual and predicted values, when calculating the terminal leaf value in the decision tree?
    At the end, the values calculated for the terminal leaf by the formula are exactly the same as those of the coded decision tree, although I expected them to differ somewhat.
    What other loss functions could we use here?
    By the way, great job; you made the math behind every machine learning algorithm so easy to understand. Keep inspiring people.
    Thank you so much.

    • @siddheshkalgaonkar2752
      @siddheshkalgaonkar2752 1 day ago

      He clearly explained the reason for using the least-squares loss function over other loss functions.
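
A note on the loss-function question in this thread: with the squared-error loss L = ½(y − ŷ)², the pseudo-residual (the negative gradient of the loss with respect to the current prediction) works out to exactly y − ŷ, so no squaring appears in what the next tree fits. A small numeric check (the values 3.0 and 2.2 are illustrative):

```python
y_true, y_hat = 3.0, 2.2   # illustrative actual and predicted values

# Squared-error loss L = 0.5 * (y - y_hat)^2.
def loss(pred):
    return 0.5 * (y_true - pred) ** 2

# Central-difference estimate of dL/d(y_hat).
eps = 1e-6
grad = (loss(y_hat + eps) - loss(y_hat - eps)) / (2 * eps)

# The pseudo-residual is the NEGATIVE gradient, and for this loss it
# equals the plain residual y - y_hat.
print(-grad)            # approximately 0.8
print(y_true - y_hat)   # 0.8 (up to floating point)
```

With a different loss the pseudo-residual changes: for absolute error it is sign(y − ŷ), which is why choosing least squares is exactly what makes the trees fit plain residuals.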

  • @shivamnegi1616
    @shivamnegi1616 4 months ago +1

    At 47:28, the derivative of f0(x) with respect to γ (gamma) is not zero. Can you please check it again? f0(x) also depends on γ.

  • @unpluggedmelodiesfromharsh2049

    Great video. Decoding each step and making it easy to understand. Best explanation. Thank you very much for such amazing content on your channel🙌🙌

  • @shaelanderchauhan1963
    @shaelanderchauhan1963 2 years ago +17

    XGBoost regression and classification as the next topic please, and then LightGBM and CatBoost.

  • @somanathking4694
    @somanathking4694 1 year ago +1

    Sir, your work is awesome; I learnt almost all the machine learning algorithms from your channel. Thank you very much. If possible, please do XGBoost.

  • @sarveshkesharwani1876
    @sarveshkesharwani1876 4 days ago

    24:53 step 2. (a)
    41:54 step 2. (c)

  • @sohildoshi2655
    @sohildoshi2655 2 years ago +2

    Bro, you are great!! Superb explanation.

  • @sashank5328
    @sashank5328 9 months ago

    Very useful. Enjoying your videos. Great work

  • @pourabbhattacharjee2775
    @pourabbhattacharjee2775 4 months ago

    Is there no learning rate involved in the final output formula of gradient boosting?

  • @dushyantbarot1988
    @dushyantbarot1988 6 months ago

    Thanks for the video. Which book are you referring to in the video: ISLR or The Elements of Statistical Learning?

  • @KhairulMia-tr2jv
    @KhairulMia-tr2jv 3 months ago

    Can you please provide the research paper link?

  • @uddhavsangle2219
    @uddhavsangle2219 1 year ago +1

    Great work, sir 🤩

  • @KRChandan05
    @KRChandan05 2 years ago +2

    Please make a video on gradient boosting for classification too. Thanks!

  • @shaelanderchauhan1963
    @shaelanderchauhan1963 2 years ago +1

    Brother Very well explained!

  • @jitendrachouhan7024
    @jitendrachouhan7024 3 years ago +3

    Brother, I have created my full application, but the problem is that I can't create a recommendation system in my app. When users open the app, they always see the same posts. Please create a post recommendation system using tflite in Java or Kotlin 🙏

  • @SaifAbbas-c9p
    @SaifAbbas-c9p 3 months ago

    Very nice, sir. Thanks.

  • @chinmayasahoo5897
    @chinmayasahoo5897 2 years ago +2

    Sir, please upload a video on gradient boosting for classification data.

  • @SangitaBhakat-l4g
    @SangitaBhakat-l4g 7 months ago

    Very informative video.

  • @deepakkapoor5427
    @deepakkapoor5427 2 years ago +4

    Please add XGBoost and LightGBM (classification and regression).

  • @itsamankumar403
    @itsamankumar403 1 year ago +1

    Sir, thank you so much for the lecture. Can you provide the dataset used in the code?

  • @umair2haider
    @umair2haider 5 months ago

    Excellent

  • @Akshay-dn7ni
    @Akshay-dn7ni 1 year ago

    Brother, I can't find the video on gradient boosting classification.

  • @aiforeveryone
    @aiforeveryone 1 year ago

    Great job

  • @phen8318
    @phen8318 8 months ago

    Can anyone please tell me what exactly gamma is in point (c)? I am having trouble understanding it.

    • @Sam-nn3en
      @Sam-nn3en 2 months ago

      Gamma is y-hat (the predicted value).

    • @Sam-nn3en
      @Sam-nn3en 2 months ago

      If you understand step 1 properly, step 2(c) is similar.
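
For anyone else stuck on the same question: in step 2(c), γ is the constant output value of a leaf, chosen so that adding it to the previous model's prediction minimizes the loss over the training points in that leaf. A sketch of the derivation for squared-error loss, following the usual statement of the algorithm:

```latex
% Step 2(c): the output of leaf R_{jm} minimizes the loss added on top of f_{m-1}
\gamma_{jm} = \arg\min_{\gamma} \sum_{x_i \in R_{jm}} L\big(y_i,\; f_{m-1}(x_i) + \gamma\big)

% For L(y, \hat{y}) = \tfrac{1}{2}\,(y - \hat{y})^2, setting the derivative
% with respect to \gamma to zero gives
\sum_{x_i \in R_{jm}} \big(y_i - f_{m-1}(x_i) - \gamma\big) = 0
\quad\Longrightarrow\quad
\gamma_{jm} = \frac{1}{|R_{jm}|} \sum_{x_i \in R_{jm}} \big(y_i - f_{m-1}(x_i)\big)
```

So with squared-error loss each leaf simply outputs the mean of the residuals of the training points that fall into it.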

  • @bhushanbowlekar4539
    @bhushanbowlekar4539 1 year ago +2

    Correction: in the (d) update formula there should be a learning rate that we have to add.

    • @Abhishek-qw6ny
      @Abhishek-qw6ny 1 year ago

      I think the learning rate is used in classification.

    • @Sam-nn3en
      @Sam-nn3en 2 months ago

      @@Abhishek-qw6ny No, the learning rate is used when there is overfitting. It does not matter whether it is regression or classification.

    • @Sam-nn3en
      @Sam-nn3en 2 months ago

      He did not use a learning rate because the sample size (3) was just for explanation, but yes, I think you are right. We do need to use a learning rate.
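
On the learning-rate discussion in this thread: the step 2(d) update as usually written is f_m(x) = f_{m-1}(x) + Σ_j γ_jm · I(x ∈ R_jm), and practical implementations multiply the new tree's contribution by a shrinkage factor ν (for example, scikit-learn's GradientBoostingRegressor defaults to learning_rate=0.1). A tiny sketch with made-up numbers:

```python
import numpy as np

# Step 2(d) update with shrinkage: f_m(x) = f_{m-1}(x) + nu * gamma(x).
# The three target values below are made up for illustration.
y = np.array([2.0, 3.0, 5.0])
f_prev = np.full(3, y.mean())   # f_0: the initial constant prediction
gamma = y - f_prev              # leaf outputs = residuals (squared loss, one point per leaf)

for nu in (1.0, 0.1):
    f_new = f_prev + nu * gamma
    print(nu, f_new)

# With nu = 1 the model reaches the targets in a single step;
# with nu = 0.1 it moves only 10% of the way, leaving work for later trees.
```

This is why the formula without ν reproduces the targets exactly on a tiny example, while real implementations shrink each step to control overfitting.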

  • @sayantandas8739
    @sayantandas8739 4 months ago

    Please disclose the name of the book the algorithm is taken from.

    • @semifarzeon
      @semifarzeon 3 months ago +1

      Statistical Learning

    • @semifarzeon
      @semifarzeon 3 months ago +1

      Statistical Learning, by Stanford. Please don't dive too deep into this book; it's a researcher-level book with advanced mathematical notation and concepts. You'll get overwhelmed as a beginner, just like me. ✨

  • @thatsfantastic313
    @thatsfantastic313 2 years ago

    Sir, please also make videos on association rules and the remaining unsupervised and reinforcement learning topics.

  • @sumitnautiyal9934
    @sumitnautiyal9934 3 years ago +1

    Sir, do you actually remember the formulas? I only remember how the algorithms work, not their formulas. Is that okay or not?

  • @SalmanShaikh-jy2ee
    @SalmanShaikh-jy2ee 3 years ago +2

    Sir, please make a video on SVM and Naive Bayes.

    • @campusx-official
      @campusx-official  3 years ago +1

      Naive Bayes: ua-cam.com/play/PLKnIA16_RmvZ67wQaHoBuzXaDAfPz-a6l.html

  • @sandipansarkar9211
    @sandipansarkar9211 2 years ago

    Finished watching.

  • @vrushaliramesh7328
    @vrushaliramesh7328 2 years ago +1

    Please make a video on XGBoost & AdaBoost.

    • @campusx-official
      @campusx-official  2 years ago +3

      Adaboost (Updated): ua-cam.com/play/PLKnIA16_RmvZxriy68dPZhorB8LXP1PY6.html

  • @bangarrajumuppidu8354
    @bangarrajumuppidu8354 3 years ago

    Sir, please upload a video on the gradient boosting classifier.

  • @core4032
    @core4032 2 years ago

    Sir, is the classification part still remaining?

  • @akshaybaraskar6735
    @akshaybaraskar6735 2 years ago

    Please upload a video on XGBoost.

  • @subhanjalpant8824
    @subhanjalpant8824 2 months ago

    25:00

  • @siddharththorat8281
    @siddharththorat8281 3 years ago

    Please upload a video on XGBoost.

  • @subhanjalpant8824
    @subhanjalpant8824 2 months ago

    Where is the dataset?
    Please don't forget the dataset; it wastes time to have to create one.

  • @mrityunjayupadhyay7332
    @mrityunjayupadhyay7332 2 years ago

    Explicit Explanation

  • @zainabad5081
    @zainabad5081 4 months ago

    So gradient boosting math is nothing but adding and subtracting columns, but the process is written in hard mathematical language by some guys 🤣

    • @zainabad5081
      @zainabad5081 4 months ago +1

      Also taking some derivatives w.r.t. the predicted value.