After learning from you, I have realized how so-called experienced teachers are teaching the wrong stuff at big institutions.
You're a lifeline for those of us who were drowning.
Thanks, Nitish.
So-called big YouTubers and institutes in India have explained this topic completely wrong. I will not name him or them. Thank you for this explanation. I was stuck after reading about boosting on Wikipedia, but the way you explained it, now everything is clear. Simply awesome.
One request: please create a video on XGBoost and explain how regularization works in XGBoost.
Khal-naik (the 'villain') 🤣?
@@ankitbiswas8380 Brother! 🤣 I'm so glad I came across this channel, because I first visited the channel of 'he who should not be named' and I got so confused. Hail Nitesh Singh, brother!!
I wonder why you are so underrated! You deserve more hype and subscribers.
No one has taught boosting more simply than you did. Hats off to you and the amount of effort you have put into explaining with graphs. This is the first time I understood that additive modelling is a sum of multiple smaller functions.
I’m a great fan of your content, sir, and I truly appreciate the value it brings to learners worldwide in the fields of Machine Learning and Deep Learning. I have a small request: as your videos are watched by people from various backgrounds, it would be incredibly helpful if more of the content were delivered in English. This would make it easier for a broader audience, like myself, to follow along and fully benefit from your teachings. I hope that, from your upcoming series on PyTorch and Generative AI, we might see more content in English. Thank you very much for considering this suggestion!
Additive modeling is a statistical technique for modeling complex relationships between variables by breaking them down into a sum of simpler relationships. The idea behind additive modeling is to add up simple functions of the predictors to model the response, rather than attempting to model the response as a complicated function of the predictors.
An example of additive modeling is modeling the relationship between temperature, rainfall, and crop yield. Instead of trying to model the relationship between these variables as a single, complex equation, an additive model would break it down into separate, simple relationships: the relationship between temperature and crop yield and the relationship between rainfall and crop yield. These separate relationships can then be added together to give a final model of crop yield as a function of temperature and rainfall.
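To make this concrete, here is a minimal illustrative sketch in Python. The data, functional forms, and coefficients are made-up assumptions purely for illustration; a real generalized additive model would learn the component functions from data.

```python
# Sketch of an additive model for the crop-yield example above.
# The response is modelled as a SUM of simple one-variable functions,
# not as one complicated joint function of (temperature, rainfall).
import numpy as np

rng = np.random.default_rng(0)
temperature = rng.uniform(10, 35, size=200)   # degrees C (made-up data)
rainfall = rng.uniform(0, 300, size=200)      # mm (made-up data)

def f_temp(t):
    # simple component: yield peaks around 25 degrees C
    return -0.05 * (t - 25) ** 2

def f_rain(r):
    # simple component: diminishing returns from extra rain
    return 2.0 * np.log1p(r)

# Additive structure: yield is approximately f_temp(temperature) + f_rain(rainfall)
crop_yield = f_temp(temperature) + f_rain(rainfall) + rng.normal(0, 0.5, size=200)
```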
Thank You Sir.
As said, for boosting we take pseudo-residuals, which are y minus y-hat, without squaring them. Then why do we use the least-squares loss function, which is the square of the difference between the actual value and the prediction, when calculating the terminal leaf value in the decision tree?
At the end, as we saw, the value calculated from the formula for the terminal leaf matches that of the coded decision tree; I would have expected it to differ by some amount, but in this case we got the exact same value.
What other loss functions could we use here?
By the way, great job. You made it so easy to understand the math behind every machine learning algorithm. Please keep inspiring people.
Thank you so much.
He has clearly mentioned the reason for using the least-squares loss function over other loss functions.
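For reference, here is the standard step 2(c) calculation under squared-error loss (this follows the usual gradient boosting formulation; the notation may differ slightly from the video):

```latex
\gamma_{jm} = \arg\min_{\gamma} \sum_{x_i \in R_{jm}} \bigl(y_i - F_{m-1}(x_i) - \gamma\bigr)^2,
\qquad
\frac{\partial}{\partial\gamma}:\; -2\sum_{x_i \in R_{jm}} \bigl(y_i - F_{m-1}(x_i) - \gamma\bigr) = 0
\;\Rightarrow\;
\gamma_{jm} = \frac{1}{|R_{jm}|} \sum_{x_i \in R_{jm}} \bigl(y_i - F_{m-1}(x_i)\bigr)
```

So under squared loss the leaf output is exactly the mean of the pseudo-residuals in that leaf, which is the same value a regression tree fitted to the residuals stores in its leaves; that is why the hand-calculated values and the coded tree match exactly. With a different loss, e.g. absolute error, the optimal leaf value would be the median of the residuals instead, and the two would generally differ.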
At 47:28, the derivative of f0(x) with respect to gamma is not zero, since f0(x) also depends on gamma. Can you please check it again?
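If I am reading that step correctly, the usual resolution is that f0(x) was fixed in step 1 (for squared loss it is just the mean of y); in step 2(c) we minimize over a new gamma, and F_{m-1}(x), which contains f0, is held constant during that minimization:

```latex
\gamma_m = \arg\min_{\gamma} \sum_{i=1}^{n} L\bigl(y_i,\; F_{m-1}(x_i) + \gamma\bigr),
\qquad
\frac{\partial F_{m-1}(x_i)}{\partial \gamma} = 0 \quad \text{since } F_{m-1} \text{ is already fixed.}
```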
Great video. Decoding each step and making it easy to understand. Best explanation. Thank you very much for such amazing content on your channel 🙌🙌
XGBoost regression and classification as the next topic please, and LightGBM and CatBoost after that.
Sir, your work is awesome; I learnt almost all the machine learning algorithms from your channel. Thank you very much. If possible, please do XGBoost.
24:53 Step 2(a)
41:54 Step 2(c)
Bro, you are great!! Superb explanation.
Very useful. Enjoying your videos. Great work
Is there no learning rate involved in the final output formula of Gradient Boosting?
Thanks for the video. Which book are you referring to in the video: ISLR or The Elements of Statistical Learning?
Can you please provide the research paper link?
great work sir 🤩
Please make a video on Gradient Boosting for classification too. Thanks!
Brother, very well explained!
Brother, I have created my full application, but the problem is that I can't create a recommendation system in my app. When users open the app, they always see the same posts. Please create a post recommendation system using TFLite in Java or Kotlin 🙏
Very nice, sir, thanks.
Sir, please upload a video on gradient boosting for classification data.
very informative video
Please add xgboost and lightgbm (classification and regression)
Sir, TYSM for the lecture. Can you provide the code and the dataset?
Excellent
Brother, I can't find the video on gradient boosting classification.
Great job
Can anyone please tell me what exactly gamma is in point (c)? I am having trouble understanding it.
Gamma is ŷᵢ (the predicted value).
If you understand step 1 properly, step 2(c) is similar.
Correction: in the step (d) update formula, there should be a learning rate that we have to add.
I think the learning rate is used in classification.
@@Abhishek-qw6ny No, the learning rate is used when there is overfitting. It does not matter whether it is regression or classification.
He did not use a learning rate because the sample size (3) was just for explanation, but yes, I think you are right; we do need to use a learning rate.
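For reference, the commonly used shrinkage form of the step (d) update adds a learning rate ν (this is, for example, the learning_rate parameter in scikit-learn's GradientBoostingRegressor); with ν = 1 it reduces to the plain update shown in the video:

```latex
F_m(x) = F_{m-1}(x) + \nu \sum_{j=1}^{J_m} \gamma_{jm}\, \mathbf{1}\!\left(x \in R_{jm}\right),
\qquad 0 < \nu \le 1 .
```

A smaller ν makes each tree contribute less, which typically requires more trees but reduces overfitting.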
Please disclose the name of the book from which the algorithm is taken.
Statistical Learning
Statistical Learning by Stanford. Please don't dive too deep into this book; it's a researcher-level book with advanced mathematical notation and concepts. You'll get overwhelmed as a beginner, just like me. ✨
Sir, please also make videos on association rule learning and the remaining unsupervised and reinforcement learning topics.
Sir, do you remember all the formulas? I only remember the working of the algorithms, not the formulas. Is that okay or not?
I can't remember them either, my friend.
Sir, please make a video on SVM and Naive Bayes.
Naive Bayes: ua-cam.com/play/PLKnIA16_RmvZ67wQaHoBuzXaDAfPz-a6l.html
finished watching
Please make a video on XGBoost & AdaBoost.
Adaboost (Updated): ua-cam.com/play/PLKnIA16_RmvZxriy68dPZhorB8LXP1PY6.html
Sir, please upload a video on the gradient boosting classifier.
Sir, is the classification part still pending?
Please upload a video on XGBoost.
25:00
Please upload a video on XGBoost.
Where is the dataset?
Please don't forget the dataset; it wastes time to create one.
Explicit Explanation
So the gradient boosting math is nothing but adding and subtracting columns, but this process is written in hard mathematical language by some guys 🤣
Also taking some derivatives w.r.t. the predicted value.
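In that spirit, here is a minimal from-scratch sketch of gradient boosting regression with squared-error loss. The toy data, tree depth, number of trees, and learning rate are made-up assumptions for illustration, not the exact setup from the video.

```python
# Gradient boosting regression, squared-error loss, from scratch.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(42)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.1, size=200)

n_estimators, learning_rate = 100, 0.1

# Step 1: initial prediction F0(x) = mean(y), the minimizer of squared loss.
F = np.full_like(y, y.mean())
trees = []

for m in range(n_estimators):
    # Step 2(a): pseudo-residuals = negative gradient of the loss w.r.t. F,
    # which for squared error is simply y - F (the "adding and subtracting").
    residuals = y - F
    # Steps 2(b)-(c): fit a small tree to the residuals; with squared loss
    # each leaf automatically stores the mean residual of that leaf.
    tree = DecisionTreeRegressor(max_depth=2)
    tree.fit(X, residuals)
    # Step 2(d): update the ensemble, shrunk by the learning rate.
    F += learning_rate * tree.predict(X)
    trees.append(tree)

def predict(X_new):
    pred = np.full(len(X_new), y.mean())
    for tree in trees:
        pred += learning_rate * tree.predict(X_new)
    return pred

print("Train MSE:", np.mean((y - predict(X)) ** 2))
```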