Such an underrated channel.
You don't realize how well you teach these topics.
Very well done.🔥🔥
Much much better than Krish Naik Sir.
Krish Naik is not at his level, man.
Thank you, man. My journey toward learning ML would have been so difficult without your playlist and videos. Thank you so much.
Believe me, your teaching style and points are really amazing. I follow some AI engineering channels, but this one is one of the best.
Could you please do 100 days of deep learning? It would be better for us. Thanks.
Sir, believe me, the intuition you give before explaining any technique makes the videos way more interesting and easy to follow for viewers. Quality content you are providing.
Your explanations are simplified and very easy to understand. The subject is made to look so simple in your video. Many thanks. Please continue your good work for the benefit of people/learners like me.
Sir you are amazing... Your way of explanation is of another level 👍🏻👍🏻
I’m a great fan of your content, sir, and I truly appreciate the value it brings to learners worldwide in the fields of Machine Learning and Deep Learning. I have a small request: as your videos are watched by people from various backgrounds, it would be incredibly helpful if more of the content were delivered in English. This would make it easier for a broader audience, like myself, to follow along and fully benefit from your teachings. I hope that, from your upcoming series on PyTorch and Generative AI, we might see more content in English. Thank you very much for considering this suggestion!
Thank you very much for this video. After exploring so many videos on UA-cam, I finally found one with a simple and easy explanation of gradient boosting. 🙏🏻
I don't know how I could express my gratitude for the hard work and the simplicity of your concept presentation ❤❤❤ You deserve more subscribers 💐💐💐
You teach such an important topic in the best and most accessible way; I couldn't find a better explanation for this on YouTube. Thanks, Nitish sir.
Amazing explanation!!! I regret that I found this video very late...
Keep up the good work and please continue to do so. Thank you so much for this wonderful content.
It's never too late, brother... unless you're on your deathbed... haha
You're a great teacher but too underrated 😢
Your lecture was far better than other lecturers'. I got some clear insights and was able to clear up my confusion.
You always teach in the best way. Lots of love from Pakistan.
You are the best teacher for machine learning. Thank you, sir 🌟✨🙏💛
A better explanation than any other YouTube channel.
Thank you for the amazing explanation, sir.
Please continue the series with boosting algorithms and then with deep learning series to sir(100 days of Neural Networks).
Thank you sir. It was a really good explanation. It helped me a lot and I got an 'A' grade in ML.😊
The explanation is really awesome!!!
Best Teacher ever!!
thank you mate for your valuable content. keep it up!
Wow! Great teaching skills. You explain clearly. Normally, people use animation and/or slides to explain these complex algorithms, while you use a whiteboard. Somehow, it is very appealing and easy to understand with the whiteboard. Overall, hats off to your teaching method and teaching skills. Thank you 🙃🙃
Thanks!
very clear and informative
Concepts are explained so clearly.
Thanks for the video. Explained everything very well.
What a beautiful explanation. Thank you, sir.
Excellent explanation. Yes, much better content than other channels.
Too great, dada…
bro these videos are just too awesome. thanks for such amazing content.
so glad i found this channel
Gem of a video. Your Python knowledge is great, sir.
Thank you so much sir🙏🙏🙏
thanks sir for your great explanation
Hats off !! So nice and crystal clear explanation
Hats off to you, sir.
Also, nice mehndi, sir!
Best explanation 👌
Really a great explanation of the topic. Would surely recommend this.
The audio quality makes it very hard to understand. Putting up blankets or something similar to dampen the reverb would help a lot.
Love you, sir!!! Such a beautiful explanation... thanks a lot.
Just wanted to point out a correction: the loss function used in gradient boosting is not (actual − predicted). The loss is actually the squared residual, 0.5 × (actual − predicted)². When we take its derivative with respect to the prediction, the negative gradient comes out to (actual − predicted), which is then learned by the next estimator.
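For anyone who wants to verify the claim above, here is a minimal symbolic sanity check (assuming SymPy is available; the symbol names are my own):

```python
import sympy as sp

# Symbols: actual target y and the model's current prediction y_hat
y, y_hat = sp.symbols('y y_hat')

# Squared-error loss used in gradient boosting for regression
loss = sp.Rational(1, 2) * (y - y_hat) ** 2

# Gradient of the loss with respect to the prediction
grad = sp.diff(loss, y_hat)

# The NEGATIVE gradient equals the residual (y - y_hat) exactly --
# this is why fitting the next tree to the residuals is the same as
# taking a gradient step on the squared-error loss.
assert sp.simplify(-grad - (y - y_hat)) == 0
print(sp.simplify(-grad))
```

The 0.5 factor is there precisely so the derivative has no stray factor of 2.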
Amazing explanation.... Keep it up...
Thanks Nitish.
Hi Nitish, thanks for your videos. They are very fundamental. Do you have videos on time series, demand forecasting, market basket analysis, recommendation, and sentiment analysis?
Great going, sir. Your consistency will make you the best at some point in your life.
Please post LightGBM and Catboost intuition + hands-on with hyperparameter optimization in the upcoming videos.
Will do
Yes sir please cover them ASAP.
You are the only person who explains each and every thing in a very clear way.
Keep Growing❤
The video is great. Only one confusion: in your practical implementation, where do we use the learning rate — with y_pred, with df['pred2'], or with both?
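Not the video author, but for anyone with the same confusion, here is a minimal manual-boosting sketch (made-up data, assuming scikit-learn) of the usual convention: the learning rate scales every tree's contribution, so it applies to every stage's prediction, not just one of them:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Synthetic regression data (purely illustrative)
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = X[:, 0] ** 2 + rng.normal(0, 0.3, size=200)

lr = 0.1
# Stage 0: start from the mean of the training target
pred = np.full_like(y, y.mean())

# Each new tree is fit to the CURRENT residuals, and its output is
# added to the running prediction scaled by the learning rate --
# the shrinkage hits every tree's contribution (pred1, pred2, ...).
for _ in range(50):
    residuals = y - pred
    tree = DecisionTreeRegressor(max_depth=2).fit(X, residuals)
    pred = pred + lr * tree.predict(X)

print(np.mean((y - pred) ** 2))  # training MSE shrinks as stages are added
```

So in the notation of the video's notebook, the learning rate would multiply each fitted tree's predictions at every stage, both when updating the running residuals and when making the final prediction.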
Very good explanation
Thank You Sir.
Good Explanation , Thanks a lot
Amazing explanation
Very nice video! One confusion though: you said the residual should ideally be zero. But wouldn't that be a case of overfitting? I understand what you are trying to say, but having a zero residual each time is also not desirable, isn't it? Is it better to say the residual should be as small as possible, instead of zero?
Yes
Great explanation, man.
Could you please make videos for classification using gradient boosting? Also Xgboost.
Hello Nitish ji, have you not uploaded any videos on gradient boosting for classification? Please send the links, as I was not able to find them.
Sir, please provide some ideas for applying ensemble techniques to interpretability methods such as CAM and its variants...
Thank you so much 👍😊
Sir, when will your NLP series be complete? I am waiting for the next videos of this (ML) series.
Could you please upload a video on Gradient Boost classifier. I have followed other videos but I'm confused about using log(odds) concepts.
Sir, please 🙏 upload videos on the XGBoost and DBSCAN algorithms.
Will do it after nlp series
@@campusx-official Sir, please 🙏 upload videos on the XGBoost and DBSCAN algorithms.
What is the number of observations in the root node of each decision tree when using Random Forest?
I have a serious question. Doesn't Gradient Boosting cause data leakage?
As the first model generates its output using information from the target, i.e., mean(Target), it will perform well regardless, right?
Please help non-Hindi students by creating an English course. You are the best; I would like to learn from you.
Feedback taken, Sodiq. Will do it sometime in the future.
Can you please make videos on how to get shortlisted at data science companies, and how to make an end-to-end data science project?
3:12 What graph is that on your hand?
You trained the decision tree on a laptop; in an exam, how will we build the decision tree for a gradient boosting problem?
Excellent...
Sir, the classification version of this topic is still pending.
Sir please upload the video on xgboost
Sir, please do the sports celebrity project... I am waiting for it, sir.
nice video
Bro, please make a crash course on ML.
Or just share the list of topics I should learn so that I can build the projects from this channel!
You can checkout this playlist - 100 Days of Machine Learning: ua-cam.com/play/PLKnIA16_Rmvbr7zKYQuBfsVkjoLcJgxHH.html
@@campusx-official Bro, this is too huge to follow. I hope you can relate!
Please share a list of the topics that are used repeatedly in projects.
@@inspired_enough_to_grow_up will trim it down then
@@campusx-official thanks!
Sir, when is the XGBoost video coming?
How many models can we add?
Sir, the video is very good, but as a non-Hindi student it is difficult to follow, so in the future please make the videos completely in English.
Nitish, why is there no XGBoost video? 😶 I don't feel like learning it from anyone else :)
awesome
Sir, please upload the XGBoost video.
You said you would upload 7 videos; please upload them.
finished watching
Sir, please create a Gen AI course on Udemy.
Thank you sir
Why have you generated 500 values of x (x_test)? You could have used df['x'] instead. Why this manipulation in the code?
Did you get the answer?
I got the same doubt.
sound problem
thank you 3000
👌👌👌
👍
Just like Alakh Pandey for JEE, now I have another teacher for ML with a whiteboard and marker 🤖🤖
If it's in English, everyone can access it.
This was very good . Thanks !!
waiting for your 100K
Thank you sir
Welcome