If you want to learn more about the bias-variance trade off (an important aspect for both bagging and boosting methods), make sure to check out this video: ua-cam.com/video/5mbX6ITznHk/v-deo.html
Neat explanation in 4 mins. Keep making small and informative videos like these :)
Thank you so much for your feedback and I am happy you enjoyed it! I will definitely keep making such videos. :)
Crisp visualization & explanation. Loved it!
Thank you! Glad to hear that you liked the explanation! :)
Watching this one day before exams! Neatly explained.
Thanks! Best of luck on your exams! :)
I’m learning ML and these videos are fantastic
Thanks! Glad you think so! :)
Just stumbled across your video, I am looking forward to watching all your videos 😄
I hope you enjoy my content on this channel. Please let me know if you have any feedback on my videos! :)
Great video. My understanding is that you would almost always use bagging first, evaluate the results and, if they're good enough, stop there. However, you COULD go on to try various boosting methods to see if the model improves even more, but at what cost? If the best boosted model (AdaBoost, XGBoost, etc.) performed 1% better but took 3x longer to compute, then boosting on top of the already-bagged models might not be worth it, right? Still trying to cement in my mind the process flow from a developer's standpoint 😉
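One way to settle the "is the boost worth the cost?" question empirically is to time both families of models on the same split. A minimal sketch, assuming scikit-learn and a synthetic dataset (the models and hyperparameters here are placeholders, not recommendations):

```python
# Compare a bagged model against a boosted one on both accuracy and
# wall-clock training time, so the "1% better but 3x slower" trade-off
# can be measured instead of guessed. Data and settings are illustrative.
import time
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier, GradientBoostingClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "bagging": BaggingClassifier(n_estimators=100, random_state=0),
    "boosting": GradientBoostingClassifier(n_estimators=100, random_state=0),
}

for name, model in models.items():
    start = time.perf_counter()
    model.fit(X_train, y_train)          # train on the same split
    elapsed = time.perf_counter() - start
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: accuracy={acc:.3f}, train time={elapsed:.2f}s")
```

If the boosted model's accuracy gain is marginal while its training time is several times higher, that printout makes the trade-off concrete.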
This was dope. The color-coding kinda threw me off, but the overall explanation was nice and concise.
Thanks! Glad you enjoyed it! :)
great video, loved the pictures. a caveman like myself loves the pictures. thank you lol
Sweet thanks! I am super happy you enjoyed the explanation! :)
simple and easy to understand, nice
Thanks! Glad you found it simple and easy to follow! :)
Very good explanation :)
Thanks! Glad you liked it! :)
It was clear. Thanks
You are welcome! Glad you liked it! :)
thanks for the easy explanation, brother
Always welcome! Happy you enjoyed the explanation! :)
Awesome video! Keep going!
Many thanks!!
Great explanation! Can you share the slides?
Thanks! Unfortunately I don't have the slides anymore. Changed my laptop and forgot to transfer them. :(
awesome video
Thanks! Glad you liked it! :)
Let me "boost" this video by making a comment
Many thanks! I hope you also enjoyed the video! :)
No
thank you very much
You're welcome! :)
Good Video.
Thanks! Glad you enjoyed it! :)
Do you recommend any Python libraries for implementing a hybrid ML model that uses boosting for the meta-model? 👏🏼
Thanks for the question! Unfortunately, I don't know of any Python libraries built specifically for this. :(
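One possibility might be scikit-learn's stacking API, where the meta-model (the final_estimator) can be a boosting algorithm. A rough sketch, with base models and parameters chosen purely for illustration:

```python
# Sketch of a hybrid ensemble: stacked base models with a boosted meta-model.
# Base estimators and hyperparameters are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.ensemble import (GradientBoostingClassifier, RandomForestClassifier,
                              StackingClassifier)
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

stack = StackingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=50, random_state=0)),
        ("lr", LogisticRegression(max_iter=1000)),
    ],
    final_estimator=GradientBoostingClassifier(random_state=0),  # boosting as meta-model
    cv=5,  # base-model predictions come from cross-validation folds
)
stack.fit(X, y)
print(stack.score(X, y))
```

Here StackingClassifier trains the base models with cross-validation and fits the boosted meta-model on their out-of-fold predictions.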
Anyone got the answer to the WHYs?
Boosting uses the whole set of data?
Yes, why not? In bagging you sample with replacement, but in boosting you can use the whole training data. :)
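A tiny sketch of that difference in sampling behavior (NumPy only; the toy data and the misclassification mask are made up for illustration):

```python
# Bagging draws bootstrap samples with replacement; boosting keeps the
# whole training set and re-weights the examples between rounds.
import numpy as np

rng = np.random.default_rng(0)
n = 10
X = np.arange(n)  # stand-in for a training set of 10 examples

# Bagging: each model sees a bootstrap sample of the data
bootstrap_idx = rng.integers(0, n, size=n)   # sampled with replacement
print("bagging sample:", X[bootstrap_idx])   # some rows repeat, some are missing

# Boosting (AdaBoost-style): every model sees all n examples,
# but misclassified ones get a larger weight in the next round
weights = np.full(n, 1.0 / n)
misclassified = np.array([1, 0, 0, 1, 0, 0, 0, 1, 0, 0], dtype=bool)
weights[misclassified] *= 2.0   # up-weight the hard examples
weights /= weights.sum()        # renormalize to a distribution
print("boosting weights:", np.round(weights, 3))
```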
Plsss give me a job
Hi, are you Romanian?
yep, how did you figure that out? :)
The accent :)) Big well done, and please make more videos on machine learning, data science, and data analytics :) @datamlistic
Still unclear to me.
Why was it unclear? Could you provide a bit of feedback?
My dad told me that my girlfriend, who is a data scientist, is a "double bagger". What does that mean? Does performing bagging twice before training mean that you're a pro?