Tutorial 42 - Ensemble: What is Bagging (Bootstrap Aggregation)?

  • Published Dec 28, 2024

COMMENTS • 207

  • @shresthaditya2950
    @shresthaditya2950 2 years ago +15

    0:06 - Ensemble Techniques
    0:30 - Bagging is also called bootstrap aggregation
    1:40 - In bagging we divide the data into various samples (by row sampling with replacement) depending on the number of ML models (base learners), which is known as bootstrapping; then the output that is in the majority after running all the models is taken (voting classifier), which is known as aggregation
    2:53 - Row sampling with replacement
    4:02 -
    4:50 - Bootstrap
    5:14 - Aggregation
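
The bootstrap-then-aggregate pipeline summarized in these timestamps can be sketched end to end. This is a minimal illustration with hypothetical toy data, not the video's code; each "base learner" is deliberately trivial (it just memorizes the majority label of its own bootstrap sample) so the focus stays on the row-sampling and voting mechanics:

```python
import random
from collections import Counter

random.seed(42)  # reproducible draws

# Hypothetical toy dataset: (feature, label) rows
data = [(0.1, 0), (0.4, 0), (0.5, 1), (0.8, 1), (0.9, 1)]

n_models = 5

# Bootstrap: one row sample with replacement per base learner
samples = [random.choices(data, k=len(data)) for _ in range(n_models)]

# "Train" each base learner; to keep the sketch tiny, a learner just
# remembers the majority label of its own bootstrap sample
learners = [Counter(label for _, label in s).most_common(1)[0][0]
            for s in samples]

# Aggregation: majority vote across the base learners (voting classifier)
prediction = Counter(learners).most_common(1)[0][0]
print(prediction)
```

In practice the base learners would be real models (e.g. decision trees), but the two stages — row sampling with replacement, then aggregating the individual outputs — are exactly the ones named in the video.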

  • @shubhamjindal39
    @shubhamjindal39 4 years ago +88

    The best explanation of Bagging. I logged in just to write this comment down. Keep it up. Thanks a lot!

  • @tanvirtanvir6435
    @tanvirtanvir6435 2 years ago

    2:35 m

  • @rafipatel5020
    @rafipatel5020 1 year ago +2

    Sitting in my MSc AI class in London and watching him because he is just better!

  • @VaralakshmiAllu-g5t
    @VaralakshmiAllu-g5t 7 months ago

    You people have again proved that YouTube is the best learning platform. Thank you so much, sir, for letting me be one of your YouTube students ❤

  • @denissobczyk9363
    @denissobczyk9363 2 years ago

    just take a moment and appreciate the brilliance of this guy! Once again saved me from reading countless pages...

  • @sidharthsingh3399
    @sidharthsingh3399 3 years ago +16

    This is just like the "America's Got Talent" / "India's Got Talent" shows, where a participant (the data) performs and there are 4-5 judges (the models), and the participant's selection is based on the judges' reviews.
    It's one way of looking at it to learn.

  • @Moonlit_girl73845
    @Moonlit_girl73845 4 years ago +26

    The way you explained the concepts is very easy and understandable. Keep doing the same. Thanks a lot.

  • @karthikannavarapu8436
    @karthikannavarapu8436 1 year ago

    You are a god for the one-day exam preparation students 🙏

  • @trknigatu
    @trknigatu 1 year ago +1

    You have made it super clear. It shows your time investment in the subject. As the saying goes, "If you can't explain it to a six-year-old, then you don't understand it yourself" (Albert Einstein).

  • @vishesharora2352
    @vishesharora2352 3 years ago +4

    The video explains bagging extremely clearly. Thanks for the upload!

  • @channaly2772
    @channaly2772 3 years ago

    I respect your simplicity and gave a thumbs up.

  • @midhileshmomidi2434
    @midhileshmomidi2434 5 years ago +6

    I think if I watch all your playlists I will definitely be confident that I will learn a lot

  • @samriddhlakhmani284
    @samriddhlakhmani284 4 years ago +3

    PS: m > n is how it is; the same sample is taken multiple times

    • @devmani100
      @devmani100 4 years ago +1

      Because we are doing the selection with replacement. Example: suppose you have a bag with 10 red balls, and you draw 5 balls, but with replacement. After the first draw you put all 5 balls back in the bag, so for the next draw you again have 10 balls in total. That's why.

  • @sonal4
    @sonal4 3 years ago

    You explained everything in very simple language... I always watch your videos for machine learning... thank you

  • @BalaMurugan-cb9ho
    @BalaMurugan-cb9ho 4 years ago

    Krish, this is the kind of video I am looking for. The way you teach is very understandable. Thanks for your videos

  • @mehnazmaharin1645
    @mehnazmaharin1645 5 years ago +4

    Please keep making videos. Your videos are easy to understand and clear up the concepts!

  • @vaibhavkumar1509
    @vaibhavkumar1509 2 years ago

    I am doing Master's in USA, thank you for this explanation.

  • @ellentuane4068
    @ellentuane4068 3 years ago

    So easy to understand !!!! Thanks.. Greetings from Brazil

  • @singwithnoma
    @singwithnoma 3 months ago

    Wow, you're an amazing teacher. Well done, I have understood the concepts thanks to you. Thank you very much.

  • @priyankakushwaha8407
    @priyankakushwaha8407 3 years ago

    Thanks for explaining in simple words.

  • @nasreenbanu2245
    @nasreenbanu2245 2 years ago

    Finally I got the concept. Hats off, sir.

  • @Narsimhakhedkar
    @Narsimhakhedkar 3 years ago

    Very well explained. I came here because my University professor totally messed up the explanation of this simple technique!

  • @_PremKharat
    @_PremKharat 1 year ago

    One video and that's it, concept clear.
    Thanks a lot, sir

  • @brendachirata2283
    @brendachirata2283 3 years ago

    You are just amazing; you have the simplest and clearest explanations

  • @sunilmali5380
    @sunilmali5380 4 years ago +3

    You are a champ!! What an explanation. Thank you so much, sir

  • @alexmaingi9327
    @alexmaingi9327 1 month ago

    great explanation in simple language!

  • @sonamde7507
    @sonamde7507 4 years ago +2

    best explanation ever. You are really good at explaining things. Keep it up. looking forward to more detailed machine learning model videos. Thank you

  • @sneharj2036
    @sneharj2036 2 years ago

    Thank you so much, sir, for a wonderful explanation. The concept of bootstrap aggregation is very clearly and nicely told. Your channel is awesome, great videos.

  • @ManikandanRaju
    @ManikandanRaju 2 years ago

    This is wonderful. Thank you for such a simple and easy-to-understand explanation, sir.

  • @karthikrajendran3394
    @karthikrajendran3394 1 year ago

    Well explained. Thank you. I was getting confused with the textual explanation.

  • @harshpathak754
    @harshpathak754 4 years ago

    Best lecture on bootstrapping

  • @memonakhan9804
    @memonakhan9804 1 year ago

    This is actually a really nice explanation. Keep it up.

  • @mansibisht557
    @mansibisht557 4 years ago +5

    You're so good at explaining !

  • @rajashekarappamadure8581
    @rajashekarappamadure8581 3 years ago

    Sir, you are so good at simplifying and explaining complex topics

  • @sandipansarkar9211
    @sandipansarkar9211 4 years ago

    Superb video Krish, once again. Thanks

  • @GuitarreroDaniel
    @GuitarreroDaniel 3 years ago

    You are the only one that explained this right, thank you very much!

  • @sridhar6358
    @sridhar6358 3 years ago +1

    So, to summarize:
    a) a fit will be done with each model's sampled data,
    b) there will be n such models for which a fit is done,
    c) a prediction will also be made by each model,
    d) the results of the models' predictions will be averaged if it is a regression problem, and
    e) voting will be used if it is classification. But how does voting go when we have a test set of data? In classification we would have all the cases in the test set, so how does voting take place?
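
For the voting question in point (e): each base learner predicts a label for every row of the (same) test set, and the majority label is taken per row. A minimal sketch with hypothetical predictions from three models on four test rows:

```python
from collections import Counter

# Hypothetical class predictions: one inner list per base learner,
# one entry per test row (all models see the same test set)
preds = [
    [1, 0, 1, 1],  # model m1
    [1, 1, 0, 1],  # model m2
    [0, 0, 1, 1],  # model m3
]

# Vote per test row: zip(*preds) yields one row's predictions at a time
voted = [Counter(col).most_common(1)[0][0] for col in zip(*preds)]
print(voted)  # [1, 0, 1, 1]
```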

  • @hydersal4073
    @hydersal4073 3 years ago

    Greatly and deeply explained. God bless, and thanks a lot

  • @swatibogawat8368
    @swatibogawat8368 4 years ago

    Very well and simply explained, Krish

  • @muhammadumair1280
    @muhammadumair1280 3 years ago +1

    Sir are you sure in this example data m

  • @baharehghanbarikondori1965
    @baharehghanbarikondori1965 3 years ago

    Awesome tutorial on BAGGING

  • @rkrish6476
    @rkrish6476 1 year ago

    Great Explanation .. Thanks Krish

  • @ayanmullick9202
    @ayanmullick9202 2 years ago

    Thank you sir for easy explanation.

  • @fet1612
    @fet1612 5 years ago +1

    This video was very informative, very well explained. Please keep helping students who need help. Be good and take care.

  • @deepakkumarthakur8429
    @deepakkumarthakur8429 3 years ago

    Loved it!!! 💙💙💙

  • @Kinglium
    @Kinglium 3 years ago +4

    Thank you so much for your hard work! This is by far the best explanation I could wrap my head around! Keep up your good work!!

  • @armansh7978
    @armansh7978 4 years ago

    thank you very much for very good explanation Krish , wish the best for you

  • @jananikannan6401
    @jananikannan6401 2 years ago +1

    Got a clear idea. One small suggestion: it would be nice if you explained the concepts using a small sample dataset with a few rows and columns, for explanation's sake. That would give a more detailed understanding.

  • @arpanpradhan493
    @arpanpradhan493 10 months ago

    Great explaining! I am in a Data Science master's program, in a data mining class.

  • @blackphillip5757
    @blackphillip5757 4 years ago

    You have a great channel man, keep up the good work.

  • @abhinav02111987
    @abhinav02111987 4 years ago

    Excellent Krish.

  • @kin_1997
    @kin_1997 2 years ago

    amazing content, thank you !

  • @keerthanavivin450
    @keerthanavivin450 3 years ago

    Such a good explanation!

  • @rakeshp8711
    @rakeshp8711 4 years ago +1

    Thank you for the explanation. It is not necessary that m should be less than n. It can be equal as well.

    • @chandrashekharpujari167
      @chandrashekharpujari167 4 years ago

      You are absolutely right bro, In bagging we do not subset the training data into smaller chunks and train each tree on a different chunk. Rather, if we have a sample of size N, we are still feeding each tree a training set of size N (unless specified otherwise). But instead of the original training data, we take a random sample of size N with replacement. For example, if our training data was [1, 2, 3, 4, 5, 6] then we might give one of our trees the following list [1, 2, 2, 3, 6, 6]. Notice that both lists are of length six and that “2” and “6” are both repeated in the randomly selected training data we give to our tree (because we sample with replacement).
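
The draw described above (a sample of size N, with replacement) is a one-liner; a minimal sketch using the same toy list (the exact sample depends on the seed):

```python
import random

random.seed(0)  # for a repeatable draw

train = [1, 2, 3, 4, 5, 6]

# Bootstrap sample: same length as the original, drawn with replacement,
# so some values may repeat and others may be left out entirely
bootstrap = random.choices(train, k=len(train))
print(bootstrap)
```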

  • @suparnasaha3043
    @suparnasaha3043 2 years ago

    Great explanation

  • @TheFofitas
    @TheFofitas 2 years ago

    yes, very nicely explained. you are very clear, thank you! :)

  • @oguzcan7199
    @oguzcan7199 2 years ago

    It was an amazing explanation! Thank you a lot.

  • @manpreetsharma3846
    @manpreetsharma3846 3 years ago

    Amazing video!

  • @t-ranosaurierruhl9920
    @t-ranosaurierruhl9920 5 years ago +2

    Perfect explanation!

  • @kmdkhaleeluddin6257
    @kmdkhaleeluddin6257 1 year ago

    Your explanation is awesome ❣️ and thank you so much for making these videos for us.
    I request you to provide your full notes on machine learning so it would be easy for us ❣️✨ and make it possible to get a high score in machine learning ❣️
    Thank you once again ❣️✨.....

  • @rishi.m7160
    @rishi.m7160 3 years ago

    Thanks a lot, your session was helpful.

  • @daohoang5973
    @daohoang5973 3 years ago

    Indian guys literally save the rest of the world and explain things better than my teacher!

  • @anubhavnehru619
    @anubhavnehru619 3 years ago

    Amazing explanation!

  • @isratjahan207
    @isratjahan207 4 years ago

    Nice explanation. Thank you!

  • @prashanths4455
    @prashanths4455 3 years ago

    Awesome krish

  • @odelolatechup1447
    @odelolatechup1447 1 year ago

    I love your videos thank you

  • @prodyutdas1474
    @prodyutdas1474 3 years ago

    you are awesome!! Thanks a lot.

  • @yoshitha12
    @yoshitha12 1 year ago

    Very clear ❤

  • @nandeeshkm3293
    @nandeeshkm3293 2 years ago

    thank you so much ❤

  • @louerleseigneur4532
    @louerleseigneur4532 3 years ago

    Thanks Krish

  • @Pidamoussouma
    @Pidamoussouma 4 years ago

    This is good explanation

  • @biswadeepdutta2225
    @biswadeepdutta2225 3 years ago

    Gem of a tutorial!!!

  • @anjumanoj4703
    @anjumanoj4703 3 years ago

    Nice explanation

  • @codeandcurious
    @codeandcurious 3 years ago

    Nice explanation

  • @anujasebastian8034
    @anujasebastian8034 3 years ago

    Excellent!!!

  • @rambaldotra2221
    @rambaldotra2221 3 years ago

    Thanks A Lot Sir!!

  • @aditya_01
    @aditya_01 3 years ago

    really nice video thank u

  • @asthakumari2104
    @asthakumari2104 21 days ago

    Would you please link which playlist this video is a part of?

  • @pranjalgupta9427
    @pranjalgupta9427 3 years ago +1

    Thanks ❤

  • @mostafakhazaeipanah1085
    @mostafakhazaeipanah1085 2 years ago

    Thanks a lot, Mr. Naik.
    I have two questions.
    The first one is: can we use neural networks instead of decision trees?
    The second one is: must m be greater than, equal to, or less than n?

  • @radhay4291
    @radhay4291 3 years ago

    Good explanation. Are these m1, m2, m3 models the same classifier or different classifiers?

  • @shadiyapp5552
    @shadiyapp5552 2 years ago

    Thank you ♥️

  • @shubhamchaudhari1212
    @shubhamchaudhari1212 1 year ago

    When the base learners, i.e. the different models, are created, will the models only be ones capable of solving a binary problem, since you said to consider the model as binary regression?

  • @rrrprogram8667
    @rrrprogram8667 4 years ago

    Subscribed... for your dedication

  • @datascientist2958
    @datascientist2958 4 years ago +1

    What is the difference between bagging, boosting, voting, and stacking? Which of these methods do Random Forest and XGB relate to?

  • @imtiaznakib1040
    @imtiaznakib1040 3 years ago

    Really helpful

  • @RajKumar-mndr
    @RajKumar-mndr 5 years ago +3

    Hi,
    I am from a non-technical background, working in BPO in a backend process not related to any technology, simply cut-copy-paste.
    But I am looking forward to SAS analytics.
    Please suggest which techniques I need to learn, and share some video links.

  • @self-made-datascientist1181
    @self-made-datascientist1181 4 years ago +1

    I don't get why we don't train all the models with all the training data available. If you separate the data randomly, the original distribution will be affected, and models will be randomly good or bad depending on how lucky they were to get a close distribution in the split.

    • @pratikbhansali4086
      @pratikbhansali4086 4 years ago

      With the whole data only one model can be created, bro. Only if we choose a different sample from the given dataset each time will our models give slightly different results, so we combine the results of the different models.

  • @sowjanyagodha7979
    @sowjanyagodha7979 3 years ago

    You talked about sampling of rows, which means the subsets differ in rows. Does that mean each subset of the original dataset has the same columns as the original dataset?

  • @aishwaryamundhe5070
    @aishwaryamundhe5070 3 years ago

    damn this is an excellent explanation

  • @alihussien7935
    @alihussien7935 10 months ago

    Thanks for all your good videos. Your explanations are very good. But you don't tell us when to use it and why. What is the goal?

  • @_MubinShaikh
    @_MubinShaikh 3 years ago +1

    Best video

  • @Noonewknows
    @Noonewknows 11 months ago

    Could you teach us how to calculate uncertainty in a regression model for each test data set?

  • @thirupathireddy6149
    @thirupathireddy6149 5 years ago +6

    Krish... explain how to reduce error for regression and classification models. Thanks

    • @aakashsinghrawat3313
      @aakashsinghrawat3313 4 years ago

      Go through the EDA and feature engineering parts; improving them may reduce the error.

    • @vijethrai2747
      @vijethrai2747 4 years ago

      Understand the mathematics behind the models

  • @sumanbindu2678
    @sumanbindu2678 3 years ago

    Should it not be n >= m instead of n > m?

  • @s.shanmugapriyacse7044
    @s.shanmugapriyacse7044 2 years ago

    Very good explanation. A doubt, if someone could clarify it would be helpful: for the testing data set, will each model also be given different test data?

  • @UCS__Yogesh
    @UCS__Yogesh 3 years ago

    What if we use a regression technique? Will it still use majority voting, or will it take the average of all the base models?

  • @etikh404
    @etikh404 4 years ago

    Sir, you are simply too much!

  • @muditmathur465
    @muditmathur465 4 years ago

    Sir, when you say at 3:09 "row sampling with replacement", do you mean that some values can be repeated across the different D'm? So why not call it row sampling with repetition?

    • @tabishsayed
      @tabishsayed 3 years ago +1

      "With replacement" is statistical lingo for the same thing you mentioned, specifically in terms of probability. Hope this clarifies.