Bootstrap aggregating (bagging)

  • Published Jan 2, 2025

COMMENTS • 56

  • @woodgecko106 7 years ago +116

    only person who explained this properly imo

  • @ezekiel763 7 years ago +19

    Wow - great explanations on concepts most other people take 30 mins to explain with half the effect. Subscribed!

  • @itssidhere 4 years ago +2

    The best explanation of what the word "replacement" really means. Thanks

  • @cshewale 5 years ago

    He explained it in a very simple manner. Thanks

  • @peiwang3223 5 years ago +4

    Really precise and easy-to-understand explanation, thank you, sir! Really saved my life :)

  • @aknka8576 4 years ago +4

    Why does it feel like it's Ron Swanson who's teaching? 😀 Well-explained concept!!

  • @infatum222 5 years ago

    This is the best explanation of bagging I've ever seen. Many thanks 👍🙌👌

  • @hk91v29 4 years ago

    Your explanation makes it very understandable!

  • @juliocjacobo 3 years ago

    Great explanation: concise and direct.

  • @EvilSpeculator 8 years ago +2

    Very concise explanation - thanks for posting this!!

  • @MaeLeong 4 years ago +4

    Random forest: randomly selects the samples as well as the features for each tree model.
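
    A minimal NumPy sketch of that per-tree sampling (the array shapes and the subset size of 3 features are arbitrary choices; note that scikit-learn's random forest actually draws the feature subset per split rather than per tree):

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 8))            # 100 rows, 8 features of made-up data
    y = rng.integers(0, 2, size=100)         # made-up binary labels

    n_rows, n_features = X.shape
    row_idx = rng.choice(n_rows, size=n_rows, replace=True)    # bootstrap the rows (with replacement)
    feat_idx = rng.choice(n_features, size=3, replace=False)   # random feature subset for this tree

    X_tree = X[np.ix_(row_idx, feat_idx)]    # the data one tree of the forest would be trained on
    y_tree = y[row_idx]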

  • @jingyiwang5113 10 months ago

    Thank you so much for such a great explanation!

  • @saadmunir1467 4 years ago

    That's an awesome explanation in a nutshell

  • @y4eHuk-3 4 years ago

    Wow, so amazingly explained!

  • @armansh7978 4 years ago

    It was a really understandable and easy-to-follow explanation. Thank you

  • @siyuanyan7727 5 years ago

    excellent explanation - many thanks, sir

  • @vishank7 4 years ago

    This is soo awesome! Thank you!

  • @pingyu588 4 years ago

    very good explanation!

  • @donfeto7636 3 years ago

    The sub-sample size is always the same as the original input sample size, but the samples are drawn with replacement. If your training set is 60% of the data, each subsample should also be 60% of the data (i.e. the full training-set size), not 60% of the training set, which would only be 36% of the data.
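
    A quick NumPy sketch of that point, which also shows that such a bag contains only about 63% unique rows (n = 1000 is an arbitrary choice):

    import numpy as np

    rng = np.random.default_rng(42)
    n = 1000                                    # size of the original training set
    bag = rng.choice(n, size=n, replace=True)   # bootstrap sample: same size n, drawn with replacement

    print(np.unique(bag).size / n)              # ~0.63, i.e. about 1 - 1/e of the rows appear at least once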

  • @JohnSmith-ok9sn 3 years ago

    Thank You!
    (In under 3 minutes!)

  • @h_4943 2 years ago

    But scikit-learn's RF uses n for each bag, with or without replacement. Is that something new?
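
    For reference, scikit-learn exposes exactly this choice through the bootstrap and max_samples parameters of RandomForestClassifier (a sketch assuming a reasonably recent scikit-learn release; the toy data is generated, not from the video):

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier

    X, y = make_classification(n_samples=500, n_features=10, random_state=0)

    # bootstrap=True (the default): each tree is trained on a bag drawn with replacement;
    # max_samples=None (the default): each bag has the same size n as the training set.
    rf = RandomForestClassifier(n_estimators=100, bootstrap=True, max_samples=None, random_state=0)
    rf.fit(X, y)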

  • @Skandawin78 6 years ago +1

    What algorithm is used in the model step? Is this applicable to any algorithm?

    • @charismaticaazim 6 years ago +1

      Decision trees are used, and an ensemble of decision trees makes a forest.

  • @YazminAbat 4 years ago

    Georgia Tech quality!!! Thanks :))

  • @fayechang8647 1 year ago

    Very clear!! Thanks!!

  • @TigasFMS 6 years ago

    Great video. I have two questions. Since data points can repeat within the same subset, they can also appear (although the chance is low, of course) in another subset, right? In the case of random forests, since features are also selected randomly from the dataset, can they also be the same in different decision trees? Thanks in advance.

    • @charismaticaazim 6 years ago

      I'll try to answer this. For your first question, yes, it's possible to have the same data points in different subsets.
      For random forests, since the features are selected randomly, it is again possible.
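
      A tiny NumPy check of that overlap between two bootstrap bags (n = 100 is arbitrary):

      import numpy as np

      rng = np.random.default_rng(7)
      n = 100
      bag_a = rng.choice(n, size=n, replace=True)   # row indices for the first subset
      bag_b = rng.choice(n, size=n, replace=True)   # row indices for the second subset

      shared = np.intersect1d(bag_a, bag_b)         # data points that landed in both bags
      print(shared.size)                            # typically around 40 of the 100 points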

  • @66saly 6 years ago

    well explained, thank you very much

  • @mahimsd7645 6 years ago

    very well done.... gr8

  • @syedtahaaziz240 7 years ago

    So do we use a single algorithm for all the different bags? Or different learners for different bags?

    • @Cristiprg 7 years ago +2

      In practice, we use a single algorithm for all the different bags. For example, have a look at BaggingClassifier in sklearn; there you set one single base_estimator, i.e. the learning algorithm.
      However, nothing is stopping you from using different algorithms for different bags and comparing the results. I think you would lose some important mathematical guarantees, though, but I'm not entirely sure.
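
      A minimal example of that setup (note that newer scikit-learn releases renamed base_estimator to estimator, so the keyword below assumes a recent version):

      from sklearn.datasets import make_classification
      from sklearn.ensemble import BaggingClassifier
      from sklearn.tree import DecisionTreeClassifier

      X, y = make_classification(n_samples=500, n_features=10, random_state=0)

      # One learning algorithm (here a decision tree) is trained on every bag.
      bagging = BaggingClassifier(
          estimator=DecisionTreeClassifier(),   # 'base_estimator=' in older scikit-learn versions
          n_estimators=50,
          random_state=0,
      )
      bagging.fit(X, y)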

  • @yifancao703 6 years ago

    extremely clear

  • @hamzahsajidkhan3602 6 years ago

    Could someone please explain to me what he meant by "you can simply wrap this up in an API"? How do you do that, and how does it help?

    • @charismaticaazim 6 years ago +1

      He meant you can use libraries that have implemented random forests, like scikit-learn.
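
      For instance, a minimal fit-and-score sketch with scikit-learn's RandomForestClassifier (the generated data and the split are illustrative, not from the video):

      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import train_test_split

      X, y = make_classification(n_samples=500, n_features=10, random_state=0)
      X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

      model = RandomForestClassifier(n_estimators=100, random_state=0)
      model.fit(X_train, y_train)            # the library does the bagging and tree-building internally
      print(model.score(X_test, y_test))     # accuracy on held-out data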

  • @abhijitjantre8427 8 years ago

    Well explained.

  • @goodyoyo0214 8 years ago

    Where does the unknown X come from?

    • @-long- 8 years ago +2

      X is from the testing set (the blue color).
      We test X with all the models and take the answer that gets the majority of the votes.

    • @-long- 8 years ago

      Or, in this case, take the mean Y.
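
      A NumPy sketch of that aggregation step (the per-model predictions are made-up numbers):

      import numpy as np

      # Predictions for one unknown query point X from, say, five bagged models.
      class_votes = np.array([1, 0, 1, 1, 0])                  # classification: each model votes for a class
      regression_preds = np.array([3.1, 2.8, 3.4, 3.0, 2.9])   # regression: each model predicts a value

      print(np.bincount(class_votes).argmax())   # majority vote -> class 1
      print(regression_preds.mean())             # mean Y -> 3.04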

  • @ulisesmunera1641 6 years ago

    Thanks a lot!

  • @alexmatt4012 2 years ago

    Listen at 5X speed.

  • @axe863 3 years ago

    Note: standard bagging is insensitive to class imbalance, instance importance, and feature importance.

  • @bananesalee7086 5 years ago

    Weird, my school teaches that n' should be the same size as n. Anyway, it doesn't change that much, I guess.

  • @TheIanoTube 4 years ago

    Cheers

  • @duonghi6986 9 months ago

    2:04

  • @zhenghuafu4297 7 years ago

    Hi, what tablet is it? Is it a Wacom? Would that be very expensive?

  • @pepe6666 5 years ago

    those bags look like slug aliens from mars. also i sneezed while watching this which proves illuminati did 9/11

  • @anonymousanonymous5411 7 years ago

    Nice explanation

  • @JustMe-pt7bc 8 years ago

    In every video, what you explain is fine, but it is not enough to cover the topic in your title.

    • @rahulbalan 7 years ago +1

      Because it's a lesson series. The entire lesson is available on Udacity, if they haven't uploaded it on YouTube.

  • @astraymessiah8594 4 years ago

    save my life

  • @mingzhouzhu4668 5 years ago +1

    Pretty clear explanation, but his voice makes me sleepy.