7.4 Boosting and AdaBoost (L07: Ensemble Methods)

  • Published Jan 6, 2025

COMMENTS •

  • @kairiannah
    @kairiannah 1 year ago +2

    Thank you for these ML videos! I will buy your book to support your work

  • @dragosmanailoiu9544
    @dragosmanailoiu9544 4 years ago +2

    Thank you for the clarifications; I just finished this part in your book.
    Thank you as well for the extra stacking lecture.

  • @newbie8051
    @newbie8051 1 year ago

    Simple and complete explanation, thanks prof !!!

  • @mahmoudsalhab3007
    @mahmoudsalhab3007 4 years ago +2

    Thanks for these super helpful and amazing tutorials!
    can't wait for the rest of the course ♥!

  • @ayushdudedon
    @ayushdudedon 9 months ago

    Hi Sebastian,
    liked the videos and the detail. One thing i noticed for calculating alpha the function which is used, it will give the more importance to the weak learner with higher rate as well. Only thing that will happen we will be getting a negative sign attached to the output leading to choosing other class(assuming binary classification in classes (-1,1)). in the video, at 27:08, you mentioned classifier with high error is not important for prediction. Let me know if i am missing something.

  • @negarmahdavi4330
    @negarmahdavi4330 9 months ago

    At 26:17: misclassified instances gain higher weights, so the next classifier is
    more likely to classify them correctly. I think what you said was the opposite of this, which doesn't seem correct.
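The weight-update step being discussed can be sketched as w_i ← w_i · exp(−alpha · y_i · h(x_i)), followed by renormalization; a minimal pure-Python illustration (function and variable names are just for this sketch):

```python
import math

def update_weights(weights, y_true, y_pred, alpha):
    # w_i <- w_i * exp(-alpha * y_i * h(x_i)).
    # For a misclassified point, y_i * h(x_i) = -1, so its weight
    # is multiplied by exp(+alpha) and grows.
    new_w = [w * math.exp(-alpha * y * h)
             for w, y, h in zip(weights, y_true, y_pred)]
    total = sum(new_w)              # renormalize so weights sum to 1
    return [w / total for w in new_w]

w = update_weights([0.25] * 4, [1, 1, -1, -1], [1, -1, -1, 1], alpha=0.5)
# The misclassified samples (indices 1 and 3) now carry more weight:
print(w[1] > w[0] and w[3] > w[2])  # True
```

This matches the comment: misclassified instances end up with higher weights, so the next round's learner focuses on them.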

  • @noureddineadjir
    @noureddineadjir 5 months ago

    Thank you for the details.
    You haven't used the validation set.

  • @sinanstat
    @sinanstat 4 years ago +2

    This looks like a great method. However, the iterative "weighting" process applies a lot of corrections. Wouldn't that result in an overfitting issue?

    • @SebastianRaschka
      @SebastianRaschka  4 years ago +2

      In practice, I would say it does not necessarily overfit more than other algorithms. It's actually better than most non-ensemble classifiers, but that might be something to look into on a selection of datasets. I think the reason it doesn't suffer from overfitting that much is that a) the decision trees are still just 1 level deep and b) you consider the ensemble of decision trees from the different rounds instead of just the last tree.

    • @sinanstat
      @sinanstat 4 years ago

      @@SebastianRaschka It makes sense, thank you very much for the explanation!

  • @shanurmilon5433
    @shanurmilon5433 1 year ago

    Thanks Man. It’s really a good tutorial ❤.

  • @RahulPrajapati-jg4dg
    @RahulPrajapati-jg4dg 3 years ago +3

    Sir, could you please share the link to the lecture PDFs you mention in the videos?

    • @SebastianRaschka
      @SebastianRaschka  3 years ago +4

      I need to add all the links to the PDFs to the video descriptions some time. For now, all the lecture slides can be found here: sebastianraschka.com/pdf/lecture-notes/stat451fs20/

  • @liammartin6793
    @liammartin6793 4 years ago +2

    What is meant by a model being expensive?

    • @SebastianRaschka
      @SebastianRaschka  4 years ago +2

      Good question. Here, I meant that it is computationally expensive, i.e., it takes a long time to run and/or requires more computational resources than other simpler models.

  • @konstantinostzaferis5318
    @konstantinostzaferis5318 2 years ago

    Sebastian, I have a question.
    I am following this course while reading your book (Machine Learning with Pytorch ..).
    My question is this:
    In your book you code out a perceptron model using Python.
    Do we need to know the code behind these algorithms, like the ID3 tree or the AdaBoost code?
    Do we need to go into the anaconda3 libraries and search for the algorithms and actually know the code behind them?
    Or is it sufficient to only know how to call them from the scikit-learn library?
    I am asking because I suppose that, to become a machine learning engineer, you have to know the code behind the algorithms and actually be able to code them yourself. Or am I completely wrong?