Random Forest in Machine Learning: Easy Explanation for Data Science Interviews

  • Published 28 Sep 2024

COMMENTS • 25

  • @kp77256
    @kp77256 1 month ago

    Very clear and informative

  • @AllieZhao
    @AllieZhao 1 year ago

    Very clear and well structured

    • @emma_ding
      @emma_ding  1 year ago

      Thanks, Allie! Glad you found it helpful. 😊

  • @Doctor_monk
    @Doctor_monk 1 year ago +3

    Would you mind sharing the Notion page with us? I would really appreciate it. :)

    • @emma_ding
      @emma_ding  1 year ago +1

      Of course! I'm working on getting all the notes organized and shareable in one location; I'll let you know as soon as they are ready! :)

  • @1386imran
    @1386imran 1 year ago +1

    What happens if the RF's n_estimators (individual decision trees) produce conflicting outcomes, say 50% of them vote/predict class A while the other 50% vote/predict class B?
    In this situation, what would be the final outcome?

    • @davidskarbrevik
      @davidskarbrevik 1 year ago +1

      That's up to your logic at that point. But if it's a common occurrence in your model, perhaps try increasing the number of estimators.
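
      For what it's worth, scikit-learn's RandomForestClassifier sidesteps hard-vote ties: its predict averages the per-tree class probabilities (soft voting) and takes the argmax, so an exact tie in the averaged probabilities is rare, and when it does happen it falls to whichever class comes first in classes_. A minimal sketch contrasting the two views on a toy dataset:

      ```python
      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier

      # Toy data; an even number of trees makes a 50/50 hard-vote tie possible.
      X, y = make_classification(n_samples=200, n_features=10, random_state=0)
      rf = RandomForestClassifier(n_estimators=10, random_state=0).fit(X, y)

      # Hard votes: one class per tree (the sub-estimators return encoded
      # class indices, which happen to coincide with the 0/1 labels here).
      votes = np.array([t.predict(X[:1])[0] for t in rf.estimators_])
      print("per-tree votes:", votes)

      # What scikit-learn actually does: average per-tree probabilities,
      # then argmax; a tie here goes to the class listed first in classes_.
      print("averaged probabilities:", rf.predict_proba(X[:1]))
      print("final prediction:", rf.predict(X[:1]))
      ```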

  • @emma_ding
    @emma_ding  1 year ago

    Many of you have asked me to share my presentation notes, and now… I have them for you! Download all the PDFs of my Notion pages at www.emmading.com/get-all-my-free-resources. Enjoy!

  • @davidskarbrevik
    @davidskarbrevik 1 year ago +1

    Can you clarify how the random feature subset selection happens "without replacement"? Is it that, e.g., we have 20 features, tree 1 takes 10 features, tree 2 takes the remaining 10 features, and now tree 3 can take 10 from the original 20?

    • @paoloesquivel7430
      @paoloesquivel7430 6 months ago

      No. It means the subset drawn for any one tree contains no duplicate features; every tree still draws from the full feature set, so different trees' subsets can overlap.
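
      Concretely, "without replacement" only constrains a single draw: each subset consists of distinct features, but the full pool is restored before the next draw. (In scikit-learn the draw actually happens per split, controlled by max_features, rather than per tree.) A toy sketch using the 20-feature/10-feature numbers from the question:

      ```python
      import numpy as np

      rng = np.random.default_rng(42)
      n_features, subset_size = 20, 10  # numbers from the question above

      # Each draw picks distinct features: no duplicates *within* a subset...
      for tree_id in range(3):
          subset = rng.choice(n_features, size=subset_size, replace=False)
          print(f"tree {tree_id}: {sorted(subset)}")
      # ...but the full pool of 20 is available again for every draw, so
      # the subsets of different trees can (and usually do) overlap.
      ```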

  • @raghu_teja4683
    @raghu_teja4683 1 year ago +1

    Nice lecture! Can we get the resources you used? They would be very helpful.

  • @南南東-s3v
    @南南東-s3v 1 year ago +1

    Very clear explanation! Thank you so much!

  • @shawnkim6287
    @shawnkim6287 1 year ago

    Hi Emma, thanks for the video. I have a question: I'm not sure how this statement is true: "random forest constructs a large number of trees with random bootstrap samples from the training data". If the sample size equals the training set size, wouldn't we have all observations in every bootstrap sample? Then they're not random bootstrap samples. Can you please elaborate on what that line is saying?
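
    For anyone puzzling over the same point: sampling *with* replacement means some observations are drawn several times and others not at all, so a bootstrap sample of size n contains on average only about 63.2% of the unique observations (1 - (1 - 1/n)^n, which tends to 1 - 1/e). A quick check:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 1000

    # A bootstrap sample: n draws *with* replacement from n observations.
    sample = rng.choice(np.arange(n), size=n, replace=True)

    # Duplicates crowd out other observations, so only about 63.2% of the
    # original points appear: 1 - (1 - 1/n)**n -> 1 - 1/e as n grows.
    print(len(np.unique(sample)) / n)  # ~0.63
    ```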

  • @yuegao5575
    @yuegao5575 1 year ago

    Great video! Thanks for making it. One minor comment: at 6:56, sigma^2/k is actually not from the CLT; essentially, it just follows from the basic properties of variance.
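
    For reference, the identity in question needs only the basic rules of variance for independent trees, not the CLT; the usual refinement for correlated trees (as in The Elements of Statistical Learning, ch. 15) is added for completeness:

    ```latex
    % Average of k i.i.d. tree predictions T_i with Var(T_i) = \sigma^2:
    \mathrm{Var}\left(\frac{1}{k}\sum_{i=1}^{k} T_i\right)
      = \frac{1}{k^{2}} \sum_{i=1}^{k} \mathrm{Var}(T_i)
      = \frac{\sigma^{2}}{k}

    % With pairwise correlation \rho between the trees:
    \mathrm{Var}\left(\frac{1}{k}\sum_{i=1}^{k} T_i\right)
      = \rho\,\sigma^{2} + \frac{1-\rho}{k}\,\sigma^{2}
    ```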

  • @evag3014
    @evag3014 1 year ago

    Looking forward to the notes!! Thanks for sharing, Emma!!!

  • @shubhamkaushik285
    @shubhamkaushik285 1 year ago

    Can we say that if the interviewer asks which algorithm can be used here and we don't know the answer, we can safely suggest random forest? 🤔😜

  • @ayuumi7926
    @ayuumi7926 1 year ago

    A very helpful video on RF! Hi Emma, would you mind making a video on how to go about mastering new ML concepts from zero to hero?

    • @emma_ding
      @emma_ding  1 year ago

      Thanks for the suggestion, Ayuumi! I'll add it to my list of video ideas. 😊

  • @shilpamandal7232
    @shilpamandal7232 1 year ago

    Awesome video. Super helpful.

  • @emmafan713
    @emmafan713 1 year ago

    thanks!!

  • @tinbluu7653
    @tinbluu7653 1 year ago

    Love it!

  • @yungetong634
    @yungetong634 1 year ago

    great video!

    • @emma_ding
      @emma_ding  1 year ago

      Thanks for the kind comment, Yunge! 😊

  • @alanzhu7538
    @alanzhu7538 1 year ago

    Keep up the awesome work, Emma! I watched your video a year ago and got a data science job. Now that I'm starting to forget some ML models I don't use often, this is a very good way to refresh my memory on them!!!