How Does Random Forest Perform So Well? The Bias-Variance Trade-Off in Random Forest

  • Published 19 Jul 2021
  • Understand the reason behind Random Forest's strong performance. This video explains the concept of Bias-Variance Trade-Off in Random Forest in simple terms, revealing why it's a powerful technique in machine learning.
    Code used: github.com/campusx-official/1...
    ============================
    Do you want to learn from me?
    Check my affordable mentorship program at : learnwith.campusx.in/s/store
    ============================
    📱 Grow with us:
    CampusX' LinkedIn: / campusx-official
    CampusX on Instagram for daily tips: / campusx.official
    My LinkedIn: / nitish-singh-03412789
    Discord: / discord
    E-mail us at support@campusx.in
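
The video's core claim is that a random forest keeps a single decision tree's low bias while cutting its variance by averaging many bootstrapped trees. As a minimal sketch of that effect (not the video's code — it assumes scikit-learn and a synthetic noisy dataset), you can compare a single fully grown tree against a forest on the same data:

```python
# Sketch: a single deep decision tree (low bias, high variance) vs. a
# random forest that averages 100 bootstrapped trees on the same data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Noisy synthetic data: flip_y injects 10% label noise, which a single
# fully grown tree will memorize.
X, y = make_classification(n_samples=1000, n_features=20, flip_y=0.1,
                           random_state=42)

tree = DecisionTreeClassifier(random_state=42)        # one fully grown tree
forest = RandomForestClassifier(n_estimators=100,     # 100 bootstrapped trees
                                random_state=42)

tree_scores = cross_val_score(tree, X, y, cv=5)
forest_scores = cross_val_score(forest, X, y, cv=5)

print(f"tree   mean={tree_scores.mean():.3f}  std={tree_scores.std():.3f}")
print(f"forest mean={forest_scores.mean():.3f}  std={forest_scores.std():.3f}")
```

On noisy data like this, the forest's cross-validated accuracy is typically higher than the single tree's, because averaging many decorrelated trees smooths out the noise each individual tree memorizes.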

COMMENTS • 18

  • @sameerabanu3115
    @sameerabanu3115 8 months ago +5

    This guy has beaten all the ML videos on YouTube, great piece of work

  • @arun5351
    @arun5351 3 years ago +10

    Great job mate!
    Can you please start time series and its algorithms? In many of the interviews I attended, the client expectation was that time series is a fundamental concept and one should be adroit in time series problems. Surprisingly, very few trainers on YouTube are exploring this topic, and even then only on the surface.
    I request you to please start time series lectures, as it's a very intuitive topic and you're exceptional at explaining complex topics.
    Regards

  • @because2022
    @because2022 2 months ago +1

    Great explanation. The way you use visualization is awesome. Keep doing the same in a plethora of topics.

  • @deveshnandan323
    @deveshnandan323 1 year ago +2

    Next Level Video , Thanks a Lot :)

  • @mdjidmi8822
    @mdjidmi8822 3 years ago +2

    Love you brother, from Bangladesh

  • @bangarrajumuppidu8354
    @bangarrajumuppidu8354 2 years ago +2

    Excellent intuition

  • @barunkaushik7015
    @barunkaushik7015 2 years ago +1

    Awesome

  • @sandipansarkar9211
    @sandipansarkar9211 1 year ago +1

    finished watching

  • @balrajprajesh6473
    @balrajprajesh6473 1 year ago +1

    best!

  • @akash.deblanq
    @akash.deblanq 3 years ago +4

    Will you continue making these videos after the 100th day? A lot of classical algorithms are still left to cover, I think. :)

    • @campusx-official
      @campusx-official  3 years ago +6

      Yes all the algos will be covered

    • @sachin2725
      @sachin2725 1 year ago +3

      @@campusx-official
      Hello Sir, XGBoost is not included in the playlist; could you please make a video on XGBoost?

  • @ashutoshverma1418
    @ashutoshverma1418 1 year ago

    But doesn't our variance come from the test data? How is it impacted by removing the effect of outliers/noisy points from the training data?
    How random forest reduces variance is not clear to me; could someone help?

    • @kamlakarsapkale319
      @kamlakarsapkale319 4 months ago

      Because weights are already calculated for features during training
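
To the question above: "variance" here is not a property of the test data — it measures how much the learned model's predictions change when the *training* data changes. A small simulation (an illustrative sketch assuming scikit-learn, not from the video) makes this concrete by refitting a single tree and a forest on 20 different training samples and measuring how much their predictions on fixed test points fluctuate:

```python
# Sketch: "variance" = how much predictions on FIXED test points change
# when the model is retrained on a different training sample.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X, y = make_regression(n_samples=2000, n_features=5, noise=20.0,
                       random_state=0)
X_pool, y_pool = X[:1500], y[:1500]   # pool to draw training sets from
X_test = X[1500:1550]                 # 50 fixed test points, never trained on

tree_preds, forest_preds = [], []
for seed in range(20):                # 20 different training samples
    idx = rng.choice(1500, size=500, replace=False)
    tree = DecisionTreeRegressor(random_state=seed)
    forest = RandomForestRegressor(n_estimators=50, random_state=seed)
    tree.fit(X_pool[idx], y_pool[idx])
    forest.fit(X_pool[idx], y_pool[idx])
    tree_preds.append(tree.predict(X_test))
    forest_preds.append(forest.predict(X_test))

# Variance of predictions ACROSS training sets, averaged over test points
tree_var = np.var(tree_preds, axis=0).mean()
forest_var = np.var(forest_preds, axis=0).mean()
print(f"single tree prediction variance: {tree_var:.1f}")
print(f"random forest prediction variance: {forest_var:.1f}")
```

Each individual tree memorizes the noise of its particular training sample, so its predictions swing a lot between retrains; the forest averages many such trees, so those swings largely cancel and its prediction variance comes out much lower.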

  • @greeshmavs6437
    @greeshmavs6437 1 year ago

    Sir, your videos are excellent, but instead of doing them in Hindi, could you please do them in English? It would help those of us who don't understand Hindi. It's my request.

  • @sandipansarkar9211
    @sandipansarkar9211 1 year ago

    finished coding