Random Forest Hyperparameter Tuning using GridSearchCV | Machine Learning Tutorial

  • Published 3 Nov 2024

COMMENTS • 46

  • @KunaalNaik
    @KunaalNaik  2 years ago +2

    Want to learn Data Science effectively and show confidence during interviews?
    Download the 6-Step Strategy to master Data Science through non-linear methods of learning.
    Download Link - kunaalnaik.com/learn-data-science-youtube/

  • @gilangsatriya3027
    @gilangsatriya3027 2 years ago +1

    Very informative! It really helps me a lot; I was wondering how to find the best parameter combinations when using hyperparameter tuning.

  • @VeryTubefullDay
    @VeryTubefullDay 4 years ago +2

    Thanks for the simple explanation. It would be interesting to see this applied to a more exhaustive case, though (things like further explanation of which hyperparameters to use in grid search, or why one uses Gini or entropy, etc.). Thanks anyway :)

    • @KunaalNaik
      @KunaalNaik  4 years ago

      I am glad you liked the explanation. Thank you for taking the time to provide your feedback too :)
      Good idea. I am also figuring out what to make and what not to as I go. I will work on a follow-up video and provide more details in it.

  • @lizzietsang6012
    @lizzietsang6012 2 years ago +2

    Fantastic video - thanks!

  • @nataliatenoriomaia1635
    @nataliatenoriomaia1635 3 years ago +2

    Great video. Very clear and objective. Thanks.

  • @elondoriafa9883
    @elondoriafa9883 3 years ago +1

    I was searching more for an explanation of grid search. This is a wonderful example of the application, but if you could really break down what the concept of grid search is, that would help me out so much more!

    • @elondoriafa9883
      @elondoriafa9883 3 years ago +1

      I know this is so vague, but for example... I don't understand the hyperparameters you set and why they are important to the test, or how we should read them... is this asking for too much, perhaps?

    • @KunaalNaik
      @KunaalNaik  3 years ago

      I will try in my next video :)

    • @KunaalNaik
      @KunaalNaik  3 years ago +1

      Consider you are aiming at a dartboard, and the parameters required to hit the bullseye are your arm strength, your aiming skill, wind (imagine :P), and visibility. Now, if each parameter were on a scale of 1-100, then depending on your distance from the board you would have to work out which combination works best to hit the bullseye. You could start by trying a combination of (50, 60, 0, 40). But that combination did not hit the bullseye! So you keep trying all possible combinations until you hit it. This is what hyperparameter tuning is.
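
      In code, that "keep trying combinations" idea is exactly what GridSearchCV automates. Below is a minimal sketch, assuming scikit-learn; the synthetic dataset and the parameter values are purely illustrative, not the ones used in the video.

      ```python
      # Grid search over a small, illustrative RandomForest parameter grid.
      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import GridSearchCV

      X, y = make_classification(n_samples=500, n_features=10, random_state=42)

      param_grid = {
          "n_estimators": [100, 200],        # number of trees
          "max_depth": [None, 5, 10],        # how deep each tree may grow
          "criterion": ["gini", "entropy"],  # split-quality measure
      }

      grid = GridSearchCV(
          estimator=RandomForestClassifier(random_state=42),
          param_grid=param_grid,
          cv=5,                  # 5-fold cross-validation for every combination
          scoring="accuracy",
      )
      grid.fit(X, y)

      print(grid.best_params_)   # the combination that "hit the bullseye"
      print(grid.best_score_)    # its mean cross-validated accuracy
      ```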

  • @beautyisinmind2163
    @beautyisinmind2163 2 years ago +1

    I have a question here: in order to tune the parameters, how do we figure out the range for each hyperparameter? Are there any rules of thumb for doing so?

  • @aliciadevlinder
    @aliciadevlinder 3 years ago +1

    Your title is gridsearch, yet you also explain basic model preprocessing!

  • @sandhya9544
    @sandhya9544 2 years ago +1

    Super, you have explained it very well.

  • @Sajib975
    @Sajib975 3 years ago +2

    I have two classes of data, and the sample ratio of A:B is 4:1. Which method would be best for me?

    • @kunaal_coaching
      @kunaal_coaching 3 years ago

      Use SMOTE to create balanced classes. Then follow the process of modelling.
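
      A minimal sketch of that suggestion, assuming the imbalanced-learn package is installed; the synthetic 4:1 dataset is only illustrative.

      ```python
      # Oversample the minority class with SMOTE, then continue with modelling.
      from collections import Counter
      from imblearn.over_sampling import SMOTE
      from sklearn.datasets import make_classification

      # Roughly a 4:1 class ratio, as in the question above (illustrative data).
      X, y = make_classification(n_samples=1000, weights=[0.8, 0.2], random_state=42)
      print(Counter(y))      # imbalanced counts

      X_res, y_res = SMOTE(random_state=42).fit_resample(X, y)
      print(Counter(y_res))  # balanced counts; proceed with train/test split and the model
      ```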

  • @jgubash100
    @jgubash100 3 years ago +1

    Excellent

  • @finleyspencer2050
    @finleyspencer2050 9 months ago

    How come you didn't split your data into a validation set for hyperparameter tuning?

  • @Niluhpuspaaa
    @Niluhpuspaaa 3 years ago +1

    Hello sir, may I ask you something... When getting the target data with y = data['target'], I get:
    KeyError: 'target'

    • @KunaalNaik
      @KunaalNaik  3 years ago

      Can you share the column names of your dataset and the code you wrote? I am unable to get the full picture of the error.
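
      If it helps in the meantime: the usual cause of KeyError: 'target' is that the column name in the file does not exactly match the string used for indexing. A quick, hedged check (the file name below is just a placeholder):

      ```python
      import pandas as pd

      data = pd.read_csv("your_data.csv")   # placeholder file name
      print(data.columns.tolist())          # inspect the exact spelling and case of each column

      y = data["target"]                    # works only if 'target' appears in the list above
      ```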

  • @chakree100
    @chakree100 4 years ago +1

    Is hyperparameter tuning just a trial-and-error method that we use on every ML model, with different search techniques, to improve performance? Or is there a test to identify whether it is required or not?

    • @KunaalNaik
      @KunaalNaik  4 years ago

      GridSearchCV goes through every parameter combination you provide and finds the one that gives the best accuracy. You are right.
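
      To see how quickly "every combination" grows, scikit-learn's ParameterGrid can count them; the grid below is illustrative.

      ```python
      from sklearn.model_selection import ParameterGrid

      param_grid = {
          "n_estimators": [100, 200, 300],
          "max_depth": [None, 5, 10],
          "criterion": ["gini", "entropy"],
      }

      # 3 * 3 * 2 = 18 combinations; with cv=5, GridSearchCV fits 90 models.
      print(len(ParameterGrid(param_grid)))
      ```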

  • @lord-qk3bx
    @lord-qk3bx 3 years ago +2

    Thanks! I really thought tuning was a manual process :)

    • @KunaalNaik
      @KunaalNaik  3 years ago

      Even I thought the same! But now I am much smarter :p

  • @abhinavbaruah
    @abhinavbaruah 4 years ago +1

    I thought the video was about RandomizedSearchCV. Can you please show one for RandomizedSearchCV hyperparameter tuning?

    • @KunaalNaik
      @KunaalNaik  4 years ago

      You are right. Let me change the heading and also make another with RandomizedSearchCV! Thank you for providing feedback!
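
      For comparison, a minimal RandomizedSearchCV sketch: instead of trying every combination, it samples n_iter random ones. The dataset, distributions, and values below are illustrative assumptions, not taken from the video.

      ```python
      from scipy.stats import randint
      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import RandomizedSearchCV

      X, y = make_classification(n_samples=500, n_features=10, random_state=42)

      param_distributions = {
          "n_estimators": randint(100, 500),   # sampled uniformly from this range
          "max_depth": [None, 5, 10, 20],
          "criterion": ["gini", "entropy"],
      }

      search = RandomizedSearchCV(
          estimator=RandomForestClassifier(random_state=42),
          param_distributions=param_distributions,
          n_iter=20,            # only 20 random combinations are evaluated
          cv=5,
          random_state=42,
      )
      search.fit(X, y)
      print(search.best_params_, search.best_score_)
      ```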

  • @JameyTsey
    @JameyTsey 2 years ago

    Why did you define 'param_grid' but print 'random_grid'?

  • @techtalks742
    @techtalks742 2 years ago +2

    print(random_grid) is showing me: NameError: name 'random_grid' is not defined. Please help.

    • @KunaalNaik
      @KunaalNaik  2 years ago +1

      Ensure you use the right variable name, or it might be because of case sensitivity.
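
      In other words, print the name you actually defined. A tiny sketch of the two options (the grid values are illustrative):

      ```python
      param_grid = {"n_estimators": [100, 200], "max_depth": [None, 5]}

      print(param_grid)         # option 1: print the name that was defined

      random_grid = param_grid  # option 2: create random_grid before printing it
      print(random_grid)
      ```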

    • @techtalks742
      @techtalks742 2 years ago

      @@KunaalNaik thanks, buddy

  • @ahmadharis6379
    @ahmadharis6379 3 years ago +1

    Hey Sir, I want to ask something. I tuned my multioutput regression model (random forest) and I got the best_score_ attribute. Does it mean the average score or the sum of scores from my multioutput model? Thanks in advance. Sorry for my bad English.

    • @KunaalNaik
      @KunaalNaik  3 years ago

      Out of all the model scores, you get the score of the best model. Within the random forest's trees, you get the average of the scores.

  • @seelaswain8954
    @seelaswain8954 3 years ago +1

    Why am I getting the following error?
    AttributeError: 'tuple' object has no attribute 'best_params_'

    • @KunaalNaik
      @KunaalNaik  3 years ago

      Can you please share the code you ran? Through GitHub or Kaggle?

    • @seelaswain8954
      @seelaswain8954 3 years ago +1

      @@KunaalNaik Thanks, Kunal. I fixed it.

    • @KunaalNaik
      @KunaalNaik  3 years ago +1

      @@seelaswain8954 I am glad you got it!
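
      One common cause of that AttributeError, offered here only as a guess since the original code isn't shown: a stray trailing comma after the GridSearchCV(...) call turns the assigned object into a one-element tuple.

      ```python
      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import GridSearchCV

      X, y = make_classification(n_samples=200, random_state=42)

      grid = GridSearchCV(RandomForestClassifier(random_state=42),
                          {"n_estimators": [50, 100]}, cv=3),   # <- trailing comma: grid is now a tuple
      print(type(grid))          # <class 'tuple'>; grid.best_params_ would raise the AttributeError

      grid = GridSearchCV(RandomForestClassifier(random_state=42),
                          {"n_estimators": [50, 100]}, cv=3)    # no trailing comma
      grid.fit(X, y)
      print(grid.best_params_)   # works as expected
      ```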

  • @darmawandwiki-x2j
    @darmawandwiki-x2j 1 year ago

    Can you share the datasets?

  • @srikanthv40
    @srikanthv40 3 years ago

    Why have you done one-hot encoding (OHE) on the data?

  • @manavgora
    @manavgora 9 months ago

    Are Naiks born for ML? 🤣🤣

  • @Kornackifs
    @Kornackifs 11 months ago +1

    You didn't explain the concepts behind the code. I think GPT is better than you.

    • @KunaalNaik
      @KunaalNaik  11 months ago +1

      Yes, it is :) ChatGPT is always better at explaining code.