Decision Tree Hyperparameters: max_depth, min_samples_split, min_samples_leaf, max_features

  • Published 29 Sep 2024
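
A minimal sketch of where the four hyperparameters in the title appear, assuming scikit-learn's DecisionTreeClassifier (the dataset and values below are illustrative, not taken from the video):

```python
# Illustrative only: the four hyperparameters from the title, set on
# scikit-learn's DecisionTreeClassifier and fit on a built-in toy dataset.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)  # 30 features
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

clf = DecisionTreeClassifier(
    max_depth=5,           # pre-prune: no path deeper than 5 splits
    min_samples_split=20,  # a node needs at least 20 samples to be split
    min_samples_leaf=10,   # every leaf must retain at least 10 samples
    max_features=10,       # consider a random subset of 10 features per split
    random_state=42,
)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```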

COMMENTS • 28

  • @Howtosolveprobem · 4 years ago +16

    Nice video, but it would have been more interesting if it were explained pictorially.

  • @jamalnuman · 8 months ago

    Very useful, but it would be better if you provided more figures and images to explain what each parameter means.

  • @anish_k · 4 years ago +2

    I'm badly stuck here; please explain with a practical implementation.

  • @jamalnuman · 8 months ago

    Do you have a similar presentation for the hyperparameters of random forest and XGBoost?

  • @bhargav7476 · 4 years ago +1

    Does a highly skewed feature affect the AUC of a decision tree classifier, and do I have to remove outliers? I have a feature where 80% of the values are 0 and the maximum is 13. I have tried log and sqrt transformations, but it's still highly skewed.

    • @texasfossilguy · 2 years ago +1

      Try binning, or reduce the skew by clipping outliers with the IQR rule.
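
What that reply's suggestions might look like in code, as a rough sketch (pandas and NumPy assumed; the skewed feature is synthetic, not the commenter's data). Worth noting: decision trees split on thresholds, so strictly monotone transforms like log/sqrt generally do not change the tree at all, whereas clipping and binning do change the candidate splits:

```python
# Rough sketch of the reply's two suggestions on a synthetic skewed feature.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
x = pd.Series(rng.exponential(scale=2.0, size=1_000))  # right-skewed toy feature

# Option 1: cap outliers with the 1.5 * IQR rule.
q1, q3 = x.quantile(0.25), x.quantile(0.75)
iqr = q3 - q1
x_clipped = x.clip(lower=q1 - 1.5 * iqr, upper=q3 + 1.5 * iqr)

# Option 2: discretize into quartile bins (labels 0-3).
x_binned = pd.qcut(x, q=4, labels=False)
```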

  • @ajinkyakoshti2411 · 4 years ago +1

    Do a similar video with LightGBM instead of decision trees, where more hyperparameters come into the picture.

  • @yasokavi · 4 years ago +2

    It's really helpful!

  • @suhailchougle7315 · 4 years ago +1

    Thank you so much for the video; I really appreciate it.

  • @nopenope5949 · 2 years ago

    What aspects should be considered to know which hyperparameter values we should use in our decision tree classifier?
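
One common answer to this question in practice is to let cross-validation choose the values rather than guessing; a minimal sketch, assuming scikit-learn's GridSearchCV and a toy dataset:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Candidate values are illustrative; tailor the grid to your data.
param_grid = {
    "max_depth": [3, 5, 10, None],
    "min_samples_split": [2, 10, 20],
    "min_samples_leaf": [1, 5, 10],
}
search = GridSearchCV(DecisionTreeClassifier(random_state=0), param_grid, cv=5)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```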

  • @alipaloda9571 · 4 years ago

    Love the information you shared, sir.
    Can you please make a video on pre-pruning and post-pruning in random forests?
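
The video does not cover pruning, but as a hedged sketch of the distinction in scikit-learn: pre-pruning constrains trees while they grow, while post-pruning grows them fully and then cuts back via cost-complexity pruning (ccp_alpha); both knobs exist on RandomForestClassifier:

```python
from sklearn.ensemble import RandomForestClassifier

# Pre-pruning: stop each tree from growing past these limits.
pre_pruned = RandomForestClassifier(
    n_estimators=100, max_depth=6, min_samples_leaf=5, random_state=0
)

# Post-pruning: let trees grow fully, then prune subtrees back.
post_pruned = RandomForestClassifier(
    n_estimators=100, ccp_alpha=0.01, random_state=0  # larger alpha prunes more
)
```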

  • @yscosta · 3 years ago

    Hello!
    Do you have any advice about hyperparameters to use in a regression random forest?
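
Not advice from the video, but a sketch of the hyperparameters most commonly tuned on scikit-learn's RandomForestRegressor, with illustrative starting values:

```python
from sklearn.ensemble import RandomForestRegressor

reg = RandomForestRegressor(
    n_estimators=300,     # more trees: more stable predictions, slower to fit
    max_depth=None,       # grow deep; rely on leaf-size limits instead
    min_samples_leaf=5,   # larger leaves smooth the regression output
    max_features=0.33,    # fraction of features tried at each split
    random_state=0,
)
```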

  • @Ankitsharma-vo6sh · 3 years ago

    Can you show it on a dataset?

  • @kkamit0106 · 2 years ago

    Can you tell us about CP values in decision trees?
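
For context: "CP" usually refers to the complexity parameter of cost-complexity pruning (called cp in R's rpart; scikit-learn exposes it as ccp_alpha). A minimal sketch of inspecting and using candidate values, assuming scikit-learn:

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Effective alphas at which a fully grown tree would lose a subtree.
path = DecisionTreeClassifier(random_state=0).cost_complexity_pruning_path(X, y)
print(path.ccp_alphas)

# Refit with one candidate alpha to get a smaller, pruned tree.
pruned = DecisionTreeClassifier(ccp_alpha=path.ccp_alphas[-2], random_state=0)
pruned.fit(X, y)
print("leaves after pruning:", pruned.get_n_leaves())
```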

  • @shahrukhahmad4127 · 1 year ago

    Insightful video... thank you!

  • @MrYURIBP · 2 years ago

    Thanks, bro... greetings 🇨🇱

  • @murilopalomosebilla2999 · 3 years ago

    Nice explanation!

  • @RajaSekharaReddyKaluri · 4 years ago

    Nice one!!

  • @trinkesh8423 · 4 years ago

    Really helpful!

  • @kurrucharan9376 · 4 years ago

    Hey Bhavesh! When max_features=10, what is the difference between selecting the best attribute/feature from those 10 versus from all 50 (the total number of features)? In both cases the feature is selected based on information gain or Gini gain.

    • @venkatvicky570 · 3 years ago +1

      When the best feature is selected from all 50 features, the top feature overall tends to be chosen to split the internal nodes every time a split is needed in the subtrees. Whereas with max_features=10, the tree randomly chooses 10 features for each split, and the best feature out of those randomly selected 10 is used to split the internal node.
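
A small sketch of the behaviour described in this reply (illustrative, assuming scikit-learn): with max_features=None every node considers all features, while max_features=10 draws a fresh random subset of 10 candidates at each node, so different nodes can split on different, sometimes weaker, features:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)  # 30 features stand in for the "50"

all_feats = DecisionTreeClassifier(max_features=None, random_state=0).fit(X, y)
subset_10 = DecisionTreeClassifier(max_features=10, random_state=0).fit(X, y)

# tree_.feature holds the feature index used at each node (root is node 0);
# the two trees may pick different features at the root and below.
print("root split, all features   :", all_feats.tree_.feature[0])
print("root split, max_features=10:", subset_10.tree_.feature[0])
```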