Learning Decision Tree

  • Published 24 Oct 2024

COMMENTS • 24

  • @mandarmalekar6640 3 years ago +6

    This is probably one of the best tutorials I have found on the net. Thanks a lot, Madam; continue the good work.

  • @ankurchanda1112 5 months ago

    This is probably the best decision tree explanation I've come across. Thank you, madam.

  • @theacademician_cse 4 years ago +3

    Respected Madam, I am learning a lot from your videos. Madam, if P+ = 1 and P- = 0, then Entropy(S) = NaN, since 0*log2(0) is NaN (not a number), not zero. It is asymptotic. Thank you.

    • @sudiptoghosh5740 4 years ago +4

      No. We take the limit of x·log2(x) as x → 0⁺, which can be evaluated by L'Hôpital's rule and equals 0. It isn't the value of the function; it's the limit we consider as x → 0 (written out below).
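
      For reference, that limit is a standard L'Hôpital computation (worked out here for the thread, not shown in the video):

          \lim_{x \to 0^+} x \log_2 x
            = \lim_{x \to 0^+} \frac{\log_2 x}{1/x}
            = \lim_{x \to 0^+} \frac{1/(x \ln 2)}{-1/x^2}
            = \lim_{x \to 0^+} \left( -\frac{x}{\ln 2} \right)
            = 0

      So, under the convention 0·log2(0) := 0, we get Entropy(S) = -1·log2(1) - 0·log2(0) = 0 when P+ = 1 and P- = 0.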

  • @AbdulhakeemEideh 3 months ago

    Great Professor!!!

  • @VISHNUGUPTA-f1i 8 days ago

    Entropy is a term taken from thermodynamics...

  • @getfitwithakhil 6 years ago +2

    Greetings Dr. Sudeshna Sarkar,
    At 21:25, why is Entropy[29+, 35-] calculated as -29/64 log2(29/64) ...? Why divide by 64 when the formula does not divide by the total number of samples?

    • @nabanitapaul7581 6 years ago +6

      In the entropy formula, p+ is the probability of a positive sample, i.e., (number of positive samples) / (total samples).
      Similarly, p- is the probability of a negative sample (a worked example follows this thread).

    • @getfitwithakhil 6 years ago +3

      Ohh gotcha. Thank you very much.
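
      As a concrete check, here is a minimal Python sketch of that calculation (the function name `entropy` and the p > 0 guard are illustrative choices, not the lecture's code):

          from math import log2

          def entropy(pos, neg):
              # Binary entropy: Entropy(S) = -p+*log2(p+) - p-*log2(p-),
              # with p+ = pos/total and p- = neg/total.
              total = pos + neg
              h = 0.0
              for count in (pos, neg):
                  p = count / total
                  if p > 0:  # convention: 0*log2(0) = 0 (the limit discussed above)
                      h -= p * log2(p)
              return h

          print(entropy(29, 35))  # ≈ 0.9937: close to 1, since 29/64 vs 35/64 is a nearly even split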

  • @shivamtiwari8106 1 month ago

    Nice lecture.

  • @saurabhshukla3080 6 years ago

    Nice series of tutorials.

  • @quagzlor 6 years ago +2

    Thanks ma'am, I have an exam tomorrow and this really helped.

  • @zinalpatel8962 6 years ago +2

    In the topic 'when to stop', I can't understand the 3rd reason.

    • @konakoteswararao7892 3 years ago

      That means there is only one positive and one negative point left... from that, you can choose any one of the features.

    • @digvijaymahamuni7722 3 years ago

      The 3 reasons are: 1) completely dominant, 2) partially dominant, 3) when we run out of attributes.

    • @rahulpramanick2001 1 year ago

      @zinalpatel8962
      In my interpretation, the 3rd point means that when only a few examples fall under a split, they may be outliers or noisy examples. So, to avoid overfitting the decision tree, we should avoid that split (see the sketch after this thread).
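
      Pulling the thread's readings together, here is a minimal Python sketch of typical stopping rules in an ID3-style tree builder (the names should_stop, examples, attributes, and the MIN_EXAMPLES threshold are assumptions for illustration, not the lecture's code):

          MIN_EXAMPLES = 5  # assumed threshold for the "too few examples" rule

          def should_stop(examples, attributes):
              # examples: list of (features, label) pairs; attributes: unused feature names.
              labels = {label for _, label in examples}
              if len(labels) == 1:
                  return True  # 1) node is pure: one class completely dominates
              if not attributes:
                  return True  # 2) we have run out of attributes to split on
              if len(examples) < MIN_EXAMPLES:
                  return True  # 3) too few examples: likely noise/outliers, so
                               #    splitting further would overfit the tree
              return False

          print(should_stop([({}, "+"), ({}, "+")], ["outlook"]))  # True: node is pure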

  • @amarsomani6596 5 years ago +1

    Where can I get the content or PPT?

    • @mctfellow 5 years ago +1

      Just enroll in the course on NPTEL
      (Introduction to Machine Learning)

  • @Sandoverwater 6 years ago

    Where are the slides?

  • @vickythechamp 5 years ago

    Thank you.

  • @tapanjeetroy8266 6 years ago

    Thank you, ma'am.

  • @Uma7473 5 years ago

    Thank you, ma'am.

  • @anumolukumar585 5 years ago +1

    not good