3 Decision Tree | ID3 Algorithm | Solved Numerical Example by Mahesh Huddar

  • Published 28 Aug 2024
  • 3 Decision Tree - ID3 Algorithm Solved Numerical Example by Mahesh Huddar
    Machine Learning Tutorial - • Machine Learning
    Big Data Analysis Tutorial - • Big Data Analytics
    Data Science and Machine Learning Tutorial - • Machine Learning
    Python Tutorial - • Python Application Pro...
    id3 algorithm decision tree,
    id3 algorithm in machine learning,
    decision tree in ML,
    decision tree solved example,
    decision tree numerical example solved,
    id3 algorithm in data mining,
    id3 algorithm decision tree in data mining,
    id3 algorithm decision tree python,
    id3 algorithm decision tree in machine learning,
    id3 algorithm example,
    id3 algorithm in data mining with an example,
    id3 in data mining,
    decision tree problem,
    decision tree problem in big data analytics,
    decision tree machine learning,
    decision tree,
    decision tree in data mining,
    decision tree analysis,
    decision tree by Mahesh Huddar,

COMMENTS • 53

  • @techfort
    @techfort 3 years ago +70

    Dear Mahesh sir,
    If I'm not wrong, then at 10:30 the Entropy(S) must be 0.7219,
    and the Gain(S,A3) answer would be 0.7219 - 0 - 0 = 0.7219
    (a quick numerical check follows this thread).
    Thank you

    • @smita_r
      @smita_r 2 years ago +4

      Thanks for clearing the doubt here

    • @pratik2617
      @pratik2617 1 year ago +3

      @@smita_r it's "doubt" 🙂🙂

    • @ohiogozaimasu
      @ohiogozaimasu 3 months ago

      Yes, sir did calculate the entropy of the parent as 0.7219; it's just a misprint in the last step.
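
    A quick check of the numbers in this thread: a minimal Python sketch of
    the entropy and information-gain formulas ID3 uses. The [4+, 1-] class
    counts below are an assumption inferred from the 0.7219 figure; they are
    not stated in the thread itself.

        from math import log2

        def entropy(pos, neg):
            """Shannon entropy of a node holding pos/neg class counts."""
            total = pos + neg
            result = 0.0
            for count in (pos, neg):
                if count:  # treat 0 * log2(0) as 0
                    p = count / total
                    result -= p * log2(p)
            return result

        print(entropy(4, 1))  # 0.7219..., the corrected Entropy(S)

        # Gain(S, A) = E(S) minus the size-weighted entropy of each child.
        # If both children are pure (entropy 0), the gain equals E(S) itself:
        gain = entropy(4, 1) - (4/5) * entropy(4, 0) - (1/5) * entropy(0, 1)
        print(gain)  # 0.7219 - 0 - 0 = 0.7219, as the comment says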

  • @padelis_doulis
    @padelis_doulis 3 years ago +92

    You have a mistake in Gain(S,a1) at 3:50: the term should be (5/10) * 0, not (5/10) * 1, since E(S_false) = 0.0 (the result is right, though; it's just a slip). Other than that, nicely explained; I learnt how this works thanks to your vids.
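
    Spelled out (assuming the 5/5 True/False split the comment refers to), the
    gain is a weighted sum in which a pure branch contributes nothing:

        Gain(S, a1) = E(S) - (5/10) * E(S_true) - (5/10) * E(S_false)
                    = E(S) - (5/10) * E(S_true) - (5/10) * 0

    so writing (5/10) * 1 for the False branch is exactly the misprint being
    pointed out.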

  • @VamshiChaithanya
    @VamshiChaithanya 3 years ago +66

    At 9:46, Entropy(S) should be 0.7219, right? So the answer must be 0.7219 - 0.6486 = 0.07294. Kindly correct me if I'm wrong.

  • @ramyasrikanteswara5366
    @ramyasrikanteswara5366 3 years ago +33

    The entropy value of False in a1 is zero, but while calculating the final value you have taken it as 1.

  • @ayushmanbhargabagopalbiswas
    @ayushmanbhargabagopalbiswas 1 year ago +2

    Love you sir, it helped in our semester exam ❤

  • @manishasamal9474
    @manishasamal9474 3 years ago +3

    Very nicely explained 👍👍

  • @emmanuellafakonyui9550
    @emmanuellafakonyui9550 1 year ago +1

    Thank you so much for this video. I really do appreciate it. It's amazing how explicit you made it, and very understandable too. I was able to finish my assignment in a short interval (zero stress).

  • @quangminhtran8562
    @quangminhtran8562 1 year ago

    Thanks sir, it helped me so much in my semester exam.

  • @kushalsharma2018
    @kushalsharma2018 1 year ago +4

    For the guys confused about how to enter log base 2 on a calculator: most calculators can't take log base 2 directly, but you can get it via change of base. For example, log base 2 of 5 is (log 5) / (log 2), where log is base 10 (see the quick check after this thread).

    • @evo-star7850
      @evo-star7850 6 months ago +1

      This is actually quite a good tip.

    • @darketernal5792
      @darketernal5792 5 months ago +1

      damn!! you saved me man 👊
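
    The change-of-base trick above, checked in plain Python (any base works,
    not just base 10):

        from math import log, log2, log10

        x = 5
        print(log2(x))              # 2.3219..., log base 2 directly
        print(log10(x) / log10(2))  # same value via base-10 logs (the calculator trick)
        print(log(x) / log(2))      # same value via natural logs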

  • @johnrooney9972
    @johnrooney9972 3 years ago +4

    Is it okay to leave a2 out of the tree?

  • @self.__osman
    @self.__osman 2 years ago

    God bless you sir

    • @MaheshHuddar
      @MaheshHuddar  2 years ago +2

      Thank you.
      Do like, share and subscribe.

  • @junaidiqbal5018
    @junaidiqbal5018 3 years ago

    Nicely explained, pal.

  • @leonardkiptala9734
    @leonardkiptala9734 2 years ago

    Thank you so much.

    • @MaheshHuddar
      @MaheshHuddar  2 years ago +1

      Welcome.
      Do like, share and subscribe.

  • @lenaleo3322
    @lenaleo3322 1 year ago +1

    Thank you for sharing this video.

    • @MaheshHuddar
      @MaheshHuddar  1 year ago

      Most welcome 😊
      Do like, share and subscribe.

  • @dakshitkatakam
    @dakshitkatakam 1 month ago

    How do you calculate the entropy value?

  • @yourbrother9941
    @yourbrother9941 2 years ago

    If there is equal gain, which one do we have to consider for the next node?

  • @santoshvadisala5324
    @santoshvadisala5324 2 years ago

    Correct me if I am wrong, but I wonder where the numerical data in this dataset is? Every feature is categorical, right?

  • @user-sz2ue6sv2h
    @user-sz2ue6sv2h 10 months ago

    Please make a video on C4.5.

  • @miklospoth1950
    @miklospoth1950 3 years ago

    Instances 1 and 2 are the same, and 4 and 5 are also the same. Shouldn't we merge each pair into a single instance?

  • @thegeethaart
    @thegeethaart 3 years ago

    Sir, please can you also solve last year's Dec 2019/Jan 2020 paper problems?

  • @minalpatil5671
    @minalpatil5671 3 years ago

    Is it OK to leave out a2?

  • @smithtuscano4576
    @smithtuscano4576 1 year ago

    What is log2?

  • @vinayg2133
    @vinayg2133 3 years ago +1

    Sir, can you just say which entropy value must be considered after dividing the table in the above example? Should we consider Entropy(True) or the initial entropy?

    • @padelis_doulis
      @padelis_doulis 3 years ago +4

      After you divide the table, you only care about the left part of the tree (True), since the right part doesn't need any more questions asked. So you consider Entropy(True), computed over instances 1, 2, 6, 7 and 8; your S there should be the same as Entropy(True). Then you check a2 and a3 as the video shows.
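
      The reply above describes exactly how ID3 recurses: treat the subset
      that reached a branch as the new S, recompute entropy over it alone, and
      pick the next attribute by gain on that subset. A minimal Python sketch
      of that recursion (function and variable names are illustrative, not
      taken from the video):

          from collections import Counter
          from math import log2

          def entropy(labels):
              """Shannon entropy of a list of class labels."""
              n = len(labels)
              return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

          def id3(rows, labels, attrs):
              """rows: list of dicts (attribute -> value); labels: one class per row."""
              if len(set(labels)) == 1:      # pure node: no more questions needed
                  return labels[0]
              if not attrs:                  # attributes exhausted: majority class
                  return Counter(labels).most_common(1)[0][0]

              def gain(a):
                  # parent entropy minus size-weighted child entropies,
                  # computed only over the rows that reached this node
                  g = entropy(labels)
                  for v in {r[a] for r in rows}:
                      idx = [i for i, r in enumerate(rows) if r[a] == v]
                      g -= len(idx) / len(rows) * entropy([labels[i] for i in idx])
                  return g

              best = max(attrs, key=gain)
              tree = {}
              for v in {r[best] for r in rows}:
                  idx = [i for i, r in enumerate(rows) if r[best] == v]
                  # recurse on this branch's subset, with the chosen attribute removed
                  tree[v] = id3([rows[i] for i in idx],
                                [labels[i] for i in idx],
                                [a for a in attrs if a != best])
              return {best: tree}

      Called as id3(rows, labels, ["a1", "a2", "a3"]), it returns a nested
      dict such as {"a1": {"True": {...}, "False": "No"}}; the a1/a2/a3 names
      follow the attribute names used in these comments.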

  • @alimuddinkhan25
    @alimuddinkhan25 3 years ago

    Finally, for a1, sir, you have put False = 1; this is wrong.
    Please correct it, sir: False = 0.

  • @sravanr2275
    @sravanr2275 1 year ago +1

    Today I got exactly this question in the exam, but I had only watched that 23-minute video yesterday and today, that's it 😕🫤🙁☹️😪😭😪🙂🙃🥲🫠🥹🫥😑😐😬😶😶‍🌫️🤐🥺😔🫡😞😓😟😥😢😒🙄😮‍💨😤

  • @hawkitnow8135
    @hawkitnow8135 2 years ago

    Thanks

    • @MaheshHuddar
      @MaheshHuddar  2 years ago

      Welcome.
      Do like, share and subscribe.

    • @hawkitnow8135
      @hawkitnow8135 2 years ago

      @@MaheshHuddar I want some examples on perceptrons and margins; I can't find any. Would you help me, please?