Machine learning - Decision trees

  • Published 31 Oct 2024

COMMENTS •

  • @prajwalshenoy9117
    @prajwalshenoy9117 5 years ago +1

    Tremendous explanation! This is what courses should focus on. Instead of just giving surface-level details and then importing packages and implementing things for the viewer's satisfaction, it is more fruitful to start from scratch, dig into the mathematics and intuition behind it, and appreciate the concept.

  • @Technoslerphile
    @Technoslerphile 10 years ago +11

    Excellent! This is how a teacher should teach.

  • @michaelturniansky7959
    @michaelturniansky7959 10 years ago +1

    Thank you very much for this and the following session's lecture. I got my CS degree 25 years ago, and it's nice to learn about things like how to automatically decide which questions to ask first.

  • @nitinat3590
    @nitinat3590 9 years ago +1

    Superb lecture. Thank you very much for sharing it. I was struggling with the subject before watching this video, but now I am quite comfortable and I think I'll be able to manage using decision trees in my project. Thank you again :)

  • @newbie8051
    @newbie8051 10 months ago

    It amazes me that people were discussing these topics when I was still studying the water cycle, lol.

  • @SahibzadaIrfanUllahNaqshbandi
    @SahibzadaIrfanUllahNaqshbandi 8 years ago

    Thank you very much, it really helped, sir. And one thing I want to say: you have a sweet voice.

  • @GatoNordico
    @GatoNordico 11 years ago

    Nice lecture! I came here for decision trees, but I think I'll have a look at your other videos as well.

  • @thungp
    @thungp 8 years ago +5

    When I did the calculation for I(Patrons) at roughly 46:36 for the number of bits of information, I got 0.541 (not 0.0541 as in his slide deck). Also, I had to find from a different reference that when you have a log(0), which is normally undefined, they assume it is 0.

    • @ashimgupta9538
      @ashimgupta9538 8 years ago +3

      I think they don't assume log(0) to be zero but 0*log(0) to be zero.

    • @ayushrastogi6089
      @ayushrastogi6089 6 years ago

      Yes, it is 0*log(0), but also all log calculations are with base 2.
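
The two conventions discussed in this thread — treating 0*log2(0) as 0, and using base-2 logs so entropy comes out in bits — can be sketched in a few lines of Python (the function name here is illustrative, not from the lecture):

```python
import math

def entropy_bits(counts):
    """Entropy in bits of a node with the given class counts.
    Zero-count terms are skipped, implementing the 0*log2(0) = 0 convention."""
    total = sum(counts)
    # -p*log2(p) is rewritten as p*log2(1/p) to keep everything positive
    return sum((c / total) * math.log2(total / c) for c in counts if c > 0)

print(entropy_bits([4, 0]))  # pure node -> 0.0
print(entropy_bits([3, 3]))  # 50/50 node -> 1.0 bit
```

A pure node contributes zero entropy precisely because its 0*log2(0) term is dropped, which is why the convention matters in the lecture's calculation.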

  • @snehotoshbanerjee1938
    @snehotoshbanerjee1938 9 years ago

    Best lecture on decision trees. Which measure is the best - entropy or Gini?
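
On the entropy-vs-Gini question above: both are impurity measures that are zero for a pure node and maximal at a 50/50 split, and in practice they usually pick very similar splits (CART traditionally uses Gini, while ID3/C4.5 use entropy). A small numeric comparison, with illustrative function names:

```python
import math

def entropy(p):
    """Binary entropy in bits for class probability p (0*log2(0) taken as 0)."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def gini(p):
    """Gini impurity for a binary node: 1 - p^2 - (1 - p)^2."""
    return 1.0 - p * p - (1.0 - p) ** 2

for p in (0.0, 0.1, 0.25, 0.5):
    print(f"p={p:.2f}  entropy={entropy(p):.3f}  gini={gini(p):.3f}")
```

Both curves peak at p = 0.5 (entropy 1 bit, Gini 0.5), which is why the two criteria rarely disagree much about which split is best.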

  • @oreoluwa24
    @oreoluwa24 11 years ago

    Your lectures are very explanatory; even as an undergrad I understood them. Thanks! I was wondering if you covered multivariate decision trees in any of your lectures.

  • @alhoqani2750
    @alhoqani2750 9 years ago +2

    Great lecture. I have a question: is there any session on building a decision tree manually?

  • @zxxNikoxxz
    @zxxNikoxxz 9 years ago +13

    I suppose this is how Akinator guesses who you are thinking of.

  • @kevinsluder3711
    @kevinsluder3711 8 years ago

    Excellent! Can subsequent levels in the tree use the same attribute for the decision at a node? For instance, in the 4-color, 2-dimension example, if the root-level split is based on x_i, can the next-level node use a rule based on x_i (obviously with a different split)?

  • @yuanyuan3056
    @yuanyuan3056 7 years ago

    The clearest ML course I have taken.

  • @chandrabhatt
    @chandrabhatt 10 years ago

    Great lecture. Crystal clear!

  • @TheHarperad
    @TheHarperad 11 years ago +3

    "To understand what a forest is we first need to understand the tree" :D

  • @ZestyCrunchy
    @ZestyCrunchy 11 years ago +1

    Over 200kg? That's a whale! Awesome lecture by the way :)

  • @mohammadkamruddin6399
    @mohammadkamruddin6399 9 years ago

    Good lecture on decision trees. Can you please share the Antonio Criminisi technical report link here?
    Thank you.

    • @saadorj
      @saadorj 9 years ago +2

      +Mohammad Kamruddin Google this:
      "Decision Forests for Classification, Regression, Density Estimation, Manifold Learning and Semi-Supervised Learning"

  • @0S0L0
    @0S0L0 11 years ago

    Hey Ore! Did you find any lecture on multivariate decision trees?

  • @ayushrastogi6089
    @ayushrastogi6089 6 years ago

    Can you provide the link for the report by Antonio Criminisi that you referred to at 52:50?

  • @GoyalMrManish
    @GoyalMrManish 8 years ago

    Nice explanation of decision trees :)

  • @rahulchandra759
    @rahulchandra759 7 years ago

    Does anyone know where the data file is available, or do we just type it in from the slide the professor has?

  • @sonilshrivastava1428
    @sonilshrivastava1428 9 years ago

    Nice lecture. Thank you very much, sir. Can anybody share a link to the referenced 'Criminisi et al., 2011' paper?

  • @ayushrastogi6089
    @ayushrastogi6089 6 years ago

    Are all log calculations for entropy done with base 2?

  • @whiteshadow3000
    @whiteshadow3000 8 years ago

    22:08 square yards?
    Awesome lectures by this teacher, btw.

  • @CvDb-mt4dl
    @CvDb-mt4dl 9 years ago

    Thank you so much!!!

  • @olegstolyar6127
    @olegstolyar6127 10 years ago

    Thank you.

  • @TheHarperad
    @TheHarperad 11 years ago +2

    "If you go to the left, you are 100% red"

  • @peterv.276
    @peterv.276 8 years ago

    Is it allowed to share the video on, e.g., social platforms?

    • @chuckiechuckster349
      @chuckiechuckster349 8 years ago

      The video is on UA-cam. As long as only an HTTP reference (URL) is used, then yes, of course.

  • @jobsamuel
    @jobsamuel 9 years ago

    Could you help me with the calculations at 48:23? I haven't figured out why I(Patrons) is equal to 0.541 bits :(

    • @woowooNeedsFaith
      @woowooNeedsFaith 9 years ago

      Jobsamuel Núñez Remember to use logarithm base 2. Most calculators use natural logarithm by default.

    • @tobiaspahlberg1506
      @tobiaspahlberg1506 9 years ago

      +Jobsamuel Núñez Only the last term within the brackets contributes, because 0*log2(0) = 0 (by convention) and 1*log2(1) = 0. The expression simplifies to 1 - [6/12 * (-2/6*log2(2/6) - 4/6*log2(4/6))] = 0.5409....

    • @zwep
      @zwep 8 years ago

      +Tobias Pahlberg Exactly, so that means there is still a typo in the lecture, right? Since he states 0.0541.
      Edit: whoops, never mind.

    • @tobiaspahlberg1506
      @tobiaspahlberg1506 8 years ago

      zwep Yes, but I think someone in the audience pointed that out later.
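
For anyone still stuck on this thread's calculation: the numbers below assume the standard Russell & Norvig restaurant example, where the Patrons attribute splits the 12 samples into None (0 yes / 2 no), Some (4 yes / 0 no), and Full (2 yes / 4 no). A sketch of the information-gain computation (function name illustrative):

```python
import math

def entropy_bits(counts):
    # Entropy in bits; zero counts are skipped (the 0*log2(0) = 0 convention).
    total = sum(counts)
    return sum((c / total) * math.log2(total / c) for c in counts if c > 0)

# (yes, no) counts in each branch of the Patrons split
branches = [(0, 2), (4, 0), (2, 4)]
total = sum(y + n for y, n in branches)  # 12 samples, 6 yes / 6 no at the root

# Expected entropy remaining after the split, weighted by branch size
remainder = sum((y + n) / total * entropy_bits([y, n]) for y, n in branches)
gain = entropy_bits([6, 6]) - remainder
print(f"Gain(Patrons) = {gain:.4f} bits")  # ~0.541, matching the thread (not 0.0541)
```

Only the Full branch contributes to the remainder, since the None and Some branches are pure, which is exactly the simplification Tobias describes above.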

  • @余跃-h5s
    @余跃-h5s 6 years ago

    Excellent

  • @mercurichinc
    @mercurichinc 10 years ago

    Great sharing, thank you.

  • @deepakk1944
    @deepakk1944 6 years ago

    Thanks

  • @KrishnaDN
    @KrishnaDN 8 years ago

    Fantastic! :)

  • @Rokel1993
    @Rokel1993 7 years ago

    This is excellent, but I want to learn the M5 model tree. Can anyone help me learn it, or give me a link?

  • @funkyweezy8071
    @funkyweezy8071 9 years ago

    Patron is pronounced "pay-tren" :)

  • @ulmermanfred4
    @ulmermanfred4 6 years ago

    Did I hear a Freudian slip? Around 22:55 he said 'in a Greece... a greedy fashion'. :-) Greece is not greedy, but the media make us believe it is?

  • @lynnwilliam
    @lynnwilliam 8 years ago

    It's hard to make money in AI. No restaurant or builder can afford to hire someone to do AI.
    Only a small fraction of AI developers get a job; sadly, AI is not really used everywhere.

  • @ThePunisher005
    @ThePunisher005 5 years ago

    Such a boring lecturer; I would drop the course if he were teaching it.