Decision Tree 1: how it works

COMMENTS • 189

  • @PolinaInsane · 8 years ago · +66

    The best example of a decision-tree generation algorithm I've seen so far, thank you

  • @kroyedapper · 4 years ago · +1

    You are good. Six years after production, and I am most grateful for this gift

  • @dissdad8744 · 8 years ago · +17

    Great didactics. You explained it in a really comprehensible way, without missing the point, something that very few lecturers can actually do.

  • @shashidharthalakatti7363 · 6 years ago · +1

    Respected Sir,
    I work as an Assistant Professor of Computer Science and Engineering at an engineering college. For a year I was unable to understand this concept by reading any number of books, but your 9.25-minute video has made this ID3 decision tree concept very comfortable to understand.
    So once again I am very grateful to you for sharing your knowledge and videos free of cost, sir. Hats off to you and your university, sir.

  • @andrascser7235 · 10 years ago

    By far the best and easiest to understand description of decision tree based machine learning on the internet. Thank you Victor for sharing!

    • @vlavrenko · 10 years ago

      Thanks! Happy you found it helpful.

  • @marthagrey6054 · 4 years ago

    This playlist is by far the best (and very thorough) explanation of decision trees I've come across. Thank you so much for posting this!

  • @raghavvohra8867 · 6 years ago

    This was not only the best example but also had the best explanation, thank u sir!

  • @vivekvikramsingh7685 · 9 years ago

    This lecture simply follows Occam's razor.
    The lecture is simple, and that is why it is so good.
    It simply tells the concepts.
    You helped me.

  • @msdrebs · 6 years ago · +1

    This was so so SO helpful. Simple and clear to understand. THANK YOU!

  • @sarashakibi5429 · 7 years ago

    Professor Lavrenko,
    I deeply appreciate your great explanation and simple example.
    Thank you

  • @Alex56381 · 4 years ago

    So much better explained than in my university books, thanks a lot mate!!!

  • @Josehish · 5 years ago · +2

    It's 2019, and of all the videos I have watched on this topic, this clip still gets it done the best way. Thank you.

  • @choronator · 8 years ago · +4

    SO BLOODY SIMPLE WHEN YOU EXPLAIN IT!
    cheers

  • @devdasmalekar2726 · 9 years ago

    Thank you very much. I read a few papers, and the more I read, the more confusing it got. You made it crystal clear with a real-life example explained in a very effective way. Thanks again.

  • @yoloop93 · 9 years ago

    You just explained what my assistant couldn't in just 10 min! Thanks a lot!

  • @nijatmursali9943 · 5 years ago

    You made Decision Trees look so easy to solve. Thank you so much, sir.

  • @adesewaadegoke739 · 9 years ago

    Wow, your explanations are so helpful. I have just gone a few minutes into your lecture and I'm excited! I know I will be here for a while and will learn everything I can. Thanks again.

  • @hichamtribak1944 · 7 years ago

    The best and most helpful professor I've ever seen. Thanks a lot, Prof. @Victor Lavrenko

  • @basmairtahi3225 · 5 years ago

    That's such an easy way to understand it, even easier than our Dr., from whom I can hardly understand anything! Many thanks

  • @ekipapropast6001 · 8 years ago

    It is so easy to understand just with one great example. Great job!

  • @sadihassan8407 · 8 years ago

    I am learning from your classes very easily! Thank you very much dude!

  • @KashyapMaheshwari · 6 years ago · +4

    Thank you for making this concept so simple. Truly amazing :)

  • @prometeo34 · 9 years ago · +4

    Excellent explanation, Victor, and a very appropriate example...

  • @NguyenSon-tm1lu · 8 years ago

    Thanks, Victor. You made perfect sense of the decision tree concept

  • @SayakBanerjee · 7 years ago

    Thanks a lot, Sir. I wanted to learn decision trees, and this is just what I wanted. Your way of teaching is superb!

  • @pasha7293 · 5 years ago

    Thank you Victor for such a great example and explanation

  • @mohsenmazandarani7506 · 5 years ago · +1

    Clear and beautiful explanation.....

  • @bylandab · 9 years ago · +18

    Great lecture; however, your volume is very, very low. Thanks!

  • @Bulgogi_Haxen · 1 year ago

    Studying at TUM. I admire the German students who can follow the lecture content from the uni.
    I'm taking an ML course at the moment, but here the lectures just dump the concepts on you, regardless of whether students can understand them or not...
    Such nice explanations in every video of the ML-related playlists.
    I fcking regret that I did not choose the UK to study my master's.

  • @jibrilqarni · 8 years ago · +5

    Wait, the new data is Rain, High, and Weak; isn't the class YES? Because D4 says the same thing too

  • @mateeshbhave7988 · 8 years ago · +7

    What if one of the nodes never forms a pure subset? How do we make decisions then?
    Also, what if there is not a single pure subset in the decision tree?

    • @baldeaguirre · 5 years ago

      Maybe you should check another algorithm..

  • @DrStrangeLove2050 · 7 years ago · +12

    Why did you decide to split it on "outlook"??
    Why not start splitting them on "Humidity" or "Wind"??

    • @ehtishamali6027 · 7 years ago · +1

      We decide by measuring the purity of the split. Purity can be measured by calculating entropy. Low entropy means low uncertainty, which leads to high purity: an entropy of 0 means highest purity, and (for a two-class problem) an entropy of 1 means lowest purity. So we decide which attribute to split on by calculating entropy (see the sketch after this thread).

    • @Exadeful · 7 years ago

      But the entropy for outlook is 1.578, while it is 1 for humidity, and 0.985 for wind. Wouldn't a split on outlook be the worst one to pick?

    • @benjamingarrard9821 · 6 years ago · +3

      Entropy can't be higher than 1, so I think your math is off. An entropy of 1 would mean that the data is half true and half false. If it were 0, that would mean the entire set or subset is the same, all true or all false; it would be pure.

    • @Alexandr-fv8me · 6 years ago

      In the provided example, the variable Outlook has three possible values: {sunny, overcast, rain}. In the simple case where each of those values has Pr = 1/3, the entropy of that distribution is log2(3) ≈ 1.585 > 1. So yes, it can, even though I didn't check his math.

    • @anjaybruss · 6 years ago

      @WAI HO YAU You can find information here: www.saedsayad.com/decision_tree.htm :)
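
To make the numbers in this thread concrete, here is a minimal Python sketch of the entropy and information-gain calculation, assuming the play-tennis counts shown in the video (9 yes / 5 no overall; outlook splits them 2/3, 4/0, 3/2; humidity 3/4 and 6/1; wind 6/2 and 3/3). The entropy and information_gain helpers are illustrative, not from the lecture:

    from math import log2

    def entropy(counts):
        """Shannon entropy of a class distribution given as a list of counts."""
        total = sum(counts)
        return -sum(c / total * log2(c / total) for c in counts if c > 0)

    def information_gain(parent, subsets):
        """Entropy reduction from splitting the parent set into the given subsets."""
        total = sum(parent)
        remainder = sum(sum(s) / total * entropy(s) for s in subsets)
        return entropy(parent) - remainder

    print(entropy([9, 5]))                                     # ~0.940: class entropy before any split
    print(entropy([1, 1, 1]))                                  # log2(3) ~ 1.585: entropy CAN exceed 1 for 3+ values
    print(information_gain([9, 5], [[2, 3], [4, 0], [3, 2]]))  # outlook:  ~0.247
    print(information_gain([9, 5], [[3, 4], [6, 1]]))          # humidity: ~0.151
    print(information_gain([9, 5], [[6, 2], [3, 3]]))          # wind:     ~0.048

The attribute is chosen by the gain in class purity, not by the entropy of the attribute's own value distribution; that is why outlook wins here even though its three-way value distribution has entropy above 1.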

  • @hirakmondal6174 · 5 years ago · +1

    What if the humidity had been high on a sunny day on which John played?
    or
    What if the wind had been weak on a rainy day on which John did not play?

  • @divinitytarot6 · 5 years ago · +1

    THANKS VICTOR, REALLY APPRECIATED

  • @mattmiller229 · 6 years ago

    I love this explanation!!
    I have just one comment:
    At min 8:00
    "... and in all other cases John will not play..."
    This statement is wrong, because for attribute values that were never recorded, you can make no decision about whether John will play or not :-)
    But the statement is true if we assume that the dataset is correct and complete and is the only basis for all decisions.
    I just want to make clear that if you look at attributes that are not available, for example whether John plays when he is sick or not, you can make no well-founded decision.

  • @نوافرشيد-م8د · 9 years ago

    Thank you, that was a great explanation.
    D4 is similar to D15, which makes it easy to predict that John will play on D15 (a traversal of the finished tree is sketched below).
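
For readers who want to trace that prediction, here is a minimal Python sketch of walking the tree learned in the video (outlook at the root, sunny days split on humidity, rainy days on wind). The nested-dict encoding and the predict helper are illustrative, not from the lecture:

    # The finished tree from the video, written as nested dicts (hypothetical encoding).
    tree = {
        "outlook": {
            "sunny":    {"humidity": {"high": "no", "normal": "yes"}},
            "overcast": "yes",
            "rain":     {"wind": {"weak": "yes", "strong": "no"}},
        }
    }

    def predict(node, example):
        """Follow the branch matching the example's value at each internal node."""
        while isinstance(node, dict):
            attribute = next(iter(node))            # the attribute this node tests
            node = node[attribute][example[attribute]]
        return node                                 # a leaf holds the predicted class

    # D15: rain, high humidity, weak wind -> prints "yes", just like D4.
    print(predict(tree, {"outlook": "rain", "humidity": "high", "wind": "weak"}))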

  • @mohammedalshen3147 · 5 years ago · +1

    Hello, thank you for the simple explanation. However, I have a doubt from (6:20): when we consider the candidates for sub-nodes of "rain", should we consider both "Humidity" and "Wind", or just "Wind"? Please explain

  • @ramshanmuga · 9 years ago

    Thanks for a great lecture. Very clear, entertaining and informative!

  • @sanjaykrish8719 · 6 years ago

    Excellent Victor. You are great.

  • @shakesbeer5171 · 5 years ago

    Amazing introduction. Thanks a lot for publishing!

  • @PS3lovrsaywat · 7 years ago · +11

    What happens if a pure subset is not possible from the given data?

    • @imreallygoodatlife · 7 years ago

      I was hoping to hear the answer to this for the whole video...

    • @tomascerkauskas6957 · 7 years ago · +10

      I guess if you get down to a subset of one item, it will be pure.

    • @SAAZANRAI · 6 years ago

      Any update?

    • @SebastianMantey · 6 years ago

      I recently did a video on decision trees, and I think I address your question at 11:35 in this video: ua-cam.com/video/WlGuizdVaiY/v-deo.html
      Also, I created a video series on how to code a decision tree from scratch in Python. Maybe that's also of interest to you. Anyway, here is the link: ua-cam.com/play/PLPOTBrypY74xS3WD0G_uzqPjCQfU6IRK-.html

    • @KnightParagon · 6 years ago · +1

      A pure subset will always be possible if you have enough branches in your tree. As someone already commented, a subset of one item is pure. But a decision tree that has too many leaves is not a good model. If you limited the example tree in the video to a depth of 0 (in other words, nothing deeper than the root node), then you would split a new data entry based on the weather, and then stop and decide whether John will play, yes or no. If it's overcast, the tree will decide he played; if it's sunny, the tree will decide he didn't play; and if it rained, the tree will decide that he did play. But remember how the lecturer said that the tree keeps track of the count of each class at each node? The amount of confidence the tree has in each of those decisions won't be the same. Sunny days have 2 Yeses and 3 Nos, right? So the most likely class, if you don't split the data again, is NO, he won't play, but the confidence will be low because it's a very impure subset. And vice versa for rain (3 Yeses, 2 Nos). Kinda funny that the tree works out this way, since you would think one would be more likely to play on a sunny day and less likely on a rainy day lol (a sketch of this majority-vote idea follows below)
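
A minimal Python sketch of that majority-vote idea at an impure leaf (the leaf_prediction helper is illustrative, not from the lecture):

    from collections import Counter

    def leaf_prediction(labels):
        """Majority class at an impure leaf, plus its empirical confidence."""
        counts = Counter(labels)
        label, count = counts.most_common(1)[0]
        return label, count / len(labels)

    # Sunny days in the video's data, 2 yes / 3 no: prints ('no', 0.6).
    print(leaf_prediction(["yes", "yes", "no", "no", "no"]))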

  • @ramchandersrivastava6352 · 6 years ago

    I think you did a great job, making it very simple.

  • @himanshubhusanrath212 · 4 years ago

    Hi Victor, it was a wonderful explanation

  • @fernandonakamuta1502 · 6 years ago

    Very good example and explanation!
    Thanks!

  • @sudheerrao07 · 10 years ago

    Very well explained. The example you have chosen is awesome. Merci

    • @vlavrenko · 10 years ago

      Sudheer Rao Thank you! Happy to know you found it useful.

  • @oksanasakhniuk617 · 6 years ago

    Very informative and easy to follow. Thank you

  • @chandrabhatt · 10 years ago · +9

    Why is John so predictable? :)
    Good example and well explained. Thanks!

  • @finess3 · 4 years ago

    Great video.
    Any suggestions on how to analytically make a career decision that does not come with statistical probabilities?
    My thought was a payoff table, but I don't know how to do it without percentages.
    Suggestions?

  • @tadessekebede7708 · 7 years ago

    Thank you, it helps me a lot. I am using WEKA, but the tree size that I get during analysis is 1. How can I solve it? Please help me.

  • @ArundhatiGhosh2012 · 9 years ago

    I agree with others that this is by far one of the easiest, yet pretty informative, video lectures on decision trees. I know there are numerous resources to refer to, but what would be your recommendation for reading a little more in-depth about the topic and doing some hands-on practice?

  • @saralabai2173 · 7 years ago

    Thanks, it was useful for me;
    I could understand the basic general concept.

  • @AayushRampal · 8 years ago

    All of your videos are great; I could never have understood these concepts from books.
    Do you have any complete playlist covering all topics of Machine Learning?

  • @SayanPaulcodes · 8 years ago

    So simply explained!

  • @vlavrenko · 10 years ago · +20

    Good thing he is, else this would have made for a very confusing example :)
    Thank you!

    • @jonnyopera · 10 years ago · +5

      I don't think he would play in Rain with High Humidity. The sequence in which you chose to evaluate the "Wind" attribute was random... had you chosen Rain --> Humidity, I don't think he would have played :-)

    • @nicholasflores9871 · 8 years ago · +1

      What book are you using for the class?

  • @panitanwongse-ammat598 · 8 years ago · +3

    Professor Lavrenko,
    I would deeply appreciate it if you could post a link to download your slides.
    Best,
    Top

  • @vinay9023 · 8 years ago

    Is the textbook being referenced in these videos a publicly available one?

  • @zaquesvb00x · 9 years ago · +2

    Really helpful. Understood it pretty easily. Thanks a lot, Victor. :)

    • @vlavrenko · 9 years ago

      You're welcome. Very happy it helped you.

  • @krishnakc25 · 7 years ago

    How did you calculate the Entropy to select the root node?

  • @mandeepbaluja5401 · 5 years ago

    Why is Wind used under Rain, and not Humidity?

  • @chlar7582 · 8 years ago

    What if I pair 'sunny' with 'wind'? What would happen if there's no pure subset in the tree?

  • @NikkieBiteMe · 6 years ago · +1

    That's a really nice explanation. Thank youuuu lots!!!

  • @snrb3196 · 9 years ago

    Thank you sir for your great lecture.

  • @nadavkedem4649 · 5 years ago

    To which book do you refer?

  • @ismailal-mohammad4395 · 5 years ago

    Very clear and helpful

  • @chieze5406 · 7 years ago

    Is there a way I can contact you directly as I have a question on Decision trees not covered in this video?

  • @HadrienDykiel2000 · 8 years ago

    Top notch explanation

  • @עדישרייבר · 9 years ago

    Great session. Clear and great example!

  • @lokeshs5962 · 6 years ago

    Please make sure you have good audio quality

  • @balabodhiy730 · 6 years ago

    Why do you choose "outlook" as the first attribute, and why not the others?

  • @christianm6268 · 7 years ago · +1

    What if I have a dataset containing more than 500 records? Is there a tool that could help me, or do I do it manually just like you did?

    • @newday8074 · 7 years ago

      I guess there are many tools, among them RapidMiner and the OpenOffice software. You can look at this document; it is very helpful and practical: docs.rapidminer.com/downloads/DataMiningForTheMasses.pdf

  • @vikramjadhav4180 · 10 years ago · +1

    Thanks, Victor Lavrenko.
    Is there any reference book for data mining?

    • @vlavrenko · 10 years ago

      Vikram Jadhav We use the following textbook at Edinburgh:
      www.amazon.com/Data-Mining-Practical-Techniques-Management/dp/0123748569
      But there are better textbooks by Chris Bishop, David Barber, and Hastie et al.

  • @hamzasaad8431 · 6 years ago

    Thanks for the great explanation

  • @gplus46 · 7 years ago

    Wow, you made it so easy to understand the decision tree. But I don't understand how someone would play tennis in strong wind :)

  • @jianishen5656 · 8 years ago

    Very clear to me, thank you

  • @yanyanwu0358 · 7 years ago

    Thank you, great simple example!

  • @thushalbk8268 · 4 years ago

    An Indian guy asking a doubt that is going to be explained in the coming slides is the most Indian thing ever

  • @wenbowang4745 · 6 years ago

    Thanks, the explanation is very clear.

  • @kumaraakash25 · 7 years ago

    Why not run a logistic regression here? What is the use of a decision tree?

  • @kaushilkundalia2197 · 6 years ago · +1

    Very clear explanation! Thank you sir ^_^

  • @KuldeepSingh-cm3oe · 6 years ago

    I think there should be an extra layer after rain about the humidity.....

  • @deathbyzergling · 5 years ago

    Thank you, you just saved my neck. I was getting nowhere with my class material until I saw this. Where's the "next slide"?

  • @anuragdwivedi1804 · 4 years ago

    Very good lecture,
    thank you sir

  • @radisadek7601 · 6 years ago

    Very very clear, thank you!

  • @lucasjohn3014 · 6 years ago · +2

    Nice lecture, but the voice is very, very low.

  • @alen740926 · 5 years ago

    Is this the so-called "ID3"?

  • @eng.rg9453 · 9 years ago

    Thanks a lot, Victor. This example helps me so much; continue :)

  • @LittleSam129 · 7 years ago

    Thank you very much, nice explanation.

  • @ayushimathur27 · 7 years ago

    Good video, it's really helpful

  • @vuanhachoi2409 · 6 years ago

    Why don't we use Temperature?

  • @ozgeozcelik8921 · 8 years ago

    These are very helpful, thanks...

  • @dar7823 · 5 years ago

    Thanks Victor!

  • @sridharsethu · 8 years ago

    PERFECT!! Thanks, Victor.

  • @danielpelido8283 · 7 years ago

    What university is this from?

  • @nj356 · 7 years ago

    That was really helpful, thank you!

  • @ImtithalSaeed · 8 years ago

    Good explanation

  • @amulyasaxena4851 · 5 years ago

    Great video.

  • @abdurrazzak305 · 6 years ago

    Beautiful Explanation :)

  • @IsaacAsante17 · 6 years ago

    Great tutorial!

  • @anjanashrestha4627 · 8 years ago · +1

    Thank you for explaining it in such a simple way. But here you didn't consider the humidity factor while making the decision. :/ It would have been better if you had included the selection of the attribute in this slide itself. Anyway, thank you

  • @nicholasflores9871 · 8 years ago

    What book are you using?