Tremendous Explanation! This is what even courses should focus on. Instead of just giving details on the surface and start importing packages and implementing for viewer's satisfaction, it is more fruitful to start from the scratch, dig the mathematics and intuition behind and appreciate the concept.
Excellent! This is how a teacher should teach.
Thank you very much for this and the following session's lecture. I got my CS degree 25 years ago, and it's nice to learn about things like how to automatically decide which questions to ask first.
Superb lecture! Thank you very much for sharing it. I was struggling with the subject before watching this video, but now I'm quite comfortable and I think I'll be able to manage using decision trees in my project. Thank you again :)
It amazes me that people were discussing these topics when I was still studying the water cycle lol.
Thank you very much, it really helped, sir. And one thing I want to say: you have a sweet voice.
Nice lecture! I came here for Decision Trees but I think I'll have a look at your other videos as well
When I did the calculation for I(Patrons) at roughly 46:36 for the number of bits of information, I get .541 (not .0541 as in his slide deck). Also, I had to find out from a different reference that when you have a log(0), which is normally undefined, they assume it is 0.
I think they don't assume log(0) to be zero but 0*log(0) to be zero.
yes it is 0*log(0), but also all log calculations are with base 2.
Best lecture on decision trees. Which measure is best: entropy or Gini?
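Not from the lecture, but a quick sketch of how the two measures compare for a two-class node (with `p` the fraction of positive examples). Both are zero for pure nodes and maximal at p = 0.5; in practice they usually rank candidate splits the same way, and Gini avoids computing a logarithm:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a two-class node; pure nodes score 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def gini(p):
    """Gini impurity 1 - p^2 - (1-p)^2, simplified for two classes."""
    return 2 * p * (1 - p)

# Compare the two impurity curves at a few mixing proportions.
for p in (0.0, 0.1, 0.3, 0.5):
    print(f"p={p:.1f}  entropy={entropy(p):.3f}  gini={gini(p):.3f}")
```

Both curves peak at p = 0.5 (entropy = 1 bit, Gini = 0.5), which is why the choice between them rarely changes the resulting tree much.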
Your Lectures are very explanatory; even as an undergrad I understood them. Thanks! I was wondering if you covered multivariate decision trees in any of your lectures.
Great lecture. I have a question: is there any session on building a decision tree manually?
I suppose this is how Akinator guesses who you are thinking of.
Excellent! Can subsequent levels in the tree use the same attribute for the decision at a node? For instance in the 4 color, 2 dimension example, if the root level split is based on x-sub-i, can the next level node use a rule based on x-sub-i (obviously a different split)?
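For what it's worth, yes: with a continuous attribute, a CART-style tree can test the same variable again at a deeper node, just with a different threshold. A hypothetical hand-built classifier for a 4-colour, 2-dimensional setup like the one in the lecture, where the root and its left child both test x1:

```python
def classify(x1, x2):
    """Toy depth-2 decision tree; thresholds and colours are made up."""
    if x1 < 2.0:          # root: split on x1
        if x1 < 1.0:      # child: split on x1 again, different threshold
            return "red"
        return "blue"
    if x2 < 1.5:          # the other branch is free to use x2 instead
        return "green"
    return "yellow"

print(classify(0.5, 0.0))  # red
print(classify(1.5, 0.0))  # blue
```

Reusing an attribute only makes sense when a new split on it can still separate the data, which is exactly the case for continuous features with multiple useful thresholds; a binary attribute, once used on a path, gives no further information.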
The most clear ML course I had
Great lecture. crystal clear!
"To understand what a forest is we first need to understand the tree" :D
Over 200kg? That's a whale! Awesome lecture by the way :)
Good lecture on decision trees. Can you please share the Antonio Criminisi technical report link here?
Thank you.
+Mohammad Kamruddin Google this:
"Decision Forests for Classification, Regression, Density Estimation, Manifold Learning and Semi-Supervised Learning"
Hey Ore! Did you find any lecture on multivariate decision trees?
Can you provide the link for the report by Antonio Criminisi that you referred to at 52:50?
Nice Explanation to decision tree :)
Does anyone know where the data file is available, or do we just type it in from the slide the Prof has?
Nice lecture. Thank you very much, sir. Can anybody share the referenced 'Criminisi et al., 2011' paper link?
All log calculations for entropy are with base 2 ??
Yes!
22:08 square yards?
awesome lectures by this teacher btw
Thank you so much..!!!
Thank you.
"If you go to the left, you are 100% red"
Is it allowed to share the video on social platforms, for example?
The video is on UA-cam. As long as only an HTTP reference (URL) is used, yes, of course.
Could you help me with the calculations at 48:23? I haven't figured out why I(Patrons) is equal to 0.541 bits :(
Jobsamuel Núñez Remember to use logarithm base 2. Most calculators use natural logarithm by default.
+Jobsamuel Núñez Only the last term within the brackets contributes because 0*log2(x) = 0 and 1*log2(1) = 0. The expression simplifies to 1 - [6/12 * (-2/6*log2(2/6) - 4/6*log2(4/6))] = 0.5409....
+Tobias Pahlberg Exactly, so that means that there still is a typo in the lecture, right? Since he states 0.0541..
edit: wooohps, nevermind
zwep Yes, but I think someone in the audience pointed that out later
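For anyone still stuck on the arithmetic, here is the whole expression from this thread in a few lines of Python, using base-2 logs and the convention that 0·log2(0) = 0, and assuming the usual split of the 12 examples by Patrons: None (0+, 2−), Some (4+, 0−), Full (2+, 4−), with root entropy H(6/12, 6/12) = 1 bit:

```python
import math

def entropy(probs):
    """Shannon entropy in bits; terms with p = 0 are taken as 0 by convention."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Information gain of the Patrons attribute. Only the "Full" group
# contributes, since "None" and "Some" are pure (entropy 0).
gain = 1 - (2/12 * entropy([0/2, 2/2])
            + 4/12 * entropy([4/4, 0/4])
            + 6/12 * entropy([2/6, 4/6]))
print(round(gain, 4))  # 0.5409, i.e. ~0.541 bits, not 0.0541
```

The `if p > 0` guard is what encodes the 0·log2(0) = 0 convention, so no undefined log(0) is ever evaluated.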
Excellent
Great sharing, thank you.
Thanks
fantastic............:)
This is excellent, but I want to learn the M5 model tree. Can anyone help me with how to learn it, or give me a link?
Patron is pronounced "pay-tren" :)
Did I hear a Freudian slip? Around 22:55 he said "in a Greece... a greedy fashion" :-). Greece is not greedy, but the media make us believe so?
its hard to make money in AI. No restaurant or builder can afford to hire someone to do AI.
Only a small fraction of AI developers get a job, sadly AI is not really used everywhere.
Not correct
not correct at all
hahahahahahahahahahahahahahahaha.....nice one....hahahah :P
Such a boring lecturer; I would drop the course if he were teaching it.