One year ago I was watching your tutorials on how to draw squares on a canvas with code. One year later I'm trying to build machine learning models, also with the help of your tutorials. I'm not even a CS student, I'm a pianist!
Great! Do you play in concerts??
Taking it back to Boolean Algebra makes it very clear why MLPs are a natural solution to the XOR problem, thank you. Nobody's done that yet in anything I've seen, even though it's obvious in hindsight and maybe should have been obvious in advance.
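The Boolean-algebra framing this comment praises can be sketched in a few lines. This is an illustrative sketch (not code from the video): XOR isn't a single basic gate, but it decomposes into AND, OR, and NOT, and checking the identity over the whole truth table makes that concrete.

```javascript
// XOR rewritten with basic Boolean gates:
//   a XOR b = (a OR b) AND NOT (a AND b)
const xor = (a, b) => (a || b) && !(a && b);

// Verify the identity over the full truth table.
for (const a of [false, true]) {
  for (const b of [false, true]) {
    console.log(`${a} XOR ${b} = ${xor(a, b)}`);
  }
}
// xor is true only when exactly one input is true.
```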
A happy face always helps to learn with ease and fun. Keep it up man !
Yes, it's so true.
Sir, the question is how can a person who is in this field be this happy? lol :P
I'm here because the coursera course instructor in the class I'm taking just can't explain it with as much joy and happiness as you. I feel like the information I'm getting here is paired with enthusiasm and that's the way it should first be introduced to my brain.
Well, I have to say you could be elected for a best teacher award. You are simply a perfect teacher.
It's because of your teaching that I've decided to pursue a career in this field. A brilliant balance of fun and seriousness.
Best of luck to you!
The linearly separable vs. not linearly separable explanation is the best explanation! Now it's logical why multiple layers are required! GREAT! Thank you!
This video is a great example of why your channel is one of my favorites.
The way you teach is fun; it's like you yourself are enjoying teaching, which we students love... one could fall in love with the knowledge presented here!!
Man, you have no idea how useful and interesting the content you're creating is. GOOD JOB!
Amazing teacher. I have my semester exam tomorrow and was searching all over the internet for the multi-layer perceptron but wasn't able to find a good explanation. Thank god I found your video. 💙
I love how excited you are explaining this.
I have been trying for quite some time to figure out what the "hidden layer" is, how it works and what the purpose is. So many others either get right up to that subject and then stop posting, or talk about it as if I should already know.
So for some time, I have only been able to do simple perceptrons.
Now I finally understand that hidden layers are just layers of multiple perceptrons being pushed into other perceptrons, where each perceptron has been trained to complete a different task.
Thank you!
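The picture this comment describes — hidden-layer units each solving a different sub-task and feeding their outputs into another perceptron — can be sketched with hand-picked weights. The weights and biases below are illustrative assumptions (chosen by hand, not trained, and not taken from the video), just to show how composing perceptrons solves XOR:

```javascript
// A "perceptron" here is a weighted sum pushed through a step function.
const step = (x) => (x >= 0 ? 1 : 0);
const perceptron = (weights, bias) => (inputs) =>
  step(inputs.reduce((sum, x, i) => sum + x * weights[i], bias));

// Two hidden units, each handling a different sub-task:
const or   = perceptron([1, 1], -0.5);   // fires if a OR b
const nand = perceptron([-1, -1], 1.5);  // fires unless a AND b

// An output unit combines the hidden units' outputs:
const and  = perceptron([1, 1], -1.5);

// XOR = AND(OR(a, b), NAND(a, b))
const xor = (a, b) => and([or([a, b]), nand([a, b])]);

console.log(xor(0, 0), xor(0, 1), xor(1, 0), xor(1, 1)); // 0 1 1 0
```

A single perceptron can't produce this truth table, but three of them wired in two layers can — which is exactly the point of the hidden layer.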
You teach this subject with such passion. It is kinda getting me excited about learning it too
Man, I wish they'd give you a Nobel Prize for teaching!
I literally want this sort of sense of humour in my college professor. Thanks for saving my semester. Love from India.
I found you years ago when I needed to learn steering algorithms. You made the math and algorithm simple(r) to understand and your videos are a lot like a drop of sunshine in my day. It reminds me of Reading Rainbow when I was a youngster. Now I am back to continue my work on CNNs. And there you are again in my suggested videos :D
You're the BEST CS TEACHER THAT I NEVER HAD
Keep going, man, I wish you had been my teacher in college. Fun, smiles, and learning together. Such a great experience to learn with you; 15 minutes passed like nothing but were full of knowledge. Love from Brazil! Keep going!
What a master. We are really fortunate to have Daniel as an instructor here on YouTube!
You should be given your own show on the science network. More educational, fun, engaging and entertaining than 99% of the crap we pay for. Better than most courses I have seen on programming. Bill Nye + Bob Ross + Mr. Rogers. 11/10
If every black kid in the hood had a teacher like this, they'd all succeed at understanding this easily. Why? Because this guy's likability makes you want to learn. When you enjoy the person teaching you, you will usually enjoy what they're teaching you. The capacity to understand has very little to do with achievability in human affairs, and thinking certainly pertains to human affairs. I'm understanding concepts I've never encountered before, not because I'm smart, but because the instructor in this video is interesting, funny, has a charm all his own, and is not intimidating or threatening in any way; least of all is he boring. Every young person deserves a teacher like this.
Man, I'm speechless, god level explanation 🔥🔥🔥
An excellent and exciting explanation! This is exactly what I was looking for in trying to understand the motive behind the multi-layer perceptron. Not to be taken for granted!
How can someone dislike his video? He seems to be a genuinely happy man... exuding joy... let him be :) The kind of excitement he has towards his code is what I need towards my life ;)
Good job! The topic is very interesting; what's more interesting is the way he teaches ☺
This dude is so awesome, I can watch him teach all day. Love you, pal.
Your answer to "But what is XOR really?" at 10:46 was just what I needed! Thank you!
I like you. You are the ideal teacher. The genuine sincere pleasure of teaching what you love to others. I can feel that love.
I was extremely happy when I discovered that you had posted a video on a topic that I was searching for.
The only channel with no haters! Amazing, sir! Good luck, love you.
You are outstandingly interesting. Keep going!
Thanks for the nice feedback!
Teachers like you are so rare. Gem.
Give this man some Concerta! Lol, in all honesty, I love being taught by people who are passionate about what they do. Keep it up!
Cool to see how you linked the "linearly separable" terminology to the Boolean truth tables! Learned something applicable and new!
I actually love your enthusiasm!!!!
"Maybe you just watch my previous videos on the Perceptron"
Yes. Yes I did.
Outstanding explanation of linearly separable. You make it very easy to understand why multiple perceptrons are required. Plus I love Boolean logic. Thank you.
I've seen a lot of videos about neural networks, both advanced ones (which go over my head) and beginner ones (which are too general). That XOR example in this video was an epiphany for me! Now I have an intuitive sense of what makes neural networks so special (vs., say, linear classifiers.) Now I feel like I'm finally ready to go deeper into this subject
I'm so happy to hear this!
@TheCodingTrain Yeah, that bit about how a single-layer network can only solve linearly separable problems, and how hidden layers fix this limitation, finally makes intuitive sense to me thanks to the XOR example. Thanks! Not sure if you cover this in subsequent videos, but I'd be interested to hear your take on why having multiple hidden layers can be useful vs. just one hidden layer.
OMG. Best video on basic NN concepts by far. And the craziest too. Very fun to watch. Congrats!!!
Holy juice. That was an amazing explanation. My professor at the uni confused me a lot, but this video made my day.
I'm speechless. What a beautiful explanation!
What a nice teacher.
Truly enjoying the way you teach and convey your knowledge...
Please keep going...
It is a unique talent to teach and bring a smile at the same time.
Wow...
This video just made me simply happy. Great thanks from Pakistan. NUST needs to hire such professors.
this man has ENERGY
I got that click where you suddenly understand a concept by watching this video. Thanks so much!
third! Really appreciating these tutorials, much friendlier than others!
Thanks, that's nice to hear!
Loved the way you are teaching... I already knew MLPs, but your way of teaching made me watch it again.
You made me understand better than any simplified notes...
I haven't heard this great an explanation before on YouTube; great stuff!
Great videos and tutorials, big fan here. Cool that you don't just write code but also explain the concept at the beginning.
You can only have lunch if you are hungry AND thirsty. Love the videos :)
6:57 genius. Very effective teacher.
Amazing explanation... You magically deliver a complex topic.
Awesome explanation! You are so gifted!
What a genius teacher you are . Appreciate you sir
Today I was supposed to study MLPs, but because of some problems of mine I could not concentrate. After watching your tutorial, you made me smile, forget my problems, and understand the topic. Thanks a lot :)
glad to hear!
Awesome. Your way of teaching is perfect.
Interesting, so it's basically the same analogy as in electronics, building logic gates from transistors. You kind of add them together to get more complex operations. Very good material. Keep going, I'm really into this!
Great example of the need for more than one perceptron layer for the XOR.
Even more excited to watch your videos. Keep rocking with your enthusiasm!
Your teaching style is really awesome...
This guy has a golden heart
This is more than just a neural networks tutorial! Thanks!
I love the way you explain things, energetic but informative, loving these videos!
Love this video. Explained it really well. I have an exam on Wednesday which covers MLP and the functions of layers and neurones. This should help form my answer.
wow this guy is so animated. instantly likeable.
Fantastic explanation! This is just what I need.
Sir, your method is Excellent
Thank you Sir for making concepts easier.
Another way to see linearly separable problems: If it has a binary output, as in it either is or it isn't.
With the dots on the canvas, they are either below the line, or they aren't. We just picked "aren't" to mean "above", but that's how we humans chose to read the output.
We read it as "below" or "above", the computer reads it as "is" or "isn't".
If you draw a line across your data and define a relationship between the data point and the line, the point either falls into that relationship, or it doesn't.
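The above/below reading of a line this comment describes can be sketched as a one-line binary classifier. The slope and intercept below are arbitrary illustrative values (assumptions, not from the video):

```javascript
// A line y = m*x + b splits the plane into two half-planes.
// The "binary output": a point either satisfies the relationship
// ("is above the line") or it doesn't.
const m = 2; // slope (arbitrary example)
const b = 1; // intercept (arbitrary example)
const isAbove = (x, y) => (y > m * x + b ? 1 : 0);

console.log(isAbove(0, 5)); // 1: 5 is above 2*0 + 1 = 1
console.log(isAbove(3, 0)); // 0: 0 is not above 2*3 + 1 = 7
```

A single perceptron can only ever learn a boundary of this shape, which is why linearly separable problems are exactly the ones it can solve.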
Great video as usual Dan, I'm looking forward to the sequel =)
On a side note, I think everyone here understands !AND, but the usual name for this gate is NAND (for NOT AND).
oh, hah, yes, good point!
I'm really enjoying those videos. Thank you very much for all your hard work.
Thanks, got my exam in 8 days!
Very well explained and expressed 👌🙏
Thanks for teaching us assembly, sensei.
Very Nicely Explained. Great Tutorial
Ting!!!
I've learned something: 'xor' => 8:04
Great video! Thank you very much! You just save my academic life :)
Glad to hear!
Nice way you have explained the basics, thanks 😊
Woweeeeee ... Another level of explanation
Damn, Dan, you seem to be such a lovely person, and I say it as a man! Keep doing these tutorials, because I don't know if there is any other channel on YouTube explaining neural networks in code as well as you do.
Good job! Quite an interesting topic.
Beautiful presentation
I really love the way you teach. Good work, keep it up.
Really enjoyable class!
I love your energy and smiling face.
Hey there. I work as an artificial intelligence expert. I write state-of-the-art neural network libraries in C++ for a living. If you would like to talk about NNs with me in person, or just ask me any questions, feel free to do so. I like your teaching style and I think knowledge about these kinds of things should be more universal!
And with that, of course I can also help with training algorithms and how to work with your data when using neural networks.
You are amazing, bro. Keep it up. I'm learning a lot from you.
Thank you!
Outstanding teaching method, really thank you.
11:40 very well explained, thank you!!
awesome and easy explanation. thanks!
Thank you!
Thank you so much, man! Your videos are the best I've found on the subject. You are a genius!
Wait, so perceptrons are these crazy learning logic gates that work on linearly separable problems. That's rad!
Dude, thanks for the lessons. Keep doing them, please. Thanks.
Thanks!
So much excitement you have for sharing knowledge... I liked that gesture... Keep it up, dude. Thank you!
Awesome explanation, subscribed!
Thank you for making these videos
Amazing explanation