10.4: Neural Networks: Multilayer Perceptron Part 1 - The Nature of Code

  • Published 29 Nov 2024

COMMENTS • 301

  • @somecho
    @somecho 4 years ago +83

    One year ago I was watching your tutorials on how to draw squares on a canvas with code. One year later I'm trying to build machine learning models, also with the help of your tutorials. I'm not even a CS student, I'm a pianist!

    • @rakib17874
      @rakib17874 3 years ago +3

      great! Do u play in concert??

  • @stopaskingmetousemyrealnam3810
    @stopaskingmetousemyrealnam3810 4 years ago +2

    Taking it back to Boolean Algebra makes it very clear why MLPs are a natural solution to the XOR problem, thank you. Nobody's done that yet in anything I've seen, even though it's obvious in hindsight and maybe should have been obvious in advance.

  • @bilourkhan3345
    @bilourkhan3345 5 years ago +197

    A happy face always helps to learn with ease and fun. Keep it up, man!

    • @saramariacl
      @saramariacl 4 years ago +2

      Yes, it's so true.

    • @bilalazeemshamsi7895
      @bilalazeemshamsi7895 3 years ago +1

      Sir, the question is how can a person who is in this field be this happy? lol :P

  • @optymystyc
    @optymystyc 1 year ago +7

    I'm here because the coursera course instructor in the class I'm taking just can't explain it with as much joy and happiness as you. I feel like the information I'm getting here is paired with enthusiasm and that's the way it should first be introduced to my brain.

  • @seemarai5310
    @seemarai5310 6 years ago +31

    Well, I have to say you could be elected for the best teacher award. You are simply a perfect teacher.

  • @jonkleiman8018
    @jonkleiman8018 7 years ago +10

    It's because of your teaching that I've decided to pursue a career in this field. A brilliant balance of fun and seriousness.

  • @8eck
    @8eck 4 years ago

    The explanation of linearly separable vs. not linearly separable is the best explanation! It's now logical why multiple layers are required! GREAT! Thank you!

  • @joachimsaindon3658
    @joachimsaindon3658 3 years ago +1

    This video is a great example of why your channel is one of my favorites.

  • @muskansaxena5708
    @muskansaxena5708 6 months ago

    The way you teach is fun, it's like you yourself are enjoying teaching, which we students love... one could fall in love with the knowledge presented here!!

  • @shedytaieb1083
    @shedytaieb1083 3 years ago +2

    Man, you have no idea how useful and interesting the content you're creating is. GOOD JOB

  • @chandranshsharma1685
    @chandranshsharma1685 6 years ago +6

    Amazing teacher. I have my semester exam tomorrow and was searching a lot about the multilayer perceptron on the internet and wasn't able to find a good explanation. Thank god I found your video. 💙

  • @LeandroBarbksa
    @LeandroBarbksa 6 years ago +9

    I love how excited you are explaining this.

  • @morphman86
    @morphman86 5 years ago +1

    I have been trying for quite some time to figure out what the "hidden layer" is, how it works and what the purpose is. So many others either get right up to that subject and then stop posting, or talk about it as if I should already know.
    So for some time, I have only been able to do simple perceptrons.
    Now I finally understand that hidden layers are just layers of multiple perceptrons being pushed into other perceptrons, where each perceptron has been trained to complete a different task.
    Thank you!
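
    [A minimal Python sketch of the idea in the comment above — XOR built by feeding two hand-wired perceptrons (OR and NAND) into a third (AND). The weights, biases, and step activation here are illustrative assumptions, not code from the video.]

        def perceptron(inputs, weights, bias):
            # Step activation: output 1 if the weighted sum plus bias is positive.
            total = sum(i * w for i, w in zip(inputs, weights)) + bias
            return 1 if total > 0 else 0

        def xor(a, b):
            or_out = perceptron([a, b], [1, 1], -0.5)      # OR: fires if either input is 1
            nand_out = perceptron([a, b], [-1, -1], 1.5)   # NAND: fires unless both inputs are 1
            # The two "hidden" outputs feed one final AND perceptron.
            return perceptron([or_out, nand_out], [1, 1], -1.5)

        for a in (0, 1):
            for b in (0, 1):
                print(a, b, "->", xor(a, b))  # prints 0 0 -> 0, 0 1 -> 1, 1 0 -> 1, 1 1 -> 0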

  • @vanshitagupta4183
    @vanshitagupta4183 2 years ago +4

    You teach this subject with such passion. It is kinda getting me excited about learning it too

  • @graju2000
    @graju2000 5 years ago +2

    Man, I wish they'd give you a Nobel Prize for teaching!

  • @anshrastogi9430
    @anshrastogi9430 2 years ago +1

    I literally want this sort of sense of humour in my college professor. Thanks for saving my semester. Love from India.

  • @marcocastellano2451
    @marcocastellano2451 5 years ago +3

    I found you years ago when I needed to learn steering algorithms. You made the math and algorithm simple(r) to understand and your videos are a lot like a drop of sunshine in my day. It reminds me of Reading Rainbow when I was a youngster. Now I am back to continue my work on CNNs. And there you are again in my suggested videos :D

  • @paulorugal
    @paulorugal 6 years ago +11

    You're the BEST CS TEACHER THAT I NEVER HAD

  • @henriqueb287
    @henriqueb287 3 years ago +2

    Keep going, man, I wish you had been my teacher in college. Fun, smiles and learning together. Such a great experience to learn with you, 15 minutes passed like nothing but full of knowledge. Love from Brazil! Keep going!

  • @FredoCorleone
    @FredoCorleone 2 years ago +2

    What a master. We are really fortunate to have Daniel as an instructor here on YouTube!

  • @critstixdarkspear5375
    @critstixdarkspear5375 6 years ago

    You should be given your own show on the science network. More educational, fun, engaging and entertaining than 99% of the crap we pay for. Better than most courses I have seen on programming. Bill Nye + Bob Ross + Mr. Rogers. 11/10

  • @TheTimeforwar
    @TheTimeforwar 4 years ago

    If every black kid in the hood had a teacher like this they'd all succeed at understanding this easily; why? Because this guy's likability makes you want to learn. When you enjoy the person teaching you, you will usually enjoy 'what' they're teaching you. The 'capacity' to 'understand' has very little to do with 'achievability' in human affairs & 'thinking' certainly pertains to human affairs. I'm understanding concepts I've never encountered before, not because I'm 'smart', but because the instructor in this video is interesting, funny, has a charm all his own and is not intimidating or threatening in any way; least of all is he boring. Every young person deserves a teacher like this.

  • @danishshaikh2994
    @danishshaikh2994 2 years ago +2

    Man, I'm speechless, god level explanation 🔥🔥🔥

  • @ericmrozinski6143
    @ericmrozinski6143 8 months ago

    An excellent and exciting explanation! This is exactly what I was looking for in trying to understand the motive behind the multi-layer perceptron. Not to be taken for granted!

  • @Sripooja.Mahavadi
    @Sripooja.Mahavadi 5 years ago +2

    How can someone dislike his video? He seems to be a genuinely happy man... exuding joy... let him be :) The kind of excitement he has towards his code is what I need towards my life ;)

  • @scipsyche5596
    @scipsyche5596 7 years ago +11

    Good job! The topic is very interesting; what's even more interesting is the way he teaches ☺

  • @parths.1903
    @parths.1903 3 years ago

    This dude is so awesome, I can watch him teach all day. Love you, pal.

  • @justincollinns
    @justincollinns 6 years ago +2

    Your answer to "But what is XOR really?" at 10:46 was just what I needed! Thank you!

  • @CloverSerena
    @CloverSerena 3 years ago

    I like you. You are the ideal teacher. The genuine sincere pleasure of teaching what you love to others. I can feel that love.

  • @anaibrahim4361
    @anaibrahim4361 1 year ago

    I was extremely happy when I discovered that you had posted a video on a topic that I was searching for.

  • @anonymousvevo8697
    @anonymousvevo8697 2 years ago

    The only channel with no haters! Amazing, sir! Good luck, love you.

  • @najibsaad5765
    @najibsaad5765 7 years ago +63

    You are outstandingly interesting. Keep going!

  • @mkthakral
    @mkthakral 3 years ago

    Teachers like you are so rare. Gem.

  • @joshvanstaden7615
    @joshvanstaden7615 3 years ago

    Give this man some Concerta! Lol, in all honesty, I love being taught by people who are passionate about what they do. Keep it up!

  • @SidVanam
    @SidVanam 4 years ago +2

    Cool to see how you linked the "linearly separable" terminology to the Boolean truth tables! Learned something applicable and new!

  • @redIroncool
    @redIroncool 6 years ago +6

    I actually love your enthusiasm!!!!

  • @Matt23488
    @Matt23488 5 years ago +32

    "Maybe you just watch my previous videos on the Perceptron"
    Yes. Yes I did.

  • @kineticsquared
    @kineticsquared 6 years ago

    Outstanding explanation of linearly separable. You make it very easy to understand why multiple perceptrons are required. Plus I love Boolean logic. Thank you.

  • @rogerhom1512
    @rogerhom1512 1 year ago

    I've seen a lot of videos about neural networks, both advanced ones (which go over my head) and beginner ones (which are too general). That XOR example in this video was an epiphany for me! Now I have an intuitive sense of what makes neural networks so special (vs., say, linear classifiers.) Now I feel like I'm finally ready to go deeper into this subject

    • @TheCodingTrain
      @TheCodingTrain 1 year ago

      I'm so happy to hear this!

    • @rogerhom1512
      @rogerhom1512 1 year ago

      ​@@TheCodingTrain Yah, that bit about how a single layer network can only solve linearly-separable problems, and how hidden layers fix this limitation, finally makes intuitive sense to me thanks to the XOR example. Thanks! Not sure if you cover this in subsequent videos, but I'd be interested to hear your take about why having multiple hidden layers can be useful, vs. just one hidden layer.

  • @fernandolasheras6068
    @fernandolasheras6068 4 years ago

    OMG. Best video of NN basics concepts by far. And craziest too. Very fun to watch. Congrats!!!

  • @backtashmohammadi3824
    @backtashmohammadi3824 3 years ago

    Holy juice! That was an amazing explanation. My professor at the uni confused me a lot, but this video made my day.

  • @d.g.7417
    @d.g.7417 2 years ago

    I'm speechless. What a beautiful explanation!

  • @waisyousofi9139
    @waisyousofi9139 2 years ago

    What a nice teacher.
    Truly enjoying the way you teach and convey your knowledge.
    Please keep going...

  • @waisyousofi9139
    @waisyousofi9139 2 years ago

    It is a unique talent to teach and bring a smile at the same time.
    Wow...

  • @usmanmehmood7614
    @usmanmehmood7614 7 years ago +2

    This video just made me simply happy. Great thanks from Pakistan. NUST needs to hire such professors.

  • @samwakieltojar8154
    @samwakieltojar8154 4 years ago +1

    this man has ENERGY

  • @ahmarhussain8720
    @ahmarhussain8720 3 years ago

    I got that click where you suddenly understand a concept, by watching this video, thanks so much

  • @battatia
    @battatia 7 years ago +5

    third! Really appreciating these tutorials, much friendlier than others!

  • @kumudtripathi4054
    @kumudtripathi4054 5 years ago

    Loved the way you are teaching... I already knew MLPs, but your way of teaching made me watch it again.

  • @kdpoint4221
    @kdpoint4221 5 years ago

    You made me understand better than any simplified notes...

  • @webberwang6520
    @webberwang6520 6 years ago +1

    I haven't heard this great an explanation before on YouTube, great stuff!

  • @nicholask9251
    @nicholask9251 7 years ago

    Great videos and tutorials, big fan here. Cool that you don't just write code but also explain the concept at the beginning.

  • @carlosdebourbondeparme6021
    @carlosdebourbondeparme6021 4 years ago +1

    You can only have lunch if you are hungry AND thirsty. Love the videos :)

  • @grainfrizz
    @grainfrizz 7 years ago +4

    6:57 genius. Very effective teacher

  • @TheAsimjan
    @TheAsimjan 4 years ago

    Amazing explanation... magically delivers a complex topic.

  • @HeduAI
    @HeduAI 5 years ago +2

    Awesome explanation! You are so gifted!

  • @Cipherislive
    @Cipherislive 5 years ago

    What a genius teacher you are. Appreciate you, sir.

  • @leylasuleymanli725
    @leylasuleymanli725 7 years ago

    Today I had to study MLPs, but because of some problems I could not concentrate. But after watching your tutorial you made me smile, forget about my problems, and understand the topic. Thanks a lot :)

  • @missiongrandmastercurvefev8726
    @missiongrandmastercurvefev8726 7 years ago

    Awesome. Your way of teaching is perfect.

  • @Sworn973
    @Sworn973 7 years ago +1

    Interesting, so basically the same analogy as building logic gates from transistors in electronics. You kind of add them together to get more complex operations. Very good material. Keep going, I'm really into this.

  • @doug8171
    @doug8171 6 years ago

    Great example of the need for more than one perceptron layer for the XOR.

  • @drakshayanibakka11
    @drakshayanibakka11 4 years ago

    More excited to watch your videos. Keep rocking with your enthusiasm!

  • @venkatdinesh4469
    @venkatdinesh4469 3 years ago

    Your teaching style is really awesome...

  • @Bo_om2590
    @Bo_om2590 7 months ago

    This guy has a golden heart

  • @jt-kv3mn
    @jt-kv3mn 5 years ago

    This is more than just a neural networks tutorial! Thanks!

  • @4Y0P
    @4Y0P 7 years ago +1

    I love the way you explain things, energetic but informative, loving these videos!

  • @my_dixie_rect8865
    @my_dixie_rect8865 6 years ago +1

    Love this video. Explained it really well. I have an exam on Wednesday which covers MLP and the functions of layers and neurones. This should help form my answer.

  • @AM-jx3zf
    @AM-jx3zf 4 years ago +1

    Wow, this guy is so animated. Instantly likeable.

  • @yisenliang8114
    @yisenliang8114 1 year ago

    Fantastic explanation! This is just what I need.

  • @baog4937
    @baog4937 6 years ago

    Sir, your method is Excellent

  • @kashan-hussain3948
    @kashan-hussain3948 5 years ago

    Thank you Sir for making concepts easier.

  • @morphman86
    @morphman86 5 years ago

    Another way to see linearly separable problems: If it has a binary output, as in it either is or it isn't.
    With the dots on the canvas, they are either below the line, or they aren't. We just picked "aren't" to mean "above", but that's how we humans chose to read the output.
    We read it as "below" or "above", the computer reads it as "is" or "isn't".
    If you draw a line across your data and define a relationship between the data point and the line, the point either falls into that relationship, or it doesn't.
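
    [A tiny Python sketch of this framing — the line y = 2x + 1 and its fixed weights are illustrative assumptions, not taken from the video: a single hard-coded "perceptron" answers the binary question of which side of the line a point falls on.]

        def side_of_line(x, y):
            # Line y = 2x + 1 rewritten as -2x + y - 1 = 0; the sign of the left side picks the side.
            return 1 if (-2 * x + y - 1) > 0 else 0  # 1 = "above", 0 = "not above"

        print(side_of_line(0, 3))  # 1: (0, 3) lies above the line
        print(side_of_line(2, 0))  # 0: (2, 0) lies below it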

  • @furrane
    @furrane 7 years ago +1

    Great video as usual Dan, I'm looking forward to the sequel =)
    On a side note, I think everyone here understands !AND but the usual way is to call this gate NAND (for Not AND).
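
    [A quick Python check, for illustration only, that !AND and NAND agree on every input — which is why the gate is conventionally written NAND.]

        for a in (0, 1):
            for b in (0, 1):
                not_and = int(not (a and b))  # "!AND"
                nand = 1 - (a & b)            # NAND via bitwise AND
                print(a, b, not_and, nand)    # the last two columns always match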

  • @RafaelBritodeOliveira
    @RafaelBritodeOliveira 7 years ago

    I'm really enjoying those videos. Thank you very much for all your hard work.

  • @NightRyder
    @NightRyder 5 years ago +2

    Thanks got my exam in 8 days!

  • @60pluscrazy
    @60pluscrazy 3 years ago

    Very well explained and expressed 👌🙏

  • @tecnoplayer
    @tecnoplayer 7 years ago

    Thanks for teaching us assembly, sensei.

  • @gururajahegdev2086
    @gururajahegdev2086 2 years ago

    Very Nicely Explained. Great Tutorial

  • @mohamedchawila9734
    @mohamedchawila9734 5 years ago +2

    Ting!!!
    I've learned something, 'xor' => 8:04

  • @montserratcano2389
    @montserratcano2389 7 years ago +2

    Great video! Thank you very much! You just saved my academic life :)

  • @KishanKa
    @KishanKa 6 years ago

    Nice way you have explained the basics, thanks 😊

  • @PoojaYadav-hr2ub
    @PoojaYadav-hr2ub 4 years ago

    Woweeeeee ... Another level of explanation

  • @likeyou3317
    @likeyou3317 6 years ago +2

    Damn, Dan, you seem to be such a lovely person, and I say it as a man! Keep doing these tutorials because I don't know if there is any other channel on YouTube explaining neural networks in code as well as you do.

  • @wawied7881
    @wawied7881 7 years ago +7

    Good job! Quite an interesting topic.

  • @elizabethmathewst
    @elizabethmathewst 5 years ago

    Beautiful presentation

  • @algeria7527
    @algeria7527 7 years ago

    I really love the way you teach. Good work, keep it up.

  • @raitomaru
    @raitomaru 6 years ago +4

    Really enjoyable class!

  • @Smile-to2ii
    @Smile-to2ii 2 years ago

    I love your energy and smiling face.

  • @N00byEdge
    @N00byEdge 7 years ago

    Hey there. I work as an artificial intelligence expert. I write state-of-the-art neural network libraries in C++ for a living. If you would like to talk about NNs with me in person, or just ask me any questions, feel free to do so. I like your teaching style and I think knowledge about these kinds of things should be more universal!

    • @N00byEdge
      @N00byEdge 7 years ago

      And with that, of course I can also help with training algorithms and how to work with your data when using neural networks.

  • @sachinsharma-kw4zd
    @sachinsharma-kw4zd 6 years ago +2

    You are amazing, bro. Keep it up. I'm learning a lot from you.

  • @eassis2
    @eassis2 4 years ago

    Outstanding teaching method, really thank you.

  • @nageshbs8945
    @nageshbs8945 4 years ago

    11:40 very well explained, thank you!!

  • @sarveshrajan1624
    @sarveshrajan1624 6 years ago +1

    awesome and easy explanation. thanks!

  • @augustoclaro
    @augustoclaro 7 years ago +1

    Thank you so much, man! Your videos are the best I've found on the subject. You are a genius!

  • @xavmanisdabestest
    @xavmanisdabestest 5 years ago

    Wait so perceptrons are these crazy learning logic gates that work on linear systems. That's rad!

  • @gabrielaugusto6001
    @gabrielaugusto6001 7 years ago

    Dude, thanks for the lessons, keep doing them, plz, thanks.

  • @learnapplybuild
    @learnapplybuild 6 years ago

    So much excitement you have to share knowledge... I liked that gesture... keep it up, dude... Thank you.

  • @cajogos
    @cajogos 5 years ago

    Awesome explanation, subscribed!

  • @Vikram-od6ur
    @Vikram-od6ur 4 years ago

    Thank you for making these videos

  • @endritnazifi3356
    @endritnazifi3356 1 year ago

    Amazing explanation