Neural Networks from Scratch - P.4 Batches, Layers, and Objects

  • Published 15 Jun 2024
  • Neural Networks from Scratch book: nnfs.io
    NNFSiX Github: github.com/Sentdex/NNfSiX
    Playlist for this series: • Neural Networks from S...
    Neural Networks IN Scratch (the programming language): • Neural Networks in Scr...
    Python 3 basics: pythonprogramming.net/introdu...
    Intermediate Python (w/ OOP): pythonprogramming.net/introdu...
    Mug link for fellow mug aficionados: amzn.to/2KFwsWn
    Channel membership: / @sentdex
    Discord: / discord
    Support the content: pythonprogramming.net/support...
    Twitter: / sentdex
    Instagram: / sentdex
    Facebook: / pythonprogramming.net
    Twitch: / sentdex
    #nnfs #python #neuralnetworks

COMMENTS • 956

  • @Yguy
    @Yguy 4 роки тому +646

    I swear I am addicted to these more than Netflix.

    • @MasterofPlay7
      @MasterofPlay7 4 роки тому +6

      lol you are weird xD

    • @usamanadeem7974
      @usamanadeem7974 4 роки тому +11

      I literally wait for YouTube's notification telling me Sentdex dropped a new tutorial 😂

    • @ossamaganne5851
      @ossamaganne5851 4 роки тому +5

      @@usamanadeem7974 me too brother ahahahah

    • @NicolasKaniak
      @NicolasKaniak 4 роки тому

      same here

    • @Blahcub
      @Blahcub 3 роки тому +1

      cringe

  • @AaditDoshi
    @AaditDoshi 4 роки тому +262

    I don't even look at my calendar anymore. My week ends when sentdex drops a video.

  • @soroushe6394
    @soroushe6394 4 роки тому +305

    I’m glad I’m living at a time that people like you share their knowledge in such quality for free.
    Thank you 🙏🏻

    • @slamsandwich19
      @slamsandwich19 Рік тому +1

      I was going to say the same thing

    • @MeinGoogleAccount
      @MeinGoogleAccount 7 місяців тому +1

      Yes, absolutely. I come from a time when programming meant you bought a book that was actually already outdated the moment you bought it.
      Thank you 🙂

  • @sayanguha5570
    @sayanguha5570 4 роки тому +243

    Every time I see a neural network tutorial, it starts with "import tensorflow as tf" without giving a shit about the basics... but this is a very detailed video that actually clears up the basics, truly from scratch. THANK YOU FOR THE GOOD WORK

    • @lucygaming9726
      @lucygaming9726 4 роки тому +4

      I agree with you, although you can check out Deeplearning.ai on Coursera. It's pretty good.

    • @aleksszukovskis2074
      @aleksszukovskis2074 4 роки тому +7

      @@lucygaming9726 No thanks. I'm too poor for that.

    • @janzugic6798
      @janzugic6798 3 роки тому +1

      @@aleksszukovskis2074 It's free and by Andrew Ng, the legend

    • @aleksszukovskis2074
      @aleksszukovskis2074 3 роки тому

      @@janzugic6798 thanks

    • @supernova6553
      @supernova6553 2 роки тому +2

      @@janzugic6798 you need a coursera subscription ($49/mo) after 7 day trial period regardless of the course being free

  • @sentdex
    @sentdex  4 роки тому +109

    Errata:
    16:17: initially this anim was incorrect when I recorded. We fixed the anim, but not the audio, so I end up reading the first row of values incorrectly. We're adding row vectors here, so the anim is correct, the words are not. =] (A small shape sketch follows at the end of this thread.)

    • @usejasiri
      @usejasiri 4 роки тому +1

      Please clarify the concept of the Gaussian Distribution that you introduced when talking about np.randn

    • @anjali7778
      @anjali7778 4 роки тому

      If I draw a neural network of 12 inputs mapping into 3 outputs and
      connect each input to each output, there will be 36 lines in total,
      which means there should be about 36 weights, but the weights array you
      used had only 12 weights. How is that possible?

    • @mayaankashok2604
      @mayaankashok2604 4 роки тому +1

      @@anjali7778 He has only 4 inputs to the output layer, therefore the number of weights = 4*3 = 12.
      If instead you have 12 inputs, you will get 12*3 = 36 weights.

    • @fincrazydragon
      @fincrazydragon 11 місяців тому

      Am I wrong, or is there something missing around the 9:08 point?

    • @dragonborn7152
      @dragonborn7152 10 місяців тому

      Question: why did we need to transpose weights2? Since they are both 3x3 matrices, index 1 of one would already equal index 0 of the other, right?
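
    Summing up this thread, here is a minimal sketch of the shapes involved (a rough illustration, not the exact values from the video): a layer with 4 inputs and 3 neurons holds 4*3 = 12 weights; with 12 inputs it would hold 12*3 = 36. The weights are drawn from a Gaussian via np.random.randn, and the single row of biases is added onto every row of the batch output.

      import numpy as np

      n_inputs, n_neurons = 4, 3
      weights = 0.10 * np.random.randn(n_neurons, n_inputs)  # shape (3, 4): 12 weights, Gaussian-distributed
      biases = np.zeros(n_neurons)                           # shape (3,): one bias per neuron

      inputs = np.random.randn(5, n_inputs)                  # a batch of 5 samples, 4 features each
      output = np.dot(inputs, weights.T) + biases            # (5, 4) @ (4, 3) -> (5, 3); biases added to every row
      print(output.shape)                                    # (5, 3)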

  • @Blendersky2
    @Blendersky2 Рік тому +17

    Just imagine if we had tutorials like these on all the AI and machine learning topics, and also on probability and statistics... man, every few minutes into the video I scroll the video list up and down hoping there will be 700 more videos like these, but it shows only 7. Amazing work, I will order your book now. Appreciate your dedication and hard work

  • @amogh3275
    @amogh3275 4 роки тому +26

    Bruh this visualisation... It's unreal🔥

    • @Saletroo
      @Saletroo 4 роки тому +3

      ASMR for eyes, thanks Daniel!

    • @amogh3275
      @amogh3275 4 роки тому

      @@Saletroo ikr 😂😂

  • @usamanadeem7974
    @usamanadeem7974 4 роки тому +15

    The thing I love about you is just how beautifully you explain concepts, with immaculate animations and then literally make such complex tasks seem so easy! Gonna make my kids watch your tutorials instead of cartoons one day ♥️😂

  • @knowit3887
    @knowit3887 4 роки тому +36

    U r just ... God for teaching programming... I am glad to have u as a teacher... 💪

  • @ramyosama8088
    @ramyosama8088 4 роки тому +14

    Please continue with this playlist. This is hands down the best series on YouTube right now !!!

    • @sentdex
      @sentdex  4 роки тому +6

      No plans to stop any time soon!

  • @prathamkirank
    @prathamkirank 4 роки тому +16

    These are the online classes we all deserve

    • @theoutlet9300
      @theoutlet9300 4 роки тому +1

      Better than most Ivy League schools

  • @aoof6742
    @aoof6742 4 роки тому +79

    I really appreciate you doing this mate, I really wanted to learn Neural Networks and you are explaining this soo good.

    • @sentdex
      @sentdex  4 роки тому +7

      Glad to hear it!

  • @Gorlung
    @Gorlung 3 роки тому +7

    This is actually the first NN tutorial during which I haven't fallen asleep..
    PS. Thank you for explaining some of the things twice!

  • @kaustubhkulkarni
    @kaustubhkulkarni 4 роки тому +6

    I’d kind of given up on understanding ML and NN. Then I saw Neural Networks from scratch and Sentdex CANNOT make this easier. Loving this series.

    • @7Trident3
      @7Trident3 4 роки тому +1

      I banged my head on numerous videos too. They assume a level of knowledge that was hard to piece together. This series is filling lots of gaps for me. The concepts are starting to gel; this whole field is fascinating!! Kind of empowering.

  • @asdfasdfasdf383
    @asdfasdfasdf383 4 роки тому +1

    You have created one of the best series on this topic I have found on the internet. Explanations include everything, yet you still proceed at a fast steady pace.

  • @ambarishkapil8004
    @ambarishkapil8004 4 роки тому +11

    I know I have said this before, but I am going to say it again, and keep on saying it as long as you continue to make such awesome tutorials. Thank you!

  • @prathamprasoon2535
    @prathamprasoon2535 4 роки тому +19

    This is awesome! Finally, a series on neural nets I can understand easily.

  • @ginowadakekalam
    @ginowadakekalam 4 роки тому +10

    This channel is so good that you'll never find any negative comments

    • @sentdex
      @sentdex  4 роки тому +5

      They are there sometimes :) but yes fairly rare.

    • @usejasiri
      @usejasiri 4 роки тому

      -comment

  • @bas_kar_na_yar
    @bas_kar_na_yar 4 роки тому +19

    I wish someone had ever taught me a concept the way you do..

  • @chaosmaker781
    @chaosmaker781 2 роки тому +3

    This is better explained, and with more quality, than any neural network video where the concept is mostly shown just through the code

  • @codiersklave
    @codiersklave Рік тому +3

    Still one of the best series on YouTube to learn the basics of neural networks... fast!

  • @kenbinner
    @kenbinner 4 роки тому +4

    I'm really glad you took the time to break down this concept step by step; it will surely reduce the number of headaches in the future!
    Thank you for your great content, looking forward to the next one. 😄

  • @rrshier
    @rrshier 3 роки тому +3

    At about 14:51, where you present the matrix multiplied by the vector, the proper mathematical notation would be to have the vector as a column vector, as well as the output vector being a column vector. This is truly how the matrix multiplication is able to work, because a vector is truly just a matrix where one of the dimensions is equal to 1. Other than that, I have to admit, these are my FAVORITE AI/ML videos yet!!!

    • @pensivist
      @pensivist 10 місяців тому

      I was looking for this comment. Thanks for pointing that out!
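
    As a small aside on this thread, here is a sketch of the row-vs-column distinction in NumPy, using values like the ones from earlier in the series (quoted from memory, so treat them as illustrative):

      import numpy as np

      weights = np.array([[0.2, 0.8, -0.5, 1.0],
                          [0.5, -0.91, 0.26, -0.5],
                          [-0.26, -0.27, 0.17, 0.87]])   # (3, 4)
      inputs = np.array([1.0, 2.0, 3.0, 2.5])            # (4,): a 1-D array with no row/column orientation

      print(np.dot(weights, inputs))   # NumPy orients the 1-D vector as needed -> shape (3,)

      column = inputs.reshape(-1, 1)   # (4, 1): the column vector of textbook notation
      print(weights @ column)          # (3, 1): the same numbers, written as a column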

  • @Alex-ol9dk
    @Alex-ol9dk Рік тому +1

    I have never bought a book from YouTube before but you will be the first. You've deserved it. Absolutely love this work. Please keep it up

  • @aav56
    @aav56 2 роки тому +12

    I've never learned linear algebra and I'm astounded how simple you made matrix multiplication out to be!

  • @keshavtulsyan7515
    @keshavtulsyan7515 4 роки тому +3

    Feels like learning all the day, it never felt so simple before...thanks a lot 🙏🏻

  • @chaks2432
    @chaks2432 3 роки тому +1

    This is my first time learning about Neural Networks, and you're doing a great job at explaining things in an easy to understand way.

  • @carlossegura403
    @carlossegura403 3 роки тому +2

    Back when I was learning the concepts behind building a network, most tutorials went straight into the maths, while that is fine - what I wanted to understand was the different compositions from the input to the output. This video was what I was looking for back then before going deep into the theory and methodology. Great content!

  • @jonathantribble7013
    @jonathantribble7013 4 роки тому +108

    Friend: "So what do you do in your free, unwind, leisure time?"
    Me: "Neural Networks From Scratch"
    Friend: "..."

    • @alexgulewich9670
      @alexgulewich9670 3 роки тому +5

      Sister: "If that's informative, then what's educational"
      Me: "Glad you asked!" *starts to explain neural networks and basic QP*
      Sister: "NO! Make it stop!" *Never asks again*

    • @stevenrogersfineart4224
      @stevenrogersfineart4224 3 роки тому

      Story of my life 😁

  • @Alfosan2010
    @Alfosan2010 4 роки тому +89

    last time I was this early, Corona was just a beer brand...

  • @littlethings-io
    @littlethings-io 3 роки тому +1

    Just ordered the book - can't wait to dive into it. Thank you, this is good stuff and a priceless contribution to the evolution of this area of science.

  • @jsnadrian
    @jsnadrian 4 роки тому

    i can't believe you created this course - absolutely fantastic and wonderfully thoughtful in its layout - thanks so much

  • @yabdelm
    @yabdelm 4 роки тому +22

    This is the best series by far I've ever seen. Just what I was looking for. I wonder if you'll get into explaining the why also.
    For instance, oftentimes when I'm watching I wonder: "Why do we even have biases? What function do they serve? How do they enhance predictions? What sort of history/science/neuroscience underlies that, and where do AI and neuroscience part ways, if at all? Why does all of this work at all?"

    • @asongoneal28
      @asongoneal28 4 роки тому +3

      Youssef I really hope @sentdex reads this ;)

    • @naseemsha3010
      @naseemsha3010 4 роки тому +4

      I think it was explained in a previous video, how biases help making predictions. Check out the last video guys

    • @carloslopez7204
      @carloslopez7204 4 роки тому +1

      He explained that in previous videos, but not all of your questions

    • @yabdelm
      @yabdelm 4 роки тому +1

      @@carloslopez7204 I agree it was explained a bit but I really didn't feel the explanation gave me a deep understanding of the why unfortunately, just a very rough surface level and vague hint of what might be going on.

    • @liadinon1134
      @liadinon1134 4 роки тому

      I think some things, like the biases, don't make sense right now. But when you get into training (the learning process), it all starts to make sense.

  • @lemoi6462
    @lemoi6462 4 роки тому +6

    The interesting part will be the backward propagation, I'm really looking forward to this

  • @hasneetsingh
    @hasneetsingh 4 роки тому

    Your explanations are so clear, I really appreciate the hard work you've put in to design this series and make such complex topics so much fun to learn :) . Enjoying it a lot

  • @devinvenable4587
    @devinvenable4587 10 місяців тому

    I'm watching this as a refresher since I studied this topic a few years ago, and I find the context you provide really useful. Thanks!

  • @classicneupane6196
    @classicneupane6196 4 роки тому +13

    Understood batch size finally

    • @sentdex
      @sentdex  4 роки тому +1

      Glad we could help!

  • @JackSanRio
    @JackSanRio 4 роки тому +3

    I pre-ordered the book because this is interesting and I am eager to learn more

  • @peppep4426
    @peppep4426 4 роки тому +1

    This reminds me of the best TV series ... You finish one episode and look forward to the next ...
    Good job!

  • @clementsiow176
    @clementsiow176 4 роки тому

    Never have I been so excited for a new YouTube video, you have earned my respect

  • @shubhamdamani1057
    @shubhamdamani1057 4 роки тому +9

    Can you please provide a visual representation of how the batches pass along? I mean using animated bubbles and lines like you did in the initial videos.

  • @patrickvieira9200
    @patrickvieira9200 4 роки тому +4

    well finally looks like my linear algebra class was not a waste of time at all

  • @bradley1995
    @bradley1995 11 місяців тому +1

    I just want to again say thank you so much for these videos. They are top notch. It truly has helped me get a deep understanding compared to what many other "tutorials" have. Plus all this information being provided free. I feel blessed!

  • @yuwankumar
    @yuwankumar Рік тому +1

    After many searches I found this playlist! Thank you for making this Gold.

  • @harikalatheeswaran9206
    @harikalatheeswaran9206 4 роки тому +4

    For people watching this video... remember this golden rule:
    Say we have two matrices A & B. In order to multiply A with B, i.e. A·B,
    the number of columns of matrix A must equal the number of rows of matrix B.
    That's why A·B != B·A
    Amazing video 👍!
    Thanks a lot !
    Keep up the amazing work !
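
    A quick sketch of that rule in NumPy (shapes here are arbitrary, just for illustration):

      import numpy as np

      A = np.random.randn(3, 4)   # 3x4
      B = np.random.randn(4, 2)   # 4x2

      print(np.dot(A, B).shape)   # (3, 2): columns of A (4) match rows of B (4)

      try:
          np.dot(B, A)            # 2 columns vs 3 rows -> not allowed
      except ValueError as err:
          print("shapes not aligned:", err)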

  • @thenotsogentlecat5847
    @thenotsogentlecat5847 4 роки тому +9

    Sentdex: we're arriving at the sexy parts...
    Python: Oh, yes I am ;)

    • @kelpdock8913
      @kelpdock8913 3 роки тому

      x = we're arriving at the sexy parts...
      print(x)

  • @tymion2470
    @tymion2470 Місяць тому

    I'm very thankful for this series, I'm just learning so many new things because you're so good at explaining, and there are still 5 more videos to watch!

  • @lonnie776
    @lonnie776 4 роки тому

    You are doing a great job explaining these concepts in a way that is easy to understand. I can't wait for the next part so I am ordering the ebook.
    Great job.

  • @DRIP-DRIP-DRIP
    @DRIP-DRIP-DRIP 4 роки тому +9

    Never clicked on a video so quickly

  • @franky0226
    @franky0226 4 роки тому +5

    Notification => nnfs P4.
    Me: clicks on the button faster than the speed of light

  • @gurns681
    @gurns681 2 роки тому

    Mate, this series is unreal! Love your work

  • @TNTeon
    @TNTeon 8 місяців тому

    Hey just to let you know, this video 3 years later continues to help and encourage new programmers!
    I'm in my freshman year of high school doing all gen ed courses, but I started working on this tutorial in my free time and I'm having a blast and actually understanding everything perfectly.
    Just wanted to say thank you so much for really helping people like me in our learning of Computer Science and machine learning! These are awesome and super enjoyable!

  • @dippy9119
    @dippy9119 3 роки тому +3

    6:09 what's a fitment line? Google isn't helping me.

  • @amogh3275
    @amogh3275 4 роки тому +13

    16:19 you said it the other way around by mistake.. shouldn't it be 2.8+2, 6.8+2, -0.59+2.. (see the sketch at the end of this thread)

    • @fl7977
      @fl7977 4 роки тому

      Yeah, that really confused me more than it should have

    • @bipanbhatta2736
      @bipanbhatta2736 4 роки тому

      yes. It is called broadcasting.
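
    For reference, a minimal check of those numbers (using the batch, weights, and biases from this part of the series, as best I recall them), showing each bias being broadcast down its column, i.e. added to every row:

      import numpy as np

      inputs = np.array([[1.0, 2.0, 3.0, 2.5],
                         [2.0, 5.0, -1.0, 2.0],
                         [-1.5, 2.7, 3.3, -0.8]])
      weights = np.array([[0.2, 0.8, -0.5, 1.0],
                          [0.5, -0.91, 0.26, -0.5],
                          [-0.26, -0.27, 0.17, 0.87]])
      biases = np.array([2.0, 3.0, 0.5])

      products = np.dot(inputs, weights.T)
      print(products[:, 0])             # first column: approx [ 2.8, 6.9, -0.59]
      print((products + biases)[:, 0])  # bias 2 added to every row: approx [ 4.8, 8.9, 1.41]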

  • @3alabo
    @3alabo 4 роки тому +1

    One of the best tutorials I have seen on the topic. Greetings from Argentina!

  • @merth17
    @merth17 4 роки тому +1

    I can’t wait to see the implementation of backpropagation with the chain rule, it’s so simple when you teach it. Tysm

  • @time2learn123
    @time2learn123 Рік тому +3

    Why does the dot product switch inputs and weights when working with batches? E.g. when the input is a 1D array the calculation in the code is np.dot(weights, inputs), but for a batch it is np.dot(inputs, transposed_weights). Why doesn't it work when we transpose the inputs instead? I'm sure I'm missing something simple. Thanks for the videos, they are amazing! (There's a small sketch at the end of this thread.)

    • @joelgerlach9406
      @joelgerlach9406 Рік тому +2

      Because matrix multiplication is not commutative

    • @MrGeordiejon
      @MrGeordiejon Рік тому

      I think it is the nature of what we are doing - we are taking inputs, applying weights and biases, and delivering outputs; entering and exiting a decision. So we can't use an entrance to a neuron to exit another neuron.
      I think the demonstrations by Harrison are to cement the concept and an awareness of the shape ValueError... and he also showed how multiplication works between a matrix and a vector.
      I went back to lesson 3 for 2 things:
      1. I like inputs being the first argument so 'my doors' are labelled correctly
      2. Use np.array().T in that example
      If he had not shown us 3 before 4, I would have found it harder to appreciate transpose() - I don't think I will ever just reverse the args when I am coding this stuff.
      import this

    • @413blaze
      @413blaze Рік тому

      Something that I find interesting, and that might be related: in the specific case of 1-dimensional arrays, the shape is different from what you might expect. I am used to thinking of a matrix as rows by columns. For example, [[2,2],[3,3]] is a 2 by 2 matrix: 2 rows and 2 columns. But take [1,2,3,4]: I would have expected its shape to be (1, 4) - 1 row and 4 columns - but it is not; the shape of [1,2,3,4] is just (4,), a plain 1-D array with no row/column orientation. So the way to think about shape is by elements in a list of lists: the first entry in (x, y) is how many inner lists there are, and the second is how many entries are in each one. In his example, the inputs [1,2,3,4] have shape (4,), and when he uses [[1,2,3,4],[1,2,3,4],[1,2,3,4]] the shape becomes (3, 4). I hope that made some modicum of sense lol

    • @ollie6989
      @ollie6989 Рік тому

      You could perform the same operation by transposing the inputs, but keep in mind the matrix rule (A.B)' == B'.A', e.g. inputs.(weights.T) == (weights.(inputs.T)).T. In other words, the output of inputs · weights_transposed equals the transposed output of weights · inputs_transposed. The issue with the values probably comes from adding the biases without first transposing them (or transposing that output matrix back), as they would be added in a completely different order.
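
    To make the thread above concrete, here is a sketch (again with the series' example values, quoted from memory) showing that the two orderings give the same numbers once transposed, and why the inputs-first form is the convenient one for batches:

      import numpy as np

      inputs = np.array([[1.0, 2.0, 3.0, 2.5],
                         [2.0, 5.0, -1.0, 2.0],
                         [-1.5, 2.7, 3.3, -0.8]])       # (3 samples, 4 features)
      weights = np.array([[0.2, 0.8, -0.5, 1.0],
                          [0.5, -0.91, 0.26, -0.5],
                          [-0.26, -0.27, 0.17, 0.87]])  # (3 neurons, 4 inputs)

      a = np.dot(inputs, weights.T)     # (3, 3), one row per sample
      b = np.dot(weights, inputs.T).T   # same numbers via (A.B).T == (B.T).(A.T)
      print(np.allclose(a, b))          # True

      # inputs-first keeps rows as samples, so a (3,) bias vector broadcasts
      # one bias per neuron across every sample without any extra transpose.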

  • @ahmedyamany5065
    @ahmedyamany5065 2 роки тому +3

    Great explanation and animation, but at 14:47, [1,2,3,2.5] in Python is a 1-D array that acts as a column vector in the math, yet on paper and in the animation it is written horizontally, like a (1,4) row. To match the notation, it should be written vertically as a column, with 1 as the 1st row and 2.5 as the 4th row.

  • @Voyagedudimanche
    @Voyagedudimanche 4 роки тому +1

    Hello! I've been following you for more than 2 years and this is the best course for me! With those explanations of the math it is really cool. Thank you for this work :)

  • @aamirkhanmaarofi9705
    @aamirkhanmaarofi9705 3 роки тому

    Watching this playlist is awesome, it made my task very easy. I had been stuck on the implementation of the multilayer perceptron for two days. Thanks

  •  4 роки тому +20

    0th! Finally!

  • @afafssaf925
    @afafssaf925 4 роки тому +4

    You are wayyyyy more buff than it seems by just your face.

    • @sentdex
      @sentdex  4 роки тому +1

      I'll keep that in mind

  • @NikhilSandella
    @NikhilSandella 4 роки тому

    This is the best channel with the best content, with amazing animation. Clear explanation. I'm in love with this man. :)

  • @DMBalchemy
    @DMBalchemy 4 роки тому

    Incredible as always. This one lit a few lightbulbs. Thanks again, eagerly anticipating #5; I'll have to work through the draft to prep

  • @thomasnevolianis8616
    @thomasnevolianis8616 4 роки тому +4

    import neural_networks_from_scratch as nnfs
    best_moments = nnfs.moments(channel='Sentdex')
    print(best_moments[0])
    # "The SEXY part of deep learning"

  • @sharanbabu2001
    @sharanbabu2001 4 роки тому +1

    Loving the effectiveness! The batch size explanation was amazing!

    • @sentdex
      @sentdex  4 роки тому +1

      Glad you liked it!!

  • @rakshitjoshi823
    @rakshitjoshi823 Місяць тому

    High quality animations. Much respect!

  • @saisiddhanthgujjari8954
    @saisiddhanthgujjari8954 4 роки тому

    Amazing content sentdex, the visualizations are just top notch and make for a much clearer explanation.

  • @garymdmd
    @garymdmd Рік тому

    I am on lesson 4 now - you are such a great instructor, I love learning this stuff.

  • @FagunRaithatha
    @FagunRaithatha Рік тому

    This content is really good. Thanks for making this simple. I have been binge-watching your videos.

  • @frederick3524
    @frederick3524 4 роки тому

    I have been looking forward to this all week!

  • @realbingus
    @realbingus 4 роки тому

    At the point where I had a question, I had not fully watched the video yet, so I commented my question. Literally five seconds later you answered it in the video.
    I love the series, thanks for doing this!

  • @asu4908
    @asu4908 2 роки тому

    Doing god's work; ordered the book a while ago and finally have time to actually dive into this now - thank you so much bro

  • @jedisenpei855
    @jedisenpei855 4 роки тому

    Apart from explaining neural networks, you just explained the matrix dot product in the most intuitive way I have ever seen. I know how the dot product works by now, but I also remember how much work I had to put in to understand the concept given the lectures and texts I had at university. I had to read through some difficult math equations and really think about what the book was trying to tell me, and I also had to go through a lot of exercises to really get a grasp of it and remember it. And then you just explained it in 10 minutes and it makes perfect sense, even though I had almost forgotten what it was all about. So easy. I wish my teacher had an animation like the one you show at 9:10. Then I wouldn't have had to struggle through the math classes as much as I did in my education as an electrical engineer.

  • @Gazarodd
    @Gazarodd 4 роки тому

    I think this tutorial series will explode. Atm, it's really clear; you're fantastic

  • @hemanthkotagiri8865
    @hemanthkotagiri8865 4 роки тому

    I couldn't be more thankful to anyone than to you and Daniel. Thank you so much!

    • @sentdex
      @sentdex  4 роки тому

      Happy to do it!

  • @keshan-spec
    @keshan-spec 4 роки тому

    I love this series and I always look forward to the next one. Thank you ❤

  • @chuckf5540
    @chuckf5540 3 роки тому

    Great explanation and very clear. I look forward to all videos. What a learning process!!

  • @ryangao3564
    @ryangao3564 4 роки тому

    Hey sentdex, such addictive content in your videos. Couldn't wait for the next release any longer, so I just pre-ordered the e-book.

    • @sentdex
      @sentdex  4 роки тому

      Woo! Hope you enjoy!

  • @Mayank25
    @Mayank25 3 роки тому

    This is the best tutorial I have ever watched.. Kudos 👍🙌🙌🙌

  • @horseman3253
    @horseman3253 3 роки тому

    Wooow, this is how all subjects in school should be explained; amazing visualization, very clear!

  • @dinarakhaydarova4898
    @dinarakhaydarova4898 Рік тому

    I thought I understood all of these concepts until I watched your tutorials. It's amazing!

  • @fanasisangweni8539
    @fanasisangweni8539 4 роки тому +1

    I'm glad I found your channel man, I swear to god your videos are awesome. I'm only starting to understand ANNs after watching your videos.

  • @super7ace
    @super7ace Рік тому +1

    God-level series on neural networks. Good job, and always proud of you buddy!!

  • @johnyeap7133
    @johnyeap7133 Рік тому

    Made the batch learning benefits really clear, thank you

  • @violinplayer7201
    @violinplayer7201 4 роки тому

    Best python neural networks video, for sure

  • @andrescontrol2866
    @andrescontrol2866 3 роки тому

    Very useful video and very well explained through the series. Thanks a lot Harry!

  • @brendensong8000
    @brendensong8000 3 роки тому

    Thank you for the clear explanation! I was completely lost after several videos! You made it so clear!

  • @sciWithSaj
    @sciWithSaj 3 роки тому +1

    Thanks a lot.
    This will be my first time doing object-oriented programming.
    It was kind of daunting for me, but you made it so simple.
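
    For anyone reading along, here is roughly the kind of layer object this part of the series builds (my paraphrase of the Layer_Dense class from memory, so take the details as a sketch rather than a verbatim copy):

      import numpy as np

      np.random.seed(0)

      class Layer_Dense:
          def __init__(self, n_inputs, n_neurons):
              # small Gaussian-distributed weights, one column per neuron; biases start at zero
              self.weights = 0.10 * np.random.randn(n_inputs, n_neurons)
              self.biases = np.zeros((1, n_neurons))

          def forward(self, inputs):
              self.output = np.dot(inputs, self.weights) + self.biases

      X = [[1.0, 2.0, 3.0, 2.5],
           [2.0, 5.0, -1.0, 2.0],
           [-1.5, 2.7, 3.3, -0.8]]

      layer1 = Layer_Dense(4, 5)   # 4 features in, 5 neurons out
      layer2 = Layer_Dense(5, 2)   # must accept whatever layer1 outputs

      layer1.forward(X)
      layer2.forward(layer1.output)
      print(layer2.output)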

  • @minhtuecung5418
    @minhtuecung5418 3 роки тому

    Now that's what I call real teaching: triggering curiosity! Thank you so much, sentdex! Math rules!

  • @josephyu2110
    @josephyu2110 6 місяців тому

    Wow, your videos are just amazing; this clarity in explaining complex things is just incredible

  • @accounttwo5114
    @accounttwo5114 4 роки тому

    Fantastic, I'm really excited about the following videos!

  • @anilsarode6164
    @anilsarode6164 4 роки тому +2

    I think the single array of biases at 16:16 gets added to each individual row of the dot product matrix due to NumPy broadcasting. Thanks a lot for this video series.

  • @bartosz13
    @bartosz13 11 місяців тому

    This is nuts. Crazy good quality

  • @danielbardsen4101
    @danielbardsen4101 4 роки тому

    Hi Sentdex,
    Thanks for making my engineering/programming career so much more interesting! You really are the best

  • @benjaminsteakley
    @benjaminsteakley Рік тому

    Took me five years to find something like your videos in 2022. I dropped out of college from stress and I can finally sit down and try to understand this math. I hope the video which explains linear regression is as good as these four so far

  • @HJ-jr7zd
    @HJ-jr7zd 4 роки тому

    Great video Sentdex. Looking forward to reading the book when it's out.

  • @unionid3867
    @unionid3867 2 роки тому

    Honestly, I had almost given up looking for a tutorial on building a neural network for beginners; luckily I found your video, thank you very much

  • @tuhinmukherjee8141
    @tuhinmukherjee8141 4 роки тому

    This series is totally amazing! Thanks man

  • @raccoon_05
    @raccoon_05 Рік тому

    Thx so much for this series. You're really helping me understand the basic concepts behind this 👍👍👍