I Built a Neural Network from Scratch

  • Published 21 Nov 2024

COMMENTS • 546

  • @Green-Code
    @Green-Code  5 місяців тому +340

    I'm not an AI expert by any means, I probably have made some mistakes. So I apologise in advance :)
    Also, I only used PyTorch to test the forward pass. Apart from that, everything else is written in pure Python (+ use of Numpy).

    • @CrushedAsian255
      @CrushedAsian255 5 місяців тому +20

      "umm Numpy is a 'prebuilt framework'" ☝🤓

    • @przemysawtomala9304
      @przemysawtomala9304 5 місяців тому +5

      5:44 gradient descent won't ever jump like that with your implementation (without momentum), because the function is rising at that point

    • @ProgrammerRajaa
      @ProgrammerRajaa 5 місяців тому

      Is that code available on GitHub? If so, can you share the link please?

    • @genildademeloescrituraviva
      @genildademeloescrituraviva 5 місяців тому +5

      Building a neural network from scratch? what about IN scratch?

    • @emmanuelikhide8998
      @emmanuelikhide8998 5 місяців тому

      Hey, nice stuff @Green-Code, I did enjoy the video. It's nice that the YouTube algorithm recommended it, because I'm also building a NN from scratch, although I'm currently debugging the hell out of it and trying to optimize 😂
      Thanks for the content, amazing stuff

  • @fragly
    @fragly 5 місяців тому +673

    Ngl using that chef hypothetical is such a neat way of explaining how a neural network functions

    • @akshikaakalanka
      @akshikaakalanka 5 місяців тому +3

      Yeah loved it!

    • @raynlaze1339
      @raynlaze1339 3 місяці тому +1

      lol.. "neat" way

    • @Jamdoe
      @Jamdoe 2 місяці тому

      As always, any example or concept can be explained using food, or anything food related.

    • @ByteBringer
      @ByteBringer 12 днів тому

      Yea, useful for explaining neural nets to 5-year-olds...

  • @Yuzuru_Yamazaki
    @Yuzuru_Yamazaki 5 місяців тому +324

    "let's think of every neuron as a chef... Now , Let 'em cook 🗿" ahh explanation 😭

  • @burrdid
    @burrdid 5 місяців тому +1568

    now do it in scratch

    • @zennihilator3724
      @zennihilator3724 5 місяців тому +61

      What does scratch mean? Does he have to make his own programming language too? Does he have to make his own computer? Does he have to design his own pcbs inside the computer? Does he have to put his own layer of silicon in a resin case and dope it? Does he have to generate his own electricity to power his house?

    • @KA-kf4ke
      @KA-kf4ke 5 місяців тому +145

      Scratch.
      'Website' Scratch.
      Y'know... the coding language for babies...

    • @ipigtaiwan
      @ipigtaiwan 5 місяців тому

      @@zennihilator3724 Scratch, the programming language

    • @insaanonline
      @insaanonline 5 місяців тому +12

      @@zennihilator3724 that means explaining all the code one by one

    • @mesh_devo
      @mesh_devo 5 місяців тому

      @@zennihilator3724
      it has 2 meanings:
      the first one is "from zero" or "step by step",
      e.g. "I built my website from scratch" (meaning manually, from zero);
      the second is Scratch, a website and app that helps you create simple games

  • @joshcannon6704
    @joshcannon6704 Місяць тому +21

    I have a few different neural networks I made from scratch in Excel; they run slower, but they really helped me understand all the math that goes into them

    • @AKG58Z
      @AKG58Z Місяць тому +1

      What? How?

    • @junacik9967
      @junacik9967 24 дні тому +2

      I guess this is the really hardcore way. I hope you're not heating your toast on the sun :D

    • @joshcannon6704
      @joshcannon6704 23 дні тому +1

      @@AKG58Z ah, maybe I should make a video on how to. I basically have all the formulas in there and the data set and make it run with a macro

    • @AKG58Z
      @AKG58Z 23 дні тому

      @@joshcannon6704 yeah sure don't forget to tag me

    • @ik6071
      @ik6071 16 днів тому

      @@joshcannon6704 link the video when u do make it

  • @SomethingSmellsMichy
    @SomethingSmellsMichy 5 місяців тому +78

    3:12 the equation states that the loss of a network that returns probabilities with values from 0 to 1 is the expected output × the negative log of the actual output. The reason this works is that -log on the interval from 0 to 1 gives a higher loss as the probability approaches 0 and almost no loss as it approaches 1. Multiplying that by the expected probability makes it so that the network only adjusts the values for the outputs you want to approach 1.
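
    For reference, a minimal NumPy sketch of that loss (the function name, the clipping epsilon, and the example values are illustrative, not from the video):

      import numpy as np

      def cross_entropy_loss(expected, predicted, eps=1e-12):
          # -sum(y * log(p)): only the outputs where expected == 1 contribute,
          # so the network is pushed to move exactly those probabilities toward 1
          predicted = np.clip(predicted, eps, 1.0)  # avoid log(0)
          return -np.sum(expected * np.log(predicted))

      y = np.zeros(10); y[7] = 1.0                       # true digit is 7 (one-hot)
      print(cross_entropy_loss(y, np.full(10, 0.10)))    # ~2.30: unsure network, high loss
      p = np.full(10, 0.01); p[7] = 0.91
      print(cross_entropy_loss(y, p))                    # ~0.09: confident and correct, low loss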

    • @victor3btn598
      @victor3btn598 5 місяців тому +3

      Binary Cross entropy

    • @Exkolix
      @Exkolix 5 місяців тому +9

      this is why I love nerds🤩

    • @quantumHumans
      @quantumHumans Місяць тому +2

      @@Exkolix well put, same

    • @gym_spur
      @gym_spur 16 днів тому

      that all went over my head bro, wish I was that smart too

  • @DK-ox7ze
    @DK-ox7ze 5 місяців тому +37

    It's pretty cool to implement all this from scratch. I had studied all this a few months ago but forgot most of it because I never practiced it. But this served as a refresher.

    • @NihadBadalov
      @NihadBadalov 5 місяців тому +1

      hi, where did you study it?

    • @ammglitch03
      @ammglitch03 3 місяці тому +1

      Same, I had a summer class, but we used the TensorFlow and OpenCV libraries instead of coding from scratch.
      The concepts are very abstract, but videos like these help me remember and realize what I was doing exactly.

  • @glumpfi
    @glumpfi 5 місяців тому +14

    I went through the same adventure :D I wrote a neural net from scratch in C++ just to get a deep understanding. The backpropagation part took me a while to figure out. I only got to an accuracy of 94% on MNIST, maybe because I still haven't implemented optimizers and batches. Thanks for sharing :)

  • @adityavardhanjain
    @adityavardhanjain 3 місяці тому +23

    I was really shocked in my second year when I too realised that it is, in fact, all math.

  • @pentasquare
    @pentasquare 5 місяців тому +18

    "Wait it's all maths?"
    "Always has been"

  • @exor6100
    @exor6100 4 місяці тому +5

    Having also made a neural network from scratch, I would recommend that anyone else undertaking such a project buy Neural Networks from Scratch by Kinsley and Kukiela. I don't know if this guy used that book, but it's a wonderful reference.

  • @Alex-ns6hj
    @Alex-ns6hj Місяць тому +2

    I know nothing about this. Just a noob here in sophomore year, learning maths and exploring STEM. I find this really fascinating, even though I can't really understand it yet because the maths is beyond me right now and I suck at coding. I really want to learn this. I can just tell there's a beauty to it I can't yet see, and I want to uncover it no matter how long it takes me.

  • @wealth-wise2day
    @wealth-wise2day 5 місяців тому +3

    "its getting 40-50% accuracy it sucks"
    I know this seems bad greencode but you just taught a computer how to recognize things that we previously thought were only recognizable by humans. thats not bad. good job.

  • @ShabJimJets
    @ShabJimJets 5 місяців тому +10

    very nice mr green code, very nice. You deserve a lot more subs for how good these videos are. Can't wait to see what the future holds

  • @benedictbrophy5651
    @benedictbrophy5651 2 місяці тому +1

    Honestly, I would love a video (series) that explains and derives the math

  • @santiagogonzalez-hc1vp
    @santiagogonzalez-hc1vp 5 місяців тому +2

    What a great vid, new sub
    You accurately summarized two weeks of classes from an ML Master's where I didn't sleep
    Great job doing that and understanding the fundamentals
    Ignore the bad comments
    Keep up the pace

  • @Ari-pq4db
    @Ari-pq4db 5 місяців тому +8

    Subscribed, can't wait for such more informative videos ❤🔥

  • @Hangglide
    @Hangglide 5 місяців тому

    thanks! really cool! Especially since I just learned about neural networks, and watching your video reinforced what I learned in class.

  • @lukemaj_
    @lukemaj_ 2 місяці тому

    this was super cool! Keep up with these great vids! Nicely explained; you convinced me to try implementing a custom neural net on my own ^^

  • @medakshchoudhary
    @medakshchoudhary 5 місяців тому +2

    loved the way you explained all of this. Can you pleaseeeee make a beginner guide tutorial on how to get into exactly this type of thing?
    you got a sub

  • @dhiraj6727
    @dhiraj6727 4 місяці тому

    The background music makes this topic seem really, really cool. Or maybe it's just that I have watched gaming videos with this kind of bg music, so videos with similar bg music make me feel interested in the topic.

  • @chrisx2342
    @chrisx2342 5 місяців тому +1

    ur explanation is absolutely fantastic!!

  • @abdulhadiaa3372
    @abdulhadiaa3372 4 місяці тому

    You are the best man, you managed to turn a boring topic into a movie. I think you are going places in the content creation industry. Keep going man 🙌🥇

  • @golgiguy456
    @golgiguy456 3 місяці тому +3

    That was cool. Now do it "in" scratch

  • @shivanta2
    @shivanta2 Місяць тому

    You explained it so well, I loved it even though I understand the math behind it. ❤

  • @softwareengineer8923
    @softwareengineer8923 15 днів тому

    It was extremely useful, thanks a lot 👍

  • @alienwhitewalker7284
    @alienwhitewalker7284 3 місяці тому

    Man mastered "If you can't explain it simply, you don't understand it well enough."

  • @احمدعاطف-ت7ذ9ص
    @احمدعاطف-ت7ذ9ص 3 місяці тому

    This is my first time watching you. Bro you are awesome, keep going, I love your explanation.

  • @divandebruin5767
    @divandebruin5767 3 місяці тому +1

    Hey man just wanted to find out where you made your avatar! Thanks ahead 🙏🏻

  • @EchoPrograms
    @EchoPrograms 5 місяців тому +2

    I just did the same thing a few days ago lol! (Also from scratch). I did it in JavaScript so I didn't even have numpy. My matrix class is like 150 lines long lol

  • @enerz9135
    @enerz9135 5 місяців тому

    Your Explanation Is Very Simple, It's Very Easy To Understand

  • @nicolasdelphino6434
    @nicolasdelphino6434 3 місяці тому

    Dude, I'm actually in master's degree classes, not understanding shit. I just watched your vid, and now it all looks as bright as the sun

  • @Badaboombadaboombaby
    @Badaboombadaboombaby 3 місяці тому +3

    Mad respect for animating your every syllable! Amazing lipsyncing animation ❤❤❤

  • @LudieMasu
    @LudieMasu 3 місяці тому

    your creativity and passion shine through every project!

  • @Bigleyp
    @Bigleyp 5 місяців тому +1

    Now do it, but where there are regions, a neuron has multiple connections, and it is either on or off based on how many recent signals go into it.

  • @adebayokehinde1580
    @adebayokehinde1580 5 місяців тому

    Making a tutorial is one thing, but adding animations is 🔥 Great!!

  • @helved807
    @helved807 5 місяців тому +2

    This was super interesting! Mainly because this is something I've wanted to do myself, but I've had some trouble implementing backpropagation. Any tips on how to implement it?

    • @SomethingSmellsMichy
      @SomethingSmellsMichy 5 місяців тому +2

      *Disclaimer: in writing this explanation I assumed you understand calculus, but all the steps are broken down and the equations are solved fully.
      In this video, you were introduced to the loss function. This function does what its name says: it quantifies the difference, or incorrectness, of your neural network's output.
      Ideally, you want your loss function to be 0 or close to 0. Assume your loss function is either categorical cross-entropy (like in this video) or a more common approach: Mean Squared Error (which has the formula 1/2(y-o)^2, where y is your expected output and o is your actual output).
      *In case you want to paste this into an appropriate calculator/document, and because I've been practicing, here is the LaTeX version of that equation: \frac{1}{2}\left(y-o\right)^{2}
      Since these functions are never negative, you can assume that your loss function reaches 0 at its absolute minimum (usually you'll hit a local minimum for complicated problems).
      For instance, we can tell whether a function f hits a minimum or a maximum by checking whether f'(x)=0 (same as 0 = d/dx f(x)), i.e. whether its slope is equal to 0.
      If the slope at a point is positive the function is rising there, and if it is negative the function is falling, so we can find the x position of the local minimum by iteratively changing x with x -> x - f'(x) * lr (lr being the learning rate).
      x usually represents the input to the function, but since we want to change the weights and not the input, we can treat the network's input as constant and the weights as the thing we differentiate with respect to.
      So since you've already done the forward pass, you likely already know that a network can be structured like this: o = f (W * x). Where o is the output, f is your activation function, W is the weights matrix, and x is your vector of inputs.
      Assuming you're using the Mean Squared Error, let's try to find the function's derivative with respect to your weights. The entire function is 1/2(y - f (W * x))^2.
      The chain rule tells us that we can find the derivative dE/dW (E being the cost function) by solving dE/do * do/dW (see how the do terms cancel).
      *For the sake of visuals, this is the LaTeX equation: \frac{dE}{do} \cdot \frac{do}{dW}
      dE/do works out to (o - y), i.e. (f(W * x) - y), and do/dW would be f '(W * x) * x. Meaning that dE/dW would be (f(W * x) - y) * f '(W * x) * x.
      So now you can update your weights with W -> W - lr * dE/dW. And for multiple layers, you need to pass dE/dx back to the previous layer, which in this case is (f(W * x) - y) * f '(W * x) * W (notice how you multiply by the weights instead of the input).
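
      Collecting those equations in one place (this is just a restatement of the comment above; η is the learning rate called lr there, and o = f(Wx)):

        E = \tfrac{1}{2}(y - o)^{2}
        \frac{dE}{dW} = \frac{dE}{do} \cdot \frac{do}{dW} = (o - y)\, f'(Wx)\, x
        W \leftarrow W - \eta \, \frac{dE}{dW}
        \frac{dE}{dx} = (o - y)\, f'(Wx)\, W \qquad \text{(the error handed back to the previous layer)}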

    • @SomethingSmellsMichy
      @SomethingSmellsMichy 5 місяців тому +1

      I'd recommend making a class that deals with a single layer of the network. The class should have a method called backward (for the backward pass or backpropagation, but you can call it whatever you want) that takes in the error and the inputs for that layer. In the method, multiply the error by the derivative of your activation function and by the inputs, and subtract that result (times the learning rate) from your weights. The method should also return the error multiplied by the derivative of your activation function BUT then multiplied by your weights (preferably before you update them). If you want, I can give you some code to reference.
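
      A minimal sketch of the layer class described above, assuming NumPy, a ReLU activation, and the (o - y) sign convention from the earlier reply; the class and method names are just illustrative:

        import numpy as np

        class Layer:
            def __init__(self, n_in, n_out, lr=0.1):
                self.W = 0.01 * np.random.randn(n_out, n_in)  # small random weights
                self.lr = lr

            def forward(self, x):
                return np.maximum(0.0, self.W @ x)            # ReLU(Wx)

            def backward(self, error, x):
                # error is dE/do for this layer's output, x is this layer's input
                delta = error * (self.W @ x > 0)              # times the ReLU derivative
                grad_x = self.W.T @ delta                     # error for the previous layer (before the update)
                self.W -= self.lr * np.outer(delta, x)        # gradient-descent weight update
                return grad_x

        # usage: one training step on a single example (shapes only, illustrative)
        layer = Layer(n_in=4, n_out=3)
        out = layer.forward(np.ones(4))
        prev_error = layer.backward(error=out - np.array([1.0, 0.0, 0.0]), x=np.ones(4))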

    • @UrFada
      @UrFada 5 місяців тому +1

      @@SomethingSmellsMichy I love your explanation. I understood 70% of it even with my little calculus and linear algebra background, as I'm in grade 0, but I've started practicing calculus and linear algebra just for making my own neural network. The only part I don't really get is the end, and I find it hard to code something unless I can fully visualize how it works, so it has been troublesome trying to fully understand it

    • @SomethingSmellsMichy
      @SomethingSmellsMichy 5 місяців тому

      @@UrFada I can imagine that the end starts to become more complicated as it delves more into symbols. I was kinda trying to wrap it up because of the character limit on replies.

    • @UrFada
      @UrFada 5 місяців тому +1

      @@SomethingSmellsMichy Ahh yes, I will try to better visualize the comment later and maybe I will be able to understand it, but overall thank you for the explanation

  • @rnts08
    @rnts08 5 місяців тому

    Your code bullet is better than codebullet. You are what everyone hoped that he would be with his enigma video. Keep it up! 😂

  • @abdelrahmanmahmoud-fc5yh
    @abdelrahmanmahmoud-fc5yh 4 місяці тому

    Cleverly and simply explained, great video

  • @afrateam6241
    @afrateam6241 5 місяців тому

    Only a genius could understand how genius you are. Wow 🎉

  • @lxhub
    @lxhub 2 місяці тому

    Thank you brother. you're a LEGEND!!!

  • @commissariomontanaro2931
    @commissariomontanaro2931 5 місяців тому +14

    why does my acoustic pattern recognition match your avatar poses to Code Bullet?
    Nice video tho, now do it in C to assert dominance

    • @ericdanfunk3966
      @ericdanfunk3966 5 місяців тому +1

      This content creator is a clone of Code Bullet's style 🥱

    • @Smurdy1
      @Smurdy1 5 місяців тому +4

      I think it's funny that I'm reading this while waiting for my C++ neural network to train. My code isn't even that optimized, and my program only took about 5 minutes to go through the process of training a neuron 49 billion times. It's insane how much faster C++ is at machine learning (and everything else) than Python.

    • @ggsap
      @ggsap 3 місяці тому

      @@Smurdy1 pytorch is written in C++. numpy is written in C.

    • @Smurdy1
      @Smurdy1 3 місяці тому

      @@ggsap But still, those only make up a portion of the code in most Python AIs. The rest runs as fast as Python does.

    • @ggsap
      @ggsap 3 місяці тому

      @@Smurdy1 did you do a benchmark?

  • @mohamedyacinehamiham6019
    @mohamedyacinehamiham6019 11 днів тому

    keep going mate, we believe in you

  • @jollyehiabhi360
    @jollyehiabhi360 Місяць тому

    I love your explanation. Which parameter (hyperparameter) did you tweak to move from 50% accuracy to 97% accuracy?

  • @CastyInit
    @CastyInit 4 місяці тому +1

    relu is just math.max(0,x), or just a fancy way of saying "if the output of a neuron is negative, set it to zero; otherwise keep it"
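
    In NumPy terms that's just (illustrative):

      import numpy as np
      relu = lambda x: np.maximum(0, x)        # negatives become 0, positives pass through
      print(relu(np.array([-2.0, 0.5, 3.0])))  # -> [0.  0.5 3. ]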

  • @ICEcoldAryan
    @ICEcoldAryan 4 місяці тому

    Now I know why there are so many maths classes in my uni curriculum.

  • @BlackKiller-j7m
    @BlackKiller-j7m Місяць тому

    Bro, it's an awesome explanation, and the visualization is very helpful for understanding. Keep it up. I hate the maths, but you skip it and explain things simply, which is good for many people. I think you should make a video on the maths you need for AI and machine learning.

  • @johnyohan7022
    @johnyohan7022 4 місяці тому

    i have a physics end-sem exam tmr, this has no connection to that, yet here i am watching this

  • @TandaiAnxle
    @TandaiAnxle 4 місяці тому

    You should upload more, def one of my new fav youtubers in general

  • @cubix02
    @cubix02 3 місяці тому

    I’ve got to say-this video is like a shot of espresso for my brain cells! ☕

  • @Monirulislam-c1t
    @Monirulislam-c1t 3 місяці тому +1

    9:00 the f*cking British moment😭😭

  • @BabyYoda5555
    @BabyYoda5555 4 місяці тому

    Cool. This explains forward and back propagation. Now explain forward-forward propagation.

  • @jonathandyer6385
    @jonathandyer6385 4 місяці тому +1

    Description: "do not click on this: "
    Me: *Clicks it*
    Me: *Sees a youtube page that says: "are you sure you want to leave youtube?"*

  • @Siddhi_99
    @Siddhi_99 2 місяці тому

    You are so weird, yet funny and interesting. Loved the way u explained it

  • @familytharun3924
    @familytharun3924 5 місяців тому

    the video is too gud bro, u have put in a lot of effort..

  • @KeithReactsTECH
    @KeithReactsTECH 4 місяці тому

    YOUR EDITING SKILLS ARE MONEY! YOU SHOULD MAKE VIDEOS IN A DIFFERENT GENRE THAN CS VIDEOS, BETTER RPM AND MORE VIEWERS

  • @suvojitsengupt
    @suvojitsengupt 5 місяців тому

    cool, just subscribed, hope u will continue to post this kind of stuff, loved your work ...

  • @progerua
    @progerua Місяць тому

    yooo, very cute video, but who did u learn all that stuff from?

  • @-PeterAndrewNamoraMarpaung
    @-PeterAndrewNamoraMarpaung 2 місяці тому +1

    make it from scratch, on scratch, with a scratch, and top it with a neural network title made from SCRATCH

  • @gabrielrock
    @gabrielrock 5 місяців тому

    awesome video bro, congrats! I'd like to see a video on how to go from that to a generative AI or a RAG, u know? Are u planning some video like that, or do u have some reference?

  • @trickyfox9518
    @trickyfox9518 Місяць тому +1

    This guy has been calling us dumb for 9 minutes

  • @Avion_Animz
    @Avion_Animz 5 місяців тому

    Hey man, I just discovered your channel and I love it. I've been wondering, how long have you been doing this, programming I mean?

  • @weslycosta348
    @weslycosta348 5 місяців тому

    such an underrated channel man, keep goin!

  • @TheFuture36520
    @TheFuture36520 5 місяців тому

    Bruh is implementing mathematical equations like Bernoulli's theorem and second-order differential equations 😂.
    You're the best brother ❤️ 💙

  • @rinorajeti4
    @rinorajeti4 Місяць тому

    I love the way this video is made.
    - "Some of these guys invented the dot product"
    I can't stop laughing at this one.

  • @GauravGiri-i6r
    @GauravGiri-i6r 4 місяці тому

    Bro, love the video....
    Can you go in-depth on the coding part? Like every step of the code you used, step by step.

  • @cen-t
    @cen-t 5 місяців тому

    Nice video, btw do you do live coding sessions? Like on Twitch or here on YT?

  • @sarimshaikh5224
    @sarimshaikh5224 5 місяців тому +1

    Sir, u dropped this, sir 👑, please wear it from the next video

  • @MrJack-ku6qh
    @MrJack-ku6qh 7 днів тому

    6:19 that is the reality of programming😭😭

  • @rezamokhtari165
    @rezamokhtari165 2 місяці тому

    I was stuck with bugs as I began training my model, until I saw this; now I'm stuck with more bugs 😅

  • @nogabs4422
    @nogabs4422 3 місяці тому

    how did you create your character? was it written in code XD
    btw i like the character and the simple movements, how do you do that? i'm wanting to learn how to youtube but i'm not a fan of being on camera

  • @abadishaikh
    @abadishaikh 5 місяців тому

    I would watch a 2h video of this. I don't want cuts and entertainment.

    • @Green-Code
      @Green-Code  5 місяців тому

      I know, but 2h of this is a whole ass movie. It takes me like 50 hours or more just to make one of these videos :)

  • @stayhappy-forever
    @stayhappy-forever 5 місяців тому +5

    Can you open source the code if you don't mind? I worked on the same project, but there are some improvements I feel like I can make.
    Edit: I implemented SGD and got 94%-95% accuracy, with 16 hidden neurons and only a single hidden layer (10 epochs). Is it possible for you to share your model architecture? Thanks!
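
    For reference, a shape-level sketch of the architecture this comment describes (784 MNIST pixels → a single hidden layer of 16 neurons → 10 outputs); the ReLU/softmax choices and all the names here are assumptions, not something the commenter specified:

      import numpy as np

      rng = np.random.default_rng(0)
      W1 = 0.01 * rng.standard_normal((16, 784))   # input -> 16 hidden neurons
      W2 = 0.01 * rng.standard_normal((10, 16))    # hidden -> 10 digit classes

      def forward(x):                  # x: flattened 28x28 image, shape (784,)
          h = np.maximum(0.0, W1 @ x)  # assumed ReLU hidden activation
          logits = W2 @ h
          e = np.exp(logits - logits.max())
          return e / e.sum()           # assumed softmax over the 10 digits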

    • @yds6268
      @yds6268 5 місяців тому

      Can you share yours? I know links in comments are impossible, but the repo name would be amazing

    • @stayhappy-forever
      @stayhappy-forever 5 місяців тому

      @@yds6268 Do you want the whole code, or just an understanding/run-through of what I made?

  • @veifingtps4697
    @veifingtps4697 5 місяців тому

    I remember my first time doing this, I went crazy because I didn't want to do the math, but it wasn't so bad lol. Crazy good video and visualization

  • @nesquickyt
    @nesquickyt 5 місяців тому

    Your videos are really good and interesting 🔥

  • @sirosala
    @sirosala 5 місяців тому

    Excellent video!! Very well explained, a brilliant contribution of knowledge. Greetings from Rosario, Argentina.

  • @Podcast.Motivator
    @Podcast.Motivator 5 місяців тому

    Awesome bro. Waiting for more videos like this.

  • @akhildonthula6160
    @akhildonthula6160 5 місяців тому

    Please do more videos, i luv the way you do them 💥

  • @howelljian4130
    @howelljian4130 3 місяці тому

    i just realized his channel description says "Hi! I make videos about AI and programming :)" (ASCII)

  • @KSBallvardhan-n3v
    @KSBallvardhan-n3v 4 місяці тому +3

    Bro, can u pls provide a roadmap, with resources, for learning to do what you do? It would be a great help, thank you

  • @marlonochoaj
    @marlonochoaj Місяць тому

    Awesome video man. 🎉❤

  • @PythonIsGod
    @PythonIsGod 5 місяців тому

    Wow man, Python is love, love is Python.

  • @kelp3835
    @kelp3835 4 місяці тому +1

    I have a fun fact: if you've been into computer science or know a lot about it, you would know that in the future AI will be so terrifying and advanced that you can't tell if you are talking to an AI or not 😨

    • @christopherjenkins6174
      @christopherjenkins6174 3 місяці тому

      Future? Over half the total traffic on any given page on the internet is non-human bruv. We’re already there 😭 a huge portion of comments on all the social media platforms are generative AI

  • @butterdogoficcial
    @butterdogoficcial Місяць тому

    Wow that's great! now do it in C.

  • @dhanushlnaik
    @dhanushlnaik 4 місяці тому

    didn't understand shit but you actually made me feel like i know a lot of shit. Thanks

  • @Hoover04
    @Hoover04 5 місяців тому

    bro oversimplified one of the most complicated concepts.... your education is insane i hope i get to that level soon

  • @SamIsPoggers
    @SamIsPoggers Місяць тому

    good video, i have a question however (might be a silly question but) would making and training a chatbot be similar to this, or would that be a different thing?

    • @Green-Code
      @Green-Code  Місяць тому

      I'm not an expert on that, but that has to do with transformers (which is a different neural network architecture). So it's related, but not exactly the same. If you're interested search up Andrej Karpathy. He's the goat

    • @SamIsPoggers
      @SamIsPoggers Місяць тому

      @@Green-Code ohh okay thanks, I really am interested in learning about this ai stuff, I recently just started trying to find good ai videos

  • @wolfgangsell3233
    @wolfgangsell3233 5 місяців тому +1

    W video! Can you show off your code for the beginners here?

  • @asdfsdfsdfsdf
    @asdfsdfsdfsdf 2 місяці тому

    I really don't know wtf happened, but I watched the whole video, it was fun!

  • @hrshlgunjal-1627
    @hrshlgunjal-1627 5 місяців тому +1

    Amazing video bro. Nice explanation, truly 👍

  • @gentleman.editsss
    @gentleman.editsss Місяць тому

    Next video: Building a Neural Network Using Assembly👨‍💻

  • @brawldude2656
    @brawldude2656 5 місяців тому +1

    Backpropagation couldn't be explained simpler

  • @yassinechritt8816
    @yassinechritt8816 5 місяців тому

    Great explanation! keep on posting great stuff 😎😎

  • @aierik
    @aierik 17 днів тому

    Best explained...

  • @Sumii4242
    @Sumii4242 5 місяців тому

    Does someone know what the music at 6:35 is?
    ah and also, this is seriously such a cool video, great job mate!

  • @baonguyen4278
    @baonguyen4278 5 місяців тому

    Great work bro! u got another subscriber !!

  • @hssengamer4591
    @hssengamer4591 18 годин тому

    Ay man, i used your video as a source for my report, i hope you don't mind
    I'm in computer engineering and the report is about Neuromorphic Computing Using FPGAs

  • @rizamaeburlat8801
    @rizamaeburlat8801 4 місяці тому +4

    3:39 How the heck is that math, that's absolutely crazy, i bet it took people a long time to solve this

  • @Concreteblockmachineug
    @Concreteblockmachineug 4 місяці тому

    When you ask the network if it's a 7 or not... does it initially have a stored reference parameter set to compare with, in order to know if the given photo is of a 7?

  • @Evanarrations
    @Evanarrations 3 місяці тому +1

    How did you make that talking cartoon?