How Neural Networks Actually Work (Explained Interactively)

  • Published 21 Oct 2024
  • What is a neural network exactly? Where did it come from?
    In this video I will show you the answers to those questions and the surprising link between them and computer logic gates.
    Timestamps:
    Help this channel grow by supporting me on Patreon:
    / jacksonzheng

COMMENTS • 32

  • @captainjj7184 · 1 month ago

    Finally found someone with amazing illustrative, ingenious educational skills and a great voice too lol. Thank you so much, this just un-fried my brain after days of trying to find this specific logic-gate NN concept out there. Brilliant presentation, and thank you for sharing!

  • @GabrielCapano · 2 months ago +1

    This is the best and simplest explanation of how a neural network works that I have ever seen, thanks!

  • @joshhunt4431 · 2 months ago

    Nice video Jackson, your editing is so good now!

  • @ashbeigian233 · 3 months ago

    Clicked on this video by accident, but I have to say this is one of the best videos I have ever seen outlining logic gates and neural networks. Keep it up man, your stuff is going to take off soon if you keep putting out this type of quality content that anyone, in the field or not, can find value in.

  • @SahilThakur-p6p · 3 months ago +1

    Woah, great content man... thought for a sec you were one of those million-subs channels... keep up the quality!

  • @cipher_angel · 3 months ago +2

    How does this have only 28 views? This is great content bro. Keep it up.

    • @Jackson_Zheng · 3 months ago

      @@cipher_angel I know right? 🤣 Took me ages to programmatically animate everything in Manim too. Guess it might just be the titles and thumbnails.

    • @cipher_angel · 3 months ago

      @@Jackson_Zheng Impressive work sir. It'll pay off.

  • @ashali2226 · 2 months ago +1

    Brilliant stuff man!!

  • @ramanShariati · 2 months ago

    Bro keep up the good work. It takes time...

  • @jagermon · 2 months ago

    I liked this, very clear. Subscribed.

  • @zerosaturn416 · 2 months ago

    underrated video

  • @muhammadamjad4046 · 2 months ago

    this channel is hidden gold

  • @Adhil_parammel · 2 months ago

    Instead of weights, there is myelin thickness in a neuron to determine signal strength. Neuron activation is based on the criticality of accumulated signal strength.

  • @jorget8855 · 1 month ago

    You should have used op-amps to model ANNs, because ANNs are analog systems. Logic gates don't work because: 1. You can't model 'weights' on the input signals. 2. The output signal swings fully high or low (e.g. 3.3V to 0V) when the input thresholds are met, typically 0.7*Vin for high and 0.3*Vin for low; there is no in-between. This behaviour won't work as an activation function, because you need continuity with some gradual slope between the high and low transitions for information to propagate adequately through the network and for it to function as an ANN; otherwise it would just be an ordinary combinational digital circuit. Op-amps meet all the ANN criteria when you model each unit with negative feedback, the feedback resistors being the weights of each input neuron. The outputs of the op-amps will be similar to the ReLU function.
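
    A minimal sketch of the point above (my own illustration in plain NumPy, not code from the video): the numerical gradient of a comparator-style hard threshold is zero almost everywhere, so learning signals cannot flow through it, while a ReLU-like activation (the op-amp-with-feedback analogy in the comment) keeps a usable slope.

    import numpy as np

    def hard_threshold(x):
        # Logic-gate / comparator behaviour: output snaps fully high or low.
        return np.where(x > 0, 1.0, 0.0)

    def relu(x):
        # ReLU-like behaviour the comment attributes to an op-amp with feedback.
        return np.maximum(x, 0.0)

    def numerical_grad(f, x, eps=1e-4):
        # Finite-difference slope of the activation at x.
        return (f(x + eps) - f(x - eps)) / (2 * eps)

    xs = np.array([-1.0, -0.1, 0.1, 1.0])
    print("hard threshold grads:", numerical_grad(hard_threshold, xs))  # all ~0
    print("relu grads:          ", numerical_grad(relu, xs))            # 0 or 1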

  • @hiddendrifts · 2 months ago

    idk what it is but smth about the way you speak makes me think of some british guy telling a friend at the bar about his day

  • @jason9522 · 2 months ago

    Such a good video, really interesting! I was wondering if the code for the Neural Simulator @4:00 would be available somewhere? I would like to build such demonstrations myself, but I'm not sure how to get there yet :)

  • @kipchickensout · 2 months ago

    Very nice, wouldn't say it explained it *much* better to me than the other videos but it was definitely in a format where I would've watched an hour or more xd

  • @rajbunsha8834 · 2 months ago

    Why does it have so few views? Keep explaining more about ML in this fashion; you are sure to grow.

  • @HugoBossFC · 3 months ago

    Nice video

  • @Adhil_parammel · 2 months ago

    Logic gates: law of the excluded middle, 1/0;
    neural networks: fuzzy logic.
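
    As a tiny sketch of the contrast above (my own example with hypothetical hand-picked weights, not from the video): a crisp AND gate obeys the excluded middle and outputs exactly 0 or 1, while a single sigmoid neuron wired to act like AND gives graded, fuzzy-logic-like values for in-between inputs.

    import math

    def and_gate(a, b):
        # Crisp Boolean AND: output is exactly 0 or 1.
        return int(bool(a) and bool(b))

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    def neuron_and(a, b, w1=10.0, w2=10.0, bias=-15.0):
        # Hand-picked (hypothetical) weights: output is high only when both
        # inputs are high, but intermediate inputs land strictly between 0 and 1.
        return sigmoid(w1 * a + w2 * b + bias)

    for a, b in [(0, 0), (0, 1), (1, 1), (0.6, 0.7)]:
        print(a, b, "gate:", and_gate(a, b), "neuron:", round(neuron_and(a, b), 3))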

  • @godmodedylan5563 · 3 months ago

    the start was really good but you just need a way better outro for this video.

  • @midpiano3067 · 2 months ago +2

    Seriously? All of this work has only 500 views???

  • @NOTMEVR · 2 months ago

    I've been researching this topic for a while now, trying to build one in a Roblox game out of logic gates, and I guess this helped 🤣💀😅

  • @wawan_ikhwan · 2 months ago

    Speaking about the energy comparison with a real brain:
    actually, I'm wondering why computer systems nowadays don't use analog computing.
    I mean, neural networks tolerate inexact values, so analog technology shouldn't be a concern.
    Instead, Nvidia capitalism is on top currently.

    • @Jackson_Zheng · 2 months ago

      @@wawan_ikhwan Dedicated ASICs by Groq are being made that perform almost as well. The issue is noise, I think, plus the fact that digital components are cheap to mass-manufacture and a lot smaller than analog components, since analog hasn't had the same amount of R&D as digital for roughly the last three decades.

    • @wawan_ikhwan · 2 months ago

      @@Jackson_Zheng Noise and inexact values are the same thing, so noise is not a concern either.
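
      To make the point in this thread concrete, here is a rough sketch (my own, arbitrary untrained weights, plain NumPy, not from the video): perturbing every weight of a small ReLU network with ~1% Gaussian noise, as a crude stand-in for analog imprecision, should change its output only slightly.

      import numpy as np

      rng = np.random.default_rng(0)

      # Arbitrary fixed weights for a tiny 3-4-1 network (hypothetical, untrained).
      W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)
      W2, b2 = rng.normal(size=(1, 4)), rng.normal(size=1)

      def forward(x, w1, c1, w2, c2):
          h = np.maximum(w1 @ x + c1, 0.0)   # hidden layer with ReLU activation
          return w2 @ h + c2                 # linear output layer

      x = np.array([0.5, -1.0, 2.0])
      clean = forward(x, W1, b1, W2, b2)

      # Multiply every weight by (1 + 1% Gaussian noise) to mimic analog drift.
      noisy = forward(x, W1 * (1 + 0.01 * rng.normal(size=W1.shape)), b1,
                      W2 * (1 + 0.01 * rng.normal(size=W2.shape)), b2)

      print("clean output:", clean)
      print("noisy output:", noisy)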