How Neural Networks Actually Work (Explained Interactively)
- Published 21 Oct 2024
- What is a neural network exactly? Where did it come from?
In this video I will show you the answers to those questions, and the surprising link between neural networks and computer logic gates.
Timestamps:
Help this channel grow by supporting me on Patreon:
/ jacksonzheng
Finally found someone with amazing illustration and an ingenious gift for teaching, with a great voice too lol. Thank you so much, this just un-fried my brain after I spent days trying to find this specific logic-gate NN concept explained anywhere. Brilliant presentation and thank you for sharing!
This is the best and simplest explanation of how a neural network works I have ever seen, thanks!
Nice video Jackson, your editing is so good now!
@@joshhunt4431 Thanks man
Clicked on this video by accident, but I have to say this is one of the best videos I have ever seen outlining logic gates and neural networks. Keep it up man, your stuff is going to take off soon if you keep putting out this type of quality content that anyone, in the field or not, can get value out of.
woah great content man...thought for a sec you were one of those million subs channel...keep up the quality!
How does this have only 28 views? This is great content bro. Keep it up.
@@cipher_angel I know right? 🤣 Took me ages to programmatically animate everything in Manim too. Guess it might just be the titles and thumbnails.
@@Jackson_Zheng Impressive work sir. It'll pay off.
Brilliant stuff man!!
@@ashali2226 Thank you!
Bro keep up the good work. It takes time...
I liked this, very clear. Subscribed.
underrated video
this channel is hidden gold
Instead of weights, biological neurons have myelin thickness to determine signal strength. Neuron activation is based on whether the accumulated signal strength crosses a critical threshold.
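The analogy above maps directly onto the standard artificial neuron: per-input weights scale signal strength (like myelin thickness), and the neuron fires only when the accumulated signal crosses a threshold. A minimal sketch, with made-up example values:

```python
# Minimal threshold neuron: weighted sum of inputs, fire if it
# reaches the critical threshold (the "criticality" in the comment).
def neuron(inputs, weights, threshold):
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Two inputs, each weighted 0.6; the neuron fires only when both
# are active (0.6 + 0.6 = 1.2 >= 1.0).
print(neuron([1, 1], [0.6, 0.6], 1.0))  # -> 1
print(neuron([1, 0], [0.6, 0.6], 1.0))  # -> 0
```

With these particular weights and threshold the neuron behaves like an AND gate, which is exactly the link to logic gates the video explores.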
You should have used op amps to model ANNs, because ANNs are analog systems. Logic gates don't work because: 1. You can't model 'weights' on the input signals. 2. The output signal swings fully high or low (e.g. 3.3 V or 0 V) once the input thresholds are met, typically 0.7*Vin for high and 0.3*Vin for low; there is no in-between. This behavior won't work as an activation function, because you need continuity with some gradual slope between the high and low transitions for information to propagate through the network adequately; otherwise it would just be an ordinary combinational digital circuit. Op amps meet all the ANN criteria when you model each unit with negative feedback, the feedback resistors being the weights of each input neuron. The outputs of the op amps will be similar to the ReLU function.
idk what it is but smth about the way you speak makes me think of some british guy telling a friend at the bar about his day
😂
Such a good video, really interesting! I was wondering if the code for the neural simulator @4:00 is available somewhere? I would like to build such demonstrations myself but I'm not sure how to get there yet :)
Very nice, wouldn't say it explained it *much* better to me than the other videos but it was definitely in a format where I would've watched an hour or more xd
Why does it have so few views? Keep explaining more about ml in such fashion, you are sure to grow.
Nice video
Logic gates: law of the excluded middle (1/0).
Neural networks: fuzzy logic.
the start was really good but you just need a way better outro for this video.
@@godmodedylan5563 Noted
Seriously? All of this work has only 500 views???
@@midpiano3067 😭
I've been researching this topic for a while now, trying to build one in a Roblox game of logic gates, and I guess this helped 🤣💀😅
Speaking about the energy comparison with a real brain: actually, I'm wondering why computer systems nowadays don't use analog computing. I mean, neural networks tolerate inexact values, so analog imprecision shouldn't be a concern. Instead, Nvidia currently dominates.
@@wawan_ikhwan Dedicated ASICs by Groq are being made that perform almost as well. The issue is noise, I think, plus the fact that digital components are cheap to mass-manufacture and a lot smaller than analog components, since analog hasn't had the same amount of R&D as digital for roughly the last three decades.
@@Jackson_Zheng Noise and inexact values are the same thing, so noise is not a concern either.