The Universal Approximation Theorem for neural networks
- Published Nov 1, 2017
- For an introduction to artificial neural networks, see Chapter 1 of my
free online book: neuralnetworksanddeeplearning....
A good series of videos on neural networks is by 3Blue1Brown. Start
here: • But what is a neural n...
This video just shows the (very simple!) basic idea of the proof. For the full proof of the universal approximation theorem, including caveats that didn't make it into this video, see Chapter 4 of my book:
neuralnetworksanddeeplearning....
This video was made as part of a larger project, on media for mathematics: / magic_paper - Science & Technology
Everyone wants to talk about the expressive power of neural networks, but I want to talk about Michael Nielsen's expressive power to make me finally understand expressive power so powerfully and expressively.
Man I would give an arm and leg to know what amazing software he uses in this vid.... bet he coded it himself, the madman.
That's amazing! Please, if it's available, please tell me what it is.
Wow... just, wow! You explained in a little over 6 min what I've spent hours trying to understand while going through different textbooks. Thank you!
Your last video was 3 years ago, and the moment I checked for a new video to be able to update a project file, you uploaded!
Didn't know you had a channel. I started ANNs with books, but your online book with 5 chapters was extremely useful. Thanks for writing that book 👍
One of the most intuitive explanations of the approximation theorem; doing it visually makes it much more accessible.
Can the rectangular blocks be thought of as the blocks in a Riemann sum? The more blocks you add, the better the approximation.
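That Riemann-sum reading is exactly the intuition: each pair of steep sigmoids builds one "tower", and summing towers approximates the target. A quick NumPy sketch of the idea (my own illustration, not code from the video; the target sin(3x), the steepness w, and the tower counts are arbitrary choices):

```python
import numpy as np

def sigmoid(z):
    # tanh form of the logistic sigmoid; avoids overflow for large |z|
    return 0.5 * (np.tanh(z / 2.0) + 1.0)

def tower(x, left, right, height, w=1000.0):
    # Two steep sigmoids form an approximate rectangle ("tower"):
    # step up at `left`, step back down at `right`.
    return height * (sigmoid(w * (x - left)) - sigmoid(w * (x - right)))

def approximate(f, n_towers, x):
    # One tower per subinterval, with height f(midpoint),
    # exactly like a midpoint Riemann sum.
    edges = np.linspace(0.0, 1.0, n_towers + 1)
    out = np.zeros_like(x)
    for a, b in zip(edges[:-1], edges[1:]):
        out += tower(x, a, b, f((a + b) / 2.0))
    return out

f = lambda t: np.sin(3.0 * t)
x = np.linspace(0.0, 1.0, 500)
errors = {n: float(np.max(np.abs(approximate(f, n, x) - f(x)))) for n in (10, 50)}
print(errors)  # more towers, smaller maximum error
```

As with Riemann sums, the error shrinks as the towers narrow, though the sigmoids must steepen in step so each pair still looks like a rectangle.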
Very nice explanation! Can you make a video on why increasing the number of hidden layers is more efficient for approximating a function than increasing the number of neurons in only one hidden layer?
one of the best explanations I've found so far!
I can't express how much I'm impressed by this short amazing video!
Could you please tell us which software you used for these graphs/drawings?
Huge thank you for the clear and to the point explanation.
Thank you for this; This is a beautiful and simple explanation to build the intuition for universal function approximator. Could you please do a follow-on explainer detailing out caveats?
OMG THE BEST EXPLANATION EVER THAT MAKES SENSE :O THANK YOU
Thank you for the great video.
Can I ask what tool you're using to visualize the neural network?
I love it! Awesome explanation. The interactive and intuitive magic paper makes a great difference.
Very nice explanation!
AMAZING EXPLANATION! Thank you tons!
Thank you for this beautiful explanation. I realized that I knew nothing about neural network mathematics.
This is great please do more
Great explanation!
Just exactly what I wanted!!!! Thanks so much Michael :)
Nice explanation 👏👏
Best video on YouTube
Dear Michael Nielsen, nice video!! I am wondering about the app you were using in the video to make the graphics; would you mind telling us its name?
Hey Michael ... thanks for the simple explanation! One more thing ... how can we use your awesome Magic Paper program?! Thanks again
I remember seeing this 6 years ago and loving your explanation. What are you up to lately if you don't mind me asking?
Good stuff..!
Great video, question though: what is going on with the artificial neuron? In all my research I've seen it use a Heaviside step activation, but this looks like it is using a smooth sigmoid activation or something?
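The video's neurons do use a smooth sigmoid; the step behaviour appears in the limit of large weights, since sigmoid(w·x + b) has its transition at x = -b/w and sharpens as w grows. A small sketch of that limit (my own illustration, not from the video):

```python
import numpy as np

def sigmoid(z):
    # tanh form of the logistic sigmoid; avoids overflow for large |z|
    return 0.5 * (np.tanh(z / 2.0) + 1.0)

x = np.linspace(-1.0, 1.0, 201)
step = (x >= 0).astype(float)   # Heaviside step at x = 0
mask = np.abs(x) > 0.05         # ignore a small window around the jump

# As w grows, sigmoid(w*x) matches the step everywhere outside
# an ever-shrinking transition window.
for w in (5.0, 50.0, 500.0):
    gap = float(np.max(np.abs(sigmoid(w * x)[mask] - step[mask])))
    print(f"w = {w:5.0f}  max gap outside the window = {gap:.2e}")
```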
Very, very cool.
great job
Hope my professors could make things as simple to understand as you do!
Intuitive and simple!
Thank you!
Does it work for Recurrent Networks?
Hey there, I just managed to install Chalktalk and I was wondering if you would send me your template? I'm giving a presentation about ANNs soon and I would be really thankful to have an illustration like yours for the introduction. I would of course give you credit for it. Best regards!
(Btw.: my topic is "Radial Basis Activation Functions", so I would make sure to use them instead of the sigmoid type)
Sir, please start classes teaching more about quantum computing... and also share a source of solutions for the problems in your book 😊
What is the software you duplicate neurons in?
Isn't the proper terminology here that a sigmoid can 'collapse' into a unit step, or 'Heaviside', function, which then makes a 'tower'?
What is the software that you were using in the lecture? It seems amazing.
Thank you, this was very clear!
slayed, thank you so much!!
Wow! Can you tell me one thing: why does increasing the number of neurons increase the accuracy of the approximation?
What GUI are you using for the neat squares and circles and such? If the code is available, it could be useful for making ODE compartment models.
Clear explanation. thanks.
I don't know the function that has to be approximated, but I have a data set of input-output pairs, say [x, f(x)]. By training a NN I can find the best weights to approximate the unknown f(x), minimizing the sum of squared errors as much as possible... but then, if I need to use the trained NN to find the output for a new input, what should I do? Does a simple numerical example exist that shows the full process? Thanks for your clarification
The end result of the trained NN can be stored as matrices, a Python pickle file, or an R object. When you want a prediction for a new input, just pass the data through the NN.
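A minimal end-to-end sketch of that process in plain NumPy (everything here, the target f(x) = x², the single hidden layer of 20 sigmoid neurons, and the learning rate, is an arbitrary choice for illustration, not a reference implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# Training pairs [x, f(x)] for an "unknown" f; here f(x) = x**2 on [0, 1].
X = np.linspace(0.0, 1.0, 50).reshape(-1, 1)
Y = X ** 2

# One hidden layer of 20 sigmoid neurons, one linear output neuron.
H = 20
W1 = rng.normal(size=(1, H)); b1 = np.zeros(H)
W2 = rng.normal(size=(H, 1)); b2 = np.zeros(1)

lr = 0.05
for _ in range(20000):                # full-batch gradient descent
    A = sigmoid(X @ W1 + b1)          # hidden activations
    P = A @ W2 + b2                   # network output
    E = P - Y                         # loss = mean(E**2)
    dP = 2.0 * E / len(X)             # backpropagate the MSE gradient
    dW2 = A.T @ dP; db2 = dP.sum(0)
    dZ = (dP @ W2.T) * A * (1.0 - A)  # sigmoid derivative = A*(1-A)
    dW1 = X.T @ dZ; db1 = dZ.sum(0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

# Using the trained net on a *new* input is just one forward pass:
x_new = np.array([[0.35]])
y_hat = sigmoid(x_new @ W1 + b1) @ W2 + b2
print(y_hat.item())  # compare with the true value 0.35**2 = 0.1225
```

The stored "end result" is just W1, b1, W2, b2; prediction never touches the training loop again.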
Can anyone tell me what software he is using?
Great ideas about math and how it could be more dynamic. For pedagogy I agree; for "production" and cooperation I do not (yet). As a CS person I think more fields should use Git to get all the benefits that come with it. I don't think good tools exist yet for cooperating on videos (cooperating on code that generates video seems an even greater mental burden than normal math).
Awesome! What program are you using during this?
It's described here: cognitivemedium.com/magic_paper/
Michael Nielsen thanks!
that magic paper is impressive
@@MichaelNielsen amazing
@@MichaelNielsen Thank you so much for this video, so well explained!
The app blew my mind as well. Is it available for download at all?
I love your voice
What tool is this.... that is amazing
How do you implement this in Simulink?
what is a linear neuron?
What App are you using for visualisation?
What is the software you use to draw called?
What's the software/program used for this? Thank you for the great video.
I have the same little question. The tools used in this video are absolutely going to change online classes.
One of the greatest mathematicians and one of the most gifted teachers of modern times! Huge props!
Very cool demonstration.
But isn't this basically overfitting with N free parameters?
N is defined here: en.wikipedia.org/wiki/Universal_approximation_theorem
what program is that?
That is crazy.. and beautiful... love you
ECE 449
👏👏👏
This is backwards. The UAT is a polynomial theorem, and the NN has been shown to be capable of incorporating that theorem.
This guy is teaching you with closed eyes
Why are his eyes closed?
cOOOOOOOOOOOOOOOOOOOOOOOOL !!!!!!!!!!!!!!!!!!
Does anyone have a link /reference to a better explanation?
You are genius 😊😊
It's not a theorem, it's a model.
The video is like a sleeping pill to me