Neural Networks from Scratch - P.3 The Dot Product
- Published 22 May 2024
- Neural Networks from Scratch book: nnfs.io
NNFSiX Github: github.com/Sentdex/NNfSiX
Playlist for this series: • Neural Networks from S...
Neural Networks IN Scratch (the programming language): • Neural Networks in Scr...
Python 3 basics: pythonprogramming.net/introdu...
Intermediate Python (w/ OOP): pythonprogramming.net/introdu...
Mug link for fellow mug aficionados: amzn.to/3cKEokU
Channel membership: / @sentdex
Discord: / discord
Support the content: pythonprogramming.net/support...
Twitter: / sentdex
Instagram: / sentdex
Facebook: / pythonprogramming.net
Twitch: / sentdex
#nnfs #python #neuralnetworks
When I search how to do machine learning from scratch:
The videos: So you first do import tensor flow
Me: closes video
Me, finding your tutorial series: I like this one
This is my main problem with virtually every ML tutorial on YouTube that is not a basic introduction. They don't explain how it works, they just tell you to import a library.
@@farenhite4329 yep exactly
@@farenhite4329 It's because they themselves don't understand how it works
Yes, so true. Nowadays you will find tons of tutorials about DL and ML, and all of them focus on importing and applying different frameworks and libraries rather than building proper deep-level intuition.
The animations are so helpful!
Glad you like em!
@@sentdex I was thinking the same thing, also they look pretty cool! What tool are you using to make them? I have a friend who is a physics teacher and might be interested in that :)
(edit) Maaaan every time I ask a question i have to remove it because you answer it later in the vid or in the next one, awesome ^_^
@@Kawabolole he is using manim by 3b1b
This comment is so not helpful
@@sentdex How do you do this?
I knew you would upload in 7 days.
I kinda counted days left like a child.
We tried so hard to upload faster. Still trying xD
@@sentdex I think it's good, so we can feel the value
I second this. My only complaint is that these videos are coming out too slowly. Pls help us!
@@sentdex A 4-day break between videos would be good.
@@adjbutler I like the post rate right now. Really lets you mull over the new information, which will be especially helpful when we get to the more complicated parts of nn's. Especially considering Harrison is writing his book alongside making this, it's a great amount of content and builds anticipation for each release (something that lacks when I go back through his old playlists and binge them all at once) :D
The moment you mentioned the "shape" problem you became the MVP of youtube machine learning.
You clearly remember what it was like to be a first time learner and it shows in your communication style, well done.
Please don't abandon this series! YouTube is begging for a tutorial this clear and concise about neural networks! Thank you!
Wouldn't think of it!
@@sentdex pls pls pls finish it
@@sentdex you will make great contributions to society if you continue this series
@@sentdex are you planning to add new episodes? 🙏
@@sentdex you not only thought of it, you made it :'(
Sentdex drops a new vid. “Wife grab the kids I have a work emergency”
I, along with every other viewer following along with this series, want to thank you for making this series. Such a painless way to learn an intricate and exciting topic. The visuals are a great bonus.
My pleasure!
Never been so excited after seeing a YouTube notification! P3 NNFS, BOOM!
Been there! Done that ;)
Anyone who stumbled upon these lectures in 2023?
Entered the black hole of trying to do this in a programming language other than python.
I'm in 2024 actually
Just found it myself
Man, the animations are in my opinion fundamental for the full understanding of the content. Huge thanks to Daniel who's done them.
This is possibly one of the few times I'm glad I took further maths in sixth form, because without going to uni I have covered and understood matrices and vectors in 3D
Dude this is legit one of the most helpful and intuitive coding tutorials I've ever seen! Some tutorials are really hard to watch but yours is very comprehensive thanks for that I appreciate it.
Watching this tutorial at the same time as I go through your PyTorch tutorial. My head is exploding with all the new things.
Hah, good luck sir :D
Never been so excited for a video to come out on YouTube.
So far this is the best video series about neural networks. The great thing about these videos is that each time you do the same task, but with different, more advanced code. For me it's the best way of teaching and I really enjoy watching your videos. Thanks!
This series has been amazing. Also, being a highly visual learner ... the animations really take things to the next level for me! Thank you both!
*The answer is of course to use loops*
Laughs in functional programming
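The loop-based layer computation the video builds toward can be sketched like this (the input, weight, and bias values here are illustrative, not necessarily the exact numbers used in the video):

```python
# One dense layer computed with plain Python loops:
# each neuron's output is sum(input * weight) + bias.
inputs = [1.0, 2.0, 3.0, 2.5]              # example inputs
weights = [[0.2, 0.8, -0.5, 1.0],          # one weight row per neuron
           [0.5, -0.91, 0.26, -0.5],
           [-0.26, -0.27, 0.17, 0.87]]
biases = [2.0, 3.0, 0.5]

layer_outputs = []
for neuron_weights, neuron_bias in zip(weights, biases):
    output = neuron_bias                   # start from the bias
    for x, w in zip(inputs, neuron_weights):
        output += x * w                    # accumulate weighted inputs
    layer_outputs.append(output)

print(layer_outputs)  # [4.8, 1.21, 2.385] up to float rounding
```

The dot product then collapses both loops into a single call, `np.dot(weights, inputs) + biases`, which is the point of this episode.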
I wish game of thrones seasons 7 and especially 8 were as good as this tutorial
The best neural network guides ever! Thank you! I always look at some videos and people are just so oblivious when explaining some 'basic' (as they think) stuff that may seem obvious to them... but if someone is starting from scratch, it is super helpful and saves TONS of time that would otherwise have to be spent researching all of it on the side. Again thanks, you're awesome
The way you highlighted the bias, weight and activation function through animation is just extraordinary, kind of enlightenment. Thank you so much. It was deeply helpful
21:00 - 22:00 is the best minute I've watched on YouTube in a long time
This might just confuse some people, but it helped me: The bias is essentially just another weight, for an imaginary input whose value is always 1.0
well in my mind 1.0 represents something in its full form, where any 0.x number would be something partial. So 1.0 is like saying it's a full neuron. But I have no idea what I'm talking about haha
@@MegaGutemusik The input is always 1 for the bias because 1 is the neutral element of multiplication. This means that you can put the bias into the weights array (usually at index 0) and therefore learn the bias alongside the weights.
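That trick is easy to verify; here is a small sketch (the numeric values are made up) showing that folding the bias in as w0, paired with a constant input of 1.0, gives the same result as keeping it separate:

```python
import numpy as np

inputs = np.array([1.0, 2.0, 4.0])
weights = np.array([0.5, -0.25, 0.75])
bias = 2.0

# Usual form: weighted sum plus a separate bias term.
y_separate = np.dot(weights, inputs) + bias

# Folded form: the bias becomes w0, multiplied by an imaginary input of 1.0.
y_folded = np.dot(np.concatenate(([bias], weights)),
                  np.concatenate(([1.0], inputs)))

print(y_separate, y_folded)  # 5.0 5.0
```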
The animation at the end is most helpful
Incredibly clear, amazing teacher, when you can simplify to that level means you have true mastery of your material, thank you!
I haven't had this much joy in a while, learning something with this much CLARITY on YouTube! Thank you so much for all the effort you guys have put into this, it's awesome!
Almost 1000 views in 30 minutes, shows how much we love this🔥❤️
this has been really helpful so far but i really need to wait until the series is done because i forget everything in-between episodes
These videos are incredible. Working through Andrew Ng's older intro to ML Course, but in Python not Octave. Not much background in linear algebra, but stronger in Python. Building from the ground up -- learning math by coding -- this is the best way to learn.
The way you break things down is super useful. I can't retain info if I have too many questions about it, my brain just locks up, so all the high-level explanation videos of neural networks just got me excited but didn't teach me at all. You're clear enough that you could just call this series
The Understanding the "Understanding Neural Networks" Videos Series!
How is anyone down voting these? These are fantastic and animations are so incredibly helpful.
This is purely my understanding of weights and biases....
For Example,
y = x1 * w1 + x2 * w2 + b
x1, x2 => Inputs
w1, w2 => Weights
b => Biases
w1 => Denotes the contribution of x1 to the output
w2 => Denotes the contribution of x2 to the output
b => Acts as an offset...
This is just a linear equation;
when activation functions are applied => non-linearity is introduced.
Why do we apply activation functions?
Not all data can be explained by a linear equation, so we apply activation functions to make the model non-linear.
Dumb question, what does non-linearity mean?
@@taran7954 something that can't be fit by a straight line
@@taran7954 non (not) linear (line) means it's not a line
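The equation from the comment above, written out as runnable Python (the numeric values are illustrative):

```python
# y = x1*w1 + x2*w2 + b: a single neuron before any activation.
x1, x2 = 1.0, 2.0     # inputs
w1, w2 = 0.5, 0.25    # weights: contribution of each input
b = 2.0               # bias: shifts the output up or down

y = x1 * w1 + x2 * w2 + b
print(y)  # 3.0 -- still a linear function of the inputs

# Applying an activation such as ReLU introduces the non-linearity:
print(max(0.0, y))   # 3.0
print(max(0.0, -y))  # 0.0 -- negative pre-activations are clipped
```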
I'm not normally a fan of animations, but yours are clean and not too colorful, it makes it very helpful to digest the concepts that you're explaining.
Those animations are freaking awesome and are truly helping me understand what's happening in the code.
Man I just want to watch all the videos since i have so much free time during quarantine. Might have to go to the book
Wish we could make these videos faster, doing our best :D... but yes, book should keep you busy for a while!
These animations are awesome, I've been thinking that the whole time.
Thank you
@ they are very helpful to visualize the concept. Keep it up. Kudos to both of you.
@ what kind of software do you use for animation?
I seriously cannot thank you enough. I'm a java programmer but you have made this so simple to understand I'm able to implement it and I'm not getting lost. Hats off to you good sir... thank you for these videos!! You are truly doing a great Public service!
Love the teaching approach with animations. I have been learning the same subject from different sources and your approach just made some concepts I was struggling with a lot clearer.
Will you implement some sort of autograd later on in the series? Loving the videos btw
I think it's worth mentioning how the calculus behind the dot product works. If you have two matrices (with shapes n*p and p*m), then the resulting matrix will have the shape n*m after the dot product (see how p = p): the number of "columns" of the first matrix has to be equal to the number of "rows" of the second matrix.
Exactly what I was going to say
except that's not really calc lmfaoo but elementary linear algebra
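The shape rule in the comment above is easy to verify in NumPy (the specific shapes here are arbitrary examples):

```python
import numpy as np

# (n, p) dot (p, m) -> (n, m): inner dimensions must match.
a = np.random.rand(3, 4)  # n=3, p=4
b = np.random.rand(4, 2)  # p=4, m=2
ab = np.dot(a, b)
print(ab.shape)  # (3, 2)

# Mismatched inner dimensions raise a ValueError:
c = np.random.rand(5, 2)
try:
    np.dot(a, c)              # 4 != 5
except ValueError as err:
    print("shape mismatch:", err)
```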
Finally somebody is starting to explain how NNs work so anybody can understand and start building their own AI. Thank you very much! You are a good teacher!
Never have I wanted the next episode more in a series. Thank you for these videos.
video_value = True
while video_value:
    print("Finally! I bought the book also!")
I would very much love the physical book, sadly my money situation is pretty much nothing at the moment, so I can't buy it...but even if I finish this series before buying the book, I still plan to purchase it at some point.
Hope your situation improves!
@@sentdex please share the link to the book
The best videos on neural networks on YouTube. Simple explanation and super easy to grasp. 'Enlightenment' is the word after watching this. Thank you so very much :)
Super excited about these videos. You're excellent at explaining things, and I'm happy to preorder the book to support you creating these tutorials! Keep at it, we're all learning leaps and bounds because of you.
Thank you for the support!
I've never heard that a "list of lists" is a "lol" before, lol!
The moment I have been waiting for: "Watching sentdex's latest video" .
Great work man! Seriously, became a huge fan of your work. Your way of making things understandable is the most admirable... ev'rything just gets clearer if one has an understanding of basic mathematics... and if not, that's what you're there for... your videos really are a treat👏.
You deserve a medal for the best complex-topic-synthesizer. You don't even have to be a high school graduate to grasp the content in these initial 3 videos so far. Kudos bro
I'm a bit confused -- here, weights is a 3x4 matrix, and inputs is a 1x4 matrix. Strictly speaking, wouldn't the dot product only work if inputs is an n-by-m matrix (4x1 in this case), where m is the number of samples, as opposed to what's shown here? Looks like NumPy is smart enough to perform the dot product to a rank 1 vector even when the shape mismatches.
Your confusion starts @ shape. First, the input shape is not 1x4, it's of shape (4,). Also, it's not a matrix. I think you might want to watch that shape section again.
Also, you can always confirm shapes in numpy. You might want to tinker about until you feel solid at knowing something's shape.
For example:
>>> import numpy as np
>>> x = np.array([1,2,3,4])
>>> x
array([1, 2, 3, 4])
>>> x.shape
(4,)
>>> y = np.array([[1,2,3,4],[5,6,7,8]])
>>> y.shape
(2, 4)
>>>
@@sentdex Your explanation makes perfect sense!! The example really helps clarify. Thanks!
I have the same question
Weights is (3, 4) and inputs is (4,), so the product becomes (3,). Hope it helps!
Note that in NumPy a 1-D array is neither a row vector of shape (1, 4) nor a column vector of shape (4, 1); its shape is just (4,), and np.dot treats it like a column vector when it is the second argument, which is why the product works.
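A quick way to settle this thread: a NumPy 1-D array is neither a (1, 4) row nor a (4, 1) column, and np.dot handles it without any reshaping. A sketch, with random weights standing in for the video's values:

```python
import numpy as np

weights = np.random.rand(3, 4)            # shape (3, 4)
inputs = np.array([1.0, 2.0, 3.0, 2.5])   # shape (4,), a 1-D array

out = np.dot(weights, inputs)
print(out.shape)  # (3,) -- the 1-D array behaves like a column here

# An explicit 2-D column vector gives a 2-D result instead:
col = inputs.reshape(4, 1)                # shape (4, 1)
out2 = np.dot(weights, col)
print(out2.shape)  # (3, 1)
```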
smh, there is no spot for assembly in the github
Sounds like you need to make the assembly version! I'll wait for your PR :)
@@sentdex Lol, I would if I knew enough assembly. I'll stick to contributing in Kotlin for now. P.S.: Jeez, you're fast at merging pull requests, I assumed it would take like a day because you would be busy.
And just like that someone made a PR for assembly
what you're making here will be the definitive ML from scratch guide in a few years, calling it now
I do not know why anyone would dislike this very nice video.
The animation is very nice and makes the explanation clearer.
Thank you so much
To whoever wrote it in assembly on GitHub: why, why you do dis to yourself?
Xdxdxd ......
you are called "semtex" for me and i refuse to properly read your name ever again
Sentdex, you are awesome. I love these tutorials and I'm saving money to buy your book. I never thought that I could learn AI by myself, but now I'm in love with this part of the programming world.
Recently subscribed to 3b1b and you. Glad to see you appreciating each other's work; it feels great to learn from amazing teachers like you.
I am so hyped for the series. Indeed the animations are very helpful for understanding the concepts. Looking forward to watching the sexy part!
One of the most direct explanations I've ever seen... before this, the whole time I was thinking this may be too esoteric of a predictive tool for me to learn well, wow
Awesome work on the animations, Daniel. Harrison, you've got heart, man. The way you explain, taking so much time to ensure that every little thing is conveyed, is simply amazing. Respect and a ton of thanks with all my heart.
Love your videos. I find them really helpful and love the way you've divided each part into digestible pieces. Really enjoying them and will definitely share them, just as my son-in-law shared this one with me.
I am taking a machine learning class and my instructor doesn't explain it the way you do; this course is what I want.
Thank you for strengthening my weakness, I'll buy your e-book next week!!
You and Daniel are the best, you are just putting the learning data straight into my head, omg
By far one of the best videos that I have seen in ML.
THANK YOU VERY MUCH.
You are helping a ton of people (like me) who can write working code but don't quite understand the granular workings. I will gladly buy your book to glean some more details. Thank you for sharing this, amigo.🙂
These videos are so clear and easy to understand it's crazy. My CS professors should aspire to be this good at explaining things.
I really appreciate you explaining the math behind this. I like to deeply understand what I do and why it works. Thank you for the video :D
This series is lovely!! It feels so good to actually understand the basic mathematics along with some practical programming. There are so many other resources that either focus completely on the mathematics part (which after a certain point starts to feel like jargon) or just focus on using libraries like PyTorch (which begins to feel like copying and pasting after a certain point). Thanks for doing this, dude!
Glad to hear you like the style!
These tutorials actually help me understand the need for the math, because I hated it in high school. But now that I can see how it is applied, I understand how and why to use it, which makes it so much more interesting to learn. Thanks a lot for these videos!
Coming from an engineering background myself, I found it very impressive how you use the straight-line equation and visualisation to explain the meaning of BIAS and WEIGHT. Thank you, legend sentdex!
Many thanks for the videos. You are the best tutor that I have come across for deep learning. The animations help us understand the concept even better.
Looking forward to your upcoming videos.
Have been visiting your channel every day since Part 2, was not disappointed today haha
Thanks sentdex!
Glad you enjoy it!
Thanks so much for these series! I had so much trouble jumping into neural networks without understanding everything happening "under the hood" so to speak. I just always felt like I was just assembling one of those pre-designed lego sets without understanding the thought behind it.
Excellent. Crystal clear explanation of Neural networks. Thanks
LOVE THIS, it really taught me about Python and ML. I didn't know a lot about inputs and outputs until I watched these videos, thx bro
Dude this series just makes me so happy. Thanks.
Animations are super helpful for understanding the concept!! Waiting for the next part!! 🔥
It took me 10 minutes of googling to figure out how to deal with pip, but I got it in the end. Once again these videos are incredibly inspiring.
Dear Sentdex,
You are outstanding. You have made everything clear to us now. Thanks so so so so so much 💯👍👍👍👍👏👏👏👏
i can't praise you enough, your visuals are great
You would see clearly why dimensions are a problem if you revisited some linear algebra, but I just dig how you are able to explain things as simply as possible. Thanks man, you're great.
The animation really helps with learning the concepts!
Dude! Thank you so much for these videos. I have had such a hard time self learning machine learning and neural networks. I learn by seeing it applied. Thanks!
The animations really clarify the concepts, especially weights and biases! Thank you.
I want to get started with neural networks in general and this series is really helpful in explaining, the other tutorials are much harder. You seem to explain it very well, making sure we (the viewers) understand the base stuff before getting onto the actual making of the real network
Glad you feel that way so far! Hope that continues!
Best tutorials ever! Step by step explained and it is so clear! Thank you!
The animations are so helpful! It's really nice, Daniel
Thank you 3Blue1Brown and Daniel, and especially you, sentdex. Really appreciate the work you are putting in here!!!
Finally, I'm starting to understand what neural networks are!!! Thank you for the videos
Brilliant videos! The level of instruction here is fabulous. I was going to compliment the animations but you already covered it at 23:58. Thanks for posting!!
You are doing such an amazing job. I really hope that I will have a good understanding of how neural networks actually work at the end of this series. I don't want to stupidly copy code from other people and hope that it works for my data.
I already know NNs since I work with them every day, but it's awesome to see different approaches. Loving the animations, loving the series. Keep it coming, man👌
Glad you like them!
Your animations and explanations are the best, man! Thanks a lot for such content
This is the best series I've seen on this topic, thank you
One of the most beautiful study lectures I've ever seen. Great work. Thanks to Daniel too, and 3Blue1Brown.
OH MY GOD I LOVE YOU! The explanation you gave for them 3D arrays was the AHAAAA!! moment for me.
Thank you so much for getting into the details. I try to learn this from other sites and they make big assumptions on what I already know.
Amazing video as always!!! Great Job!! Yes, animation makes it much more clear to understand the flow of vectors.
The animations are really helpful!
And this is some great educational stuff. Really appreciate it!
Was eagerly waiting for this video.
I checked my YouTube so many times over the past 2 days.
Thanks for these videos. I am in the process of training a CNN for image recognition, and understanding what is behind a lot of the activation functions and what's in the nuts and bolts of the network is very useful. Looking forward to the next few videos and the final version of the book... hopefully before I have to submit my project. This is giving me the understanding behind just following other people's models.