Hey everyone, at 20:17 it should be -= costGradientW (the minus sign is missing). I somehow managed to delete it while formatting the code for the recording! Thanks for letting me know in the comments.
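For anyone following along, here's a rough sketch of what the corrected update step looks like. This is Python rather than the video's C#, and the names (`weights`, `cost_gradient_w`, `learn_rate`) just mirror the video's naming, they're not the actual code:

```python
# Gradient descent update: step AGAINST the gradient to reduce the cost.
# The minus sign mentioned in the comment above is the crucial part.

def apply_gradients(weights, cost_gradient_w, learn_rate):
    # w -= learnRate * costGradientW  (the corrected line from the video)
    return [w - learn_rate * g for w, g in zip(weights, cost_gradient_w)]

# Tiny demo: minimise cost(w) = (w - 3)^2, whose gradient is 2*(w - 3).
w = [0.0]
for _ in range(100):
    grad = [2 * (w[0] - 3)]
    w = apply_gradients(w, grad, learn_rate=0.1)
# w[0] ends up very close to 3, the minimum of the cost function
```

With `+=` instead of `-=`, the same loop would walk uphill and diverge, which is why the missing sign matters so much.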
this video is an even better explanation of NNs than the video from 3b1b, and i thought his video was already perfect. thanks for creating this video, because now i'm able to understand it much better.
Completely agreed! He actually walked through EVERY STEP (that I can think of right now)! Which is great! One long video is soooo much better than scouring through dozens of bad ones (from experience)!
I am a software engineer and I've always wanted to learn machine learning. Being able to code is not a problem for me, but being terrible at maths and statistics makes it hard to get accustomed to all the terms and concepts. I've tried multiple courses from different platforms and instructors, and all of them try to teach you "what" to do instead of "why" to do it. I personally find learning more intuitive when I know why I am doing something instead of blindly following steps. This video is exactly the type of introduction to ML that I've always wished for. The explanations are highly intuitive and, most importantly, visual; there are no assumptions and no brushing over concepts. Nothing is done just for the sake of it, and everything is explained in simple language. I admit I'll still have to watch this video a couple more times to make full sense of it, because it's jam-packed with so much information. You'll probably miss this comment in the sea of other comments (although I hope not), but I genuinely want to thank you. This video has relit my once-dead interest in ML. I would love to see more videos from you on this topic, or at least get some recommendations on where I can learn more.
Exactly what I wanted to say. This is the first time I actually got to learn about the math behind machine learning, rather than just working with some ML library, and got to know the true essence behind it. Great videos as always
The story of not liking doing something just for the sake of doing that very something, without a thorough explanation, is literally my school story with trigonometry and derivatives :D I couldn't understand it back then, but no one cared to explain; they just told me to focus on the process itself.
Exact same situation here. Coding is fine. Maths... not so much. Certainly a bit of a lack of talent, but school wasn't too helpful with the usual attempt at explaining the what without the why. @Sebastian, your videos are by far the best source for many of the topics I'm engaging with.
He's not underrated, his content is stellar and known in the community for quite some time. He has almost a million subscribers, and everybody knowledgeable in Unity forums knows about his channel. Besides, in my opinion, after years of seeing all kinds of videos on game dev, Sebastian's are still one of the best, even old ones. And I'm not talking about just the style of presentation, I mean this strictly technically, his advice and his code are next level. Also I'm not young (as a human or as a game developer, the age is similar), and if I can learn a thing or two from him, and occasionally I do, that's instant five stars from me. Maybe he could become even more popular (though popularity comes with its own pressures and costs), but he's definitely not underrated.
@@milanstevic8424 I am very sorry for abusing underrated in this manner. It's just that, in my personal opinion (which is often wrong), concepts in his videos are often explained so thoroughly, yet he makes it so easy to understand, that I don't see why more people don't love his content. And I know that CS isn't for everyone, but *just* (I'm using just very loosely here; this is no small accomplishment) 1 million subscribers for videos unrivalled in quality is, as I said earlier, in my opinion, underrated. Sorry, and I hope you understand; I might still be wrong. Edit: to avoid confusion, I have removed underrated from the original comment
@@rienkthegamer5422 Maybe I'm just picky with the words. This channel is definitely not as popular as, for example, Brackeys (although that channel has been officially dead for a year or so). But to say it's underrated is something else. Maybe you meant underappreciated by the general population / not recommended enough by the UA-cam algorithm, and I can agree to an extent. But let's ponder instead whether such front-page popularity is a good thing. Those who do get to see his content would definitely recommend it, and in that sense it is not underrated. His channel is constantly growing, regardless of the algorithm. I've seen people recommend Seb's videos to people who cannot grasp even the most fundamental concepts, so that's why I've said that popularity isn't always the best thing. His videos and experiments are very smart and sometimes beautiful, but you still want the audience to understand the beauty and the effort behind it. If you cultivate the wrong culture here (for example, children screaming that they get errors for the most mundane things, and nobody understanding a word of what he says), his videos and his enthusiasm would certainly be affected by this. So, with all that in mind, maybe "underrated" is a good thing? I definitely would not call 1M a bad subscribership. That's gold on UA-cam. Btw, now that you've changed your message, I agree 100%. I'm probably just picky with the words.
The production quality is off the charts. As a software developer I can't even imagine the amount of hard work, research, technical knowledge, expertise, patience and determination this must have required. Hats off to you :)
everything else aside, can we all appreciate the incredible visuals all of Sebastian's videos have? the animated graphs, the visualizations, the explanations, it's so pleasing :)
Schools make this seem like such a complex topic, but that's simply because they don't teach it well. I was able to learn it in eighth grade while I was struggling with 10th-grade math. Schools are bad at teaching, but don't let that hold you back from accomplishing what you want.
You're probably not gonna see this message, but I want you to know you give me inspiration and motivation to learn, do and achieve more, as I believe you do for many others
my life would be very different if I hadn't come across his coding adventure playlist. I got into game dev and computer graphics because of Sebastian, and I fall in love every day with the beauty of things in this realm
25:18 OMG, and at this point I've completely realized the true nature of the derivative - why it becomes a slope function, why x^2 turns into 2x and so on. It was one of those mind-blowing moments of insight, which most of us have experienced at least once in our life. Thank you, Sebastian!
I haven't even learned calculus and this was my first introduction to it. To my surprise, I understood it perfectly and I'm probably gonna start self studying calculus before I study it in school.
I had the same when he was just going about adding and multiplying many values together, when it struck me: addition (and thus multiplication) is computationally A LOT less expensive than division, so adding/multiplying tens of numbers can be as fast, if not faster, than a division. That's probably the reason behind all the seemingly ridiculous math - to speed everything up by doing what computers are best at: addition, so it can be optimized (in terms of algorithms) further without major performance hits.
It’s such a bittersweet moment watching your videos as they’ve recently come out because I know such great content with such level of detail takes so long to produce and it’s going to be a long and sad time until your next video comes out. I just love your videos man, everything you touch becomes gold, you make so many topics that are so boringly taught at uni seem sooooo exciting!
I cannot believe how lucky I am (how ALL of us are) that you sir exist. This quality, effort and precision put into these videos... Just wow... Thank you!
Sebastian Lague, your videos might go down in history as the most well-produced educational videos on UA-cam of all time. And it’s hard to even say that something beats 3B1B’s videos.
@@jaideepshekhar4621 he creates videos that almost no one understands - even people at PhD level struggle with his explanations - so it can't just be about the production. His videos are very, very high-level physics.
I love every single video that you make, you make such amazing stuff! I wish you could bring back the "How computers work" series which made me discover my passion ★
I also really liked the how computers work series. It is definitely one of the fundamentals of computing I knew the least about, along with many other people, I assume.
In the meanwhile, if you haven't already, check out Ben Eater's videos! He has some great explanations, from transistor-level logic gates to how these can be used to create all the building blocks of a basic processor and how it handles assembly code
Having worked extensively with neural networks some 10 years ago, I must say this is hands-down the best explanation I have seen for people who are new to it. Excellent visualisations and explanations. It is so great to start with the absolute basics (a simple perceptron) and work up, instead of going directly to PyTorch or TensorFlow.
I'm in awe of how good this content is. Production values are superb: voice is really easy to listen to with clear diction and pleasing accent. Graphics clear, smooth, and helpful. Content is out of this world. The explanations are great. I've not seen someone cover the whole thing virtually from scratch and yet at no stage does it feel like we're getting bogged down.
Ok, the way you explained derivatives at 25:00 makes soooo much more sense. It perfectly shows why the first derivative of x^2+ax+b is 2x+a. I'm now so angry this wasn't explained to me properly when I was in university. I also believe you are secretly a maths professor.
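It's easy to check this intuition numerically, too. A tiny sketch (Python, with arbitrary example values for a and b) showing the slope of x² + ax + b approaching 2x + a as the step h shrinks:

```python
# The derivative as a limit: the slope between two nearby points on the curve
# approaches the analytic derivative 2x + a as the gap h goes to zero.

def f(x, a=5.0, b=2.0):
    return x * x + a * x + b

def slope(x, h, a=5.0, b=2.0):
    # Slope of the line through (x, f(x)) and (x + h, f(x + h))
    return (f(x + h, a, b) - f(x, a, b)) / h

x = 3.0
approx = slope(x, h=1e-6)   # nearly the true slope
exact = 2 * x + 5.0         # the analytic derivative 2x + a, with a = 5
```

Expanding the slope formula by hand gives 2x + a + h, so the leftover error is exactly h, which is why shrinking h recovers the clean 2x + a result from the video.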
You can only keep a handful of new things in your brain at a time (around four, by common estimates of working memory). Perhaps your lecturer explained tangents, secants, slopes, limits, the limit definition of the derivative, specific derivative calculations, dx notation, prime notation, etc. in too few lessons. Now that you are familiar with these things, it is much easier to follow.
This is probably the best (99.5% confident, 0.02% confident that it is second best) video/tutorial on neural networks I have seen. Your intuitive explanations coupled with showing the naive to optimal approaches is very effective. Definitely showing this at my school's computer science club in September!
Easily the best vid on neural networks. I have been looking for a good explanation of how neural networks work, but none of them explained it in such an understandable way. THANK YOU SO MUCH
I couldn't use any word besides perfect to describe this video. The visuals, explanations and overall progression were exceptional and on point; every concept contributed to the next one, and nothing was rushed or under/over-exposed. I wish more people had the opportunity to find your amazing channel, because these videos are truly special.
This is the most comprehensive neural network video on UA-cam. I have been tinkering with toy networks just for fun, and I am interested mostly in whys and hows of these systems. In this single video you gave me all the answers I have been looking for. Thank you very much.
You're such a great educator/communicator. You explained gradient descent in a way that makes simple intuitive sense to me in about 2 minutes, whereas it's something I've only understood abstractly in over 15 years of trying to understand this stuff!
Just found this video again. This is the video my team and I watched before going to our first coding competition. We have come so far. This is where we started, and now we're programming networks such as YOLO, RCNN, … damn. thank you 🙏
I've always wanted a great demonstration of programming in a similar way someone shows progress of machining something on a lathe/mill/forge/woodworking. Most other videos skip over "all the boring bits" when they get to some coding, but you have absolutely nailed it ❤
Might you be talking about "Stuff Made Here"? It's my only complaint about that channel. I mean, yeah, I'm a programmer so maybe I'm more interested in the programming bits than the average person, and I can also understand it can be difficult to coherently show code in an edited video. But a lot of his projects are something like 50% or more code, and he's a wicked smart programmer, so I wish he included more of it. On the extreme end of "demonstrating programming," Handmade Hero is a (very) long running series of originally-streamed videos where a professional systems game dev builds a modern, professional game/engine from scratch in C. (Really it's C++ but it's a very C-like subset of features.) Even if you don't watch the whole thing, the first ~20-30 videos are enormously enlightening w/r/t the fundamentals of lower level application programming. (e.g., I now know how you can use the Win32 API to open windows, handle keyboard/controller input, play sound, and so on.) Near the end of that first streak of Windows-specific videos he shows off some truly remarkable techniques for custom hot reloading and input recording/playback.
@@tz4601 661 videos, each nearly an hour long, starting from 7 years ago. Just respect. I had the intention to build a game engine in C, but then I realized it is much harder to write in pure C, and I didn't want to learn C++. I also realized that I was just motivated by all the youtube videos where people code their games/engines from scratch, and I had little idea how one works. So I dropped the engine idea for now (planning it for later) and started to build a terminal text file editor in C. Even that simple project has gotten me so far (far in my own terms) that I have become comfortable with libraries, compilers, linking them, how to use them, searching through old documentation, etc. I don't know where I am going with explaining all this lol. As for the recommendation you gave, thanks. I probably won't watch it because it is hella long, and I like to learn hands-on, with trial and error, searching for problems on my own. But if I ever start to code an engine and have a problem that I can't solve no matter what, this will probably be my go-to source.
This video is so beautiful. I'm working on a college project with Neural Networks, and you, sir, have given me such deep insight into how one must approach the same. Thank you and may God bless you
You have never made a bad video. My favourite video in the notification box updates every time you upload, from your last video to this one. Anyway, point being, you make my favourite videos on this platform.
This Channel has been like Bob Ross for me. I may have very little idea how to code like him, but I love watching how excited he gets over his projects.
Great video! I'm a senior college student studying computer science at my local university. I've taken a few machine learning classes, and this kind of stuff really fascinates me. I really enjoyed the explanation about derivatives and how they fit into the design of a neural network. There's a couple of things you could possibly do to further improve the performance on the datasets. One thing you could do is known as cross-validation. The idea with cross-validation is that you take the training data and further split it into some number of "folds", train on every fold except one, validate on the held-out fold, and repeat this for each choice of held-out fold, averaging the scores. This in turn makes the predictions much more robust and helps prevent overfitting. Another thing you could do is known as grid search, where you test the various parameters and functions (known as hyperparameters in ML speak), training the model for each combination of parameters and picking the best-performing model. This is very brute force (and as far as I know there is no easy way to optimize it), so bear that in mind. You can also combine these two techniques, doing cross-validation on each version of the model in the grid search. This can be very computationally taxing (especially for a neural network), but in some cases it improves the test scores. One last thing you can do is apply a scaling function to the inputs before you train the model. This is because neural networks tend to perform better when the input values are close to each other. (Don't ask me how, because frankly I don't know.) For example, with the handwritten digits, all of the pixel values are between 0 and 255, so you could simply divide all the pixel values by 255 to get them between 0 and 1. This will be a bit more complicated for color images, though. I haven't had too much experience working with neural networks, so I'm not completely sure these techniques will work for you.
I did however do a final project one semester where we trained a neural network to recognize the handwritten digits you showcased. We did a similar thing to what I described above and got some good results; our best test score ended up being 98.2%. But honestly, the neural network you designed is doing really well already!
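To make the cross-validation and scaling suggestions above concrete, here's a rough pure-Python sketch of the fold-splitting idea. The function names are mine; in practice libraries like scikit-learn (`KFold`, `GridSearchCV`) handle this for you:

```python
# k-fold splitting, as index lists: each round holds out one fold for
# validation and trains on the rest.
def k_fold_indices(n_samples, k):
    fold_size = n_samples // k
    folds = [list(range(i * fold_size, (i + 1) * fold_size)) for i in range(k)]
    splits = []
    for i in range(k):
        val = folds[i]                                   # held-out fold
        train = [idx for j, f in enumerate(folds) if j != i for idx in f]
        splits.append((train, val))
    return splits

# The input scaling mentioned above: map 0..255 pixel values into 0..1
def scale_pixels(pixels):
    return [p / 255.0 for p in pixels]

splits = k_fold_indices(10, k=5)   # 5 train/validation splits of 10 samples
```

Each sample appears in exactly one validation fold across the k rounds, which is what makes the averaged validation score a fairer estimate than a single held-out set.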
This is actually an amazing explanation of backpropagation, good job! I would recommend that anyone who wants to learn backpropagation just take a piece of paper and a pencil and derive those formulas themselves. Trust me, it helped me immensely when I tried to learn backpropagation a few months ago
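If you do the pencil-and-paper derivation, you can also check your answer numerically. A small sketch for a single sigmoid neuron with squared-error cost (all names are illustrative, not from the video): the chain-rule gradient should match a finite-difference estimate.

```python
import math

# One sigmoid neuron: a = sigmoid(w*x + b), cost C = (a - target)^2.
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def cost(w, b, x, target):
    a = sigmoid(w * x + b)
    return (a - target) ** 2

def analytic_grad_w(w, b, x, target):
    # Chain rule: dC/dw = dC/da * da/dz * dz/dw
    #           = 2*(a - target) * a*(1 - a) * x
    z = w * x + b
    a = sigmoid(z)
    return 2 * (a - target) * a * (1 - a) * x

w, b, x, target = 0.7, -0.3, 1.5, 1.0
h = 1e-6
numeric = (cost(w + h, b, x, target) - cost(w - h, b, x, target)) / (2 * h)
analytic = analytic_grad_w(w, b, x, target)
# numeric and analytic agree to many decimal places
```

This "gradient check" trick is a great way to catch sign errors or dropped terms in a hand-derived backprop formula.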
This is by far the best explanation I have ever seen of Neural networks, not because it is particularly precise, but because it does a fantastic job at making the concepts of nodes, weights, biases, activation functions, gradient descent seem very intuitive, which is rare for such a theoretical subject. Outstanding job!
I never understood maths until I started attempting basic code. These videos are the first ones that actually make sense because I can properly visualize the maths.
Dude just explained multivariable calculus + machine learning and genetic algorithms very comprehensively, extremely well written script. Keep up the great work
I am astonished by the cavalier way that Sebastian charges through Calculus and Differential Equations and somehow manages to make it clear and understandable. It's very impressive to see someone take topics as deep and difficult as these and be able to extract exactly the amount of information needed to illustrate the point clearly without getting bogged down. Pretty cool to watch for that reason alone.
Thanks! I have my own library, although that’s a generous term to be honest, it’s more like a loose collection of scripts scattered across several projects (which I always have to hunt down and repurpose for whatever I’m working on). Really need to improve that some day! I have recently been experimenting with Freya’s (brilliant) Shapes plugin for rendering lines and points though, so some of the graphs and a few other things in this video are using that.
@@SebastianLague Been a huge fan for a long time! Would you be able to share more about your library? I've used manim a lot but I'd be very interested to see how you create your animations, even if you only show a small bit of it.
It's awesome how, after even having classes in college about neural networks, I finally understood how they work *in practice*. I studied the theory and saw a lot of "for dummies" explanations about NNs, but people usually abstract the actual code from their explanations and this used to frustrate me a lot. Thank you so much for the explanations, Sebastian; your content is gold.
I've been working on a basic neural network just because I wanted to learn this. I already knew the basics, but it's so cool to see you working on these. This really helped cement the knowledge I'd learned.
This is far and away the best explanation of NN I've ever seen, and I took college courses on NN's. The included explanation of the calculus behind it was amazing as well
Any reasonable person: “I will use one of the many existing Python libraries that implement backpropagation for me in an efficient and easy-to-use way” Sebastian: “I like writing C#” Jokes aside, very cool and impressive project as always. And of course presented in a stunning and intuitive fashion, keep ‘em coming!
"I will use one of the many existing Python libraries that implement backpropagation for me in an efficient and easy-to-use way" I hate that it's like 99% of online courses
@@notahotshot That's great until you need to heavily customize a wagon for a new kind of task, but those old wheels somehow don't work well with it and you have no idea why, because you never learned how these wheels even work. This is a very common problem in programming in recent years. There are so many programmers now who never learned the basics (e.g. they often don't even consider that they operate on physical hardware with actual memory); new apps with the same UI and functionality we were using 20+ years ago start lagging horribly on a computer 10,000x faster than what we had back then. Oh, and you'd better have a few GB of memory for those nice fonts...
I've spent many hours learning about ML algorithms, with a lot of that time spent on MLPs. I've worked with PyTorch and TensorFlow before. I thought I understood everything pretty thoroughly, but just that first example of manually tweaking weights and biases and seeing how those affect the output graph showed me something new. Your videos are incredible and inspire a whole generation of programmers.
This is gold. I've seen many videos in my quest to fully understand this stuff, and I have never seen such an intuitive video on this topic, ever. And in just 54 minutes, you clearly walked us through weights and biases, the purpose of non-linearity, gradient descent, and calculus intuition! Yeah, I've just subscribed, keep being a legend.
I've had trouble learning derivatives for 5 years; I dropped out of university because no one would explain to me why we use them, just how. Your channel is a blessing
That's a really nice way of introducing the neural networks. Explaining the different parameters and fiddling with them by hand + the visualization, before starting to explain how we can make our computer do the fiddling. Very cool idea!
As someone who watched a lot of these videos when writing my Bachelor's thesis on NNUEs (a specific kind of neural network for chess), I can safely say this is the best introduction to neural networks I've seen. Absolutely love all the visualization, and how you start from the ground up but still include the calculus etc. Fun fact: my thesis was somewhat inspired by your Chess Engine video as well. I love your content, becoming a patron now!
I wanted to write my bachelor thesis on chess algorithms, but my professor told me it was too complicated… would you mind sharing your work? I would love to check it out!!
THANK YOU, and I'm impressed. I'm an old-school, career "Systems Analyst", AKA "Software Engineer". My coding skills are extensive. BUT THIS - I've been working on this on my own (and considering going back to college for refresher courses targeting this subject matter). I just needed to TELL YOU: this video was a big help for the holes in my research, especially the explanations of "how" and "when" to adjust "what" weight. THE BEST PARTS: (1) Nowadays the software I work on is written in C++ (coming from the Assembly, COBOL, Modula-2, etc. days). I read and understand code better than I read and understand explanations in books 😂 LOL! Thank you for the explanations in C# code. (2) Your demonstration of the less efficient way explains more, prior to your demonstration of the more efficient one (e.g. for the cost). THANK you for including things like that in this video.
I know you're not gonna see this message, but I hope you know that what you do is amazing. I am still in high school and haven't learned calculus yet, and even so, your explanation of calculus just made so much sense to me. Every time I tried to learn calculus, I could not wrap my head around what derivatives were or what they were meant to do, but I understood it in a few minutes just from your example! Thank you, Sebastian.
One of the best tutorials I have come across for ANNs in the 2-3 years I've been working on them. Very intuitive, bravo. You must really know your stuff.
this video was so entertaining. i actually come from Syria and my english is pretty bad, but this video was so well explained i didn't even have to rewind it. thank you very much for these videos, and i hope you keep making such amazing and educative videos. PLEASE keep making these; i never get bored of this even though i don't use it for anything practical.
Sebastian :"Just open your favorite code editor, type in a few lines and there is your multiverse simulator with special effects" Always a treat to watch your "magic" man :)
From just the first couple of minutes of your video, I was able to code my own working classifier. I love how you build up everything from first principles and also show your first "naive" implementations before moving on to the more optimized versions. It really makes all the moving parts easy to understand. Excellent work!
The caliber, quality and attention to detail in your videos is outstanding. I gained a better, and more fundamental, understanding of neural networks from watching this video than from studying them for a whole semester at university. Thank you!
Sebastian... What an absolutely wonderfully calm and soothing voice you have. If coding doesn't work out for you, I'm sure you could get a job narrating stuff.
As per usual, your videos always fall into one of two categories for me: 1. A calming exploration of programming explained with intuitive examples and metaphors, or 2. Black Magic explained with intuitive examples and metaphors. Either way, it has never been an easier choice to support someone's Patreon. You're a gem.
@Sebastian Lague Thanks for another masterpiece of clear illustrations and explanations. And just in case it somehow is not absolutely, painfully obvious: we all massively enjoy your "ridiculously long" videos. Not to mention the absolutely crazy amount of work you put into making them both interesting and clear. Kudos and best regards.
I wrapped up my MS in CS focusing on ML and CV last year, and I could have replaced hours of boring lectures with just this one video... especially when digging into neural networks. I'd still have needed to learn all the activation functions and other details, but this is a great introduction to the concept, done in an entertaining way.
Convolutional NNs are the key to creating good image-recognition networks (at least at the moment). It's pretty amazing how a couple of straightforward image transformations add so much information and allow CNNs to gain such an edge over traditional feed-forward networks. In fact, convolution-like processes, such as line detection, are even used by human vision. Your videos and explanations are incredible; I would love to see you dive deeper into the amazing field of ML!
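The "line detection" idea above can be shown with a tiny hand-rolled convolution. This is a minimal sketch (the 2x2 kernel is a made-up example, not anything from the video; real CNNs learn their kernels):

```python
# Minimal 2D convolution (really cross-correlation, as in most ML libraries):
# slide a small kernel over the image and sum the elementwise products.
def convolve2d(image, kernel):
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for y in range(ih - kh + 1):
        row = []
        for x in range(iw - kw + 1):
            s = sum(image[y + j][x + i] * kernel[j][i]
                    for j in range(kh) for i in range(kw))
            row.append(s)
        out.append(row)
    return out

# Image with a vertical edge down the middle: dark left half, bright right half
image = [[0, 0, 1, 1]] * 4
# Tiny vertical-edge kernel: responds where right column differs from left
kernel = [[-1, 1],
          [-1, 1]]

edges = convolve2d(image, kernel)
# Every output row is [0, 2, 0]: the response spikes exactly at the edge column
```

That single feature map already tells the rest of the network *where* the edge is, which is the extra information the comment above is pointing at.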
I wonder if a simple luminance/chrominance encoding of images could already significantly improve the performance of a simple network such as this one for the last exercise.
@@Flobyby unlikely. Maybe, for really small networks it might, but in general feature engineering (manually preprocessing the data) is almost always useless for neural networks, because the network can learn the best transformations for each particular task on its own. Actually, that's kind of the whole point of deep learning - to avoid having to manually hard code the features.
This video is probably the best video about neural networks and machine learning I have ever seen. Everything from the calculus to the neural network itself was really well explained.
It's a great video, and I really appreciate the effort you put into visualizing those concepts :) One little note on 6:50: you said that it doesn't make sense to change the size of the input or output. For neural networks that would be impractical, yeah. However, for many other classifiers you can increase the number of inputs, which includes making a logistic model have a "bendy" decision boundary. You may, for example, add an input and make it a nonlinear function of another input - input one squared, the sine of input two, the euclidean distance between a data point and (0,0). Then you can train a linear model on this augmented data, and it will be able to have a bendy decision boundary in the input space. That's the idea that e.g. SVMs exploit via the kernel trick - making a nonlinear problem a linear problem in a nonlinear space
Amazingly well-formatted and highly informative video; I went from near-zero experience with neural networks, to understanding them, to making my first network in MINUTES!!!
Did I just watch an hour-long calculus course, and not only do I think I understood everything and am going to try it on my own, but I wanted it to last 2 hours longer? My god, this channel never fails to impress me with its quality.
Lovely video as always! In your doodle for the helicopter problem, you might be testing it with a thinner (or less opaque) line than the training set. Thus, if the intensity of the line is affecting the activation of the neurons, you might not get the correct result. This could be the case for the numbers as well; the training set with numbers seems to be drawn with a thicker line than the one you use to test with.
The learn rate part reminded me of damped harmonic oscillation. If the damping coefficient is too low, it starts to oscillate around the steady state; if it's too high, it decreases, but not very fast; and if it's critically damped, it returns to the bottom as fast as possible.
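The analogy holds up surprisingly well in a tiny experiment (a sketch of my own, not from the video). On the cost w², each gradient descent step multiplies w by (1 - 2·lr), so the learn rate plays exactly the role of the damping coefficient:

```python
# Gradient descent on cost(w) = w^2, whose gradient is 2w.
# Each step: w -= lr * 2w, i.e. w is multiplied by (1 - 2*lr).
def run(lr, steps=50, w0=1.0):
    w = w0
    for _ in range(steps):
        w -= lr * 2 * w
    return w

slow = run(lr=0.05)   # "overdamped": factor 0.9 per step, converges slowly
fast = run(lr=0.45)   # near critical: factor 0.1 per step, converges fast
wild = run(lr=1.10)   # too high: factor -1.2, oscillates AND diverges
```

With lr between 0.5 and 1.0 (e.g. 0.9, giving a factor of -0.8), w flips sign each step while still shrinking - that's the underdamped oscillation around the minimum.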
Thank you so much for this! I think this is one of the first neural network videos that are fun and game-oriented, yet don't just gloss over the hidden layer and go "yeah it's machine magic it works" and then only focuses on the output and input. Very nice, helped a lot in grasping the concept. Please make more about other AI algorithms :D
Hi Sebastian Lague, I have just started calculus in high school and was confused about the application of the math I was learning, but I genuinely think this is one of the best explanations of calculus I have ever seen, as it actually gives a way to use it. Thank you so much, I really appreciate it.
Hey man I think you would really find plate tectonics simulations interesting. It kind of fits well with your procedural terrain generation series and is extremely interesting to read about.
As someone who recently studied a Data Science course, I guarantee you that you taught it way better than the professor. The visualizations, actual code, and your general calm state of mind and the pace you keep is amazing. Really looking forward to seeing more videos about advanced computer science concepts!
I feel like you have a natural talent for explaining things as optimally as possible. There's just enough information to not overload, without skipping through too much important stuff... and it's not just this video, it's all your videos.
I was always waiting for your next video - finally something to watch and explore! #thanksForVideos It's a really helpful explanation, with real examples
you're so underrated buddy
@@multiarray2320 ez
As someone who has some experience with machine learning, I can say this has to be the most intuitive explanation I have ever seen
indeed. this is amazing
That is exactly what I was going to say
As someone who has zero experience with ML/NN, I second this. What a great intro into the NN topic!
The only thing missing is matrix math (I find it much more intuitive for some reason but I may be weird)
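For what it's worth, the per-node sums from the video can indeed be written as matrix math: a layer's forward pass is just f(Wx + b). A hedged Python sketch (the weight values and the sigmoid choice are purely illustrative, not from the video):

```python
import math

# A dense layer's forward pass in matrix form: out = f(W x + b),
# where W has one row per output node and one column per input.
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def layer_forward(weights, biases, inputs):
    """weights: list of rows (one per output node); biases: one per output."""
    return [sigmoid(sum(w * x for w, x in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

W = [[0.5, -1.0], [2.0, 0.0]]  # 2 inputs -> 2 outputs
b = [0.0, -1.0]
print(layer_forward(W, b, [1.0, 1.0]))
```

The nested loops in the video compute exactly this product; the matrix view just groups them per layer.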
Completely agreed! He actually walked through EVERY STEP (that I can think of right now)! Which is great! One long video is soooo much better than scouring through dozens of bad ones (from experience)!
I am a software engineer and I've always wanted to learn machine learning. Being able to code is not a problem for me, but being terrible at maths and statistics makes it hard to get accustomed to all the terms and concepts.
I've tried multiple courses from different platforms and instructors, and all of them try to teach you "what" to do instead of "why" to do it. I personally find learning more intuitive when I know why I am doing something instead of blindly following steps.
This video is exactly the type of introduction to ML that I've always wished for. The explanations are highly intuitive and, most importantly, visual; there are no assumptions and no brushing over concepts. Nothing is done just for the sake of it, and everything is explained in simple language. I admit I'll still have to watch this video a couple more times to make full sense of it, because it's jam-packed with so much information.
You'll probably miss this comment in the sea of other comments (although I hope not), but I genuinely want to thank you. This video has relit my once-dead interest in ML. I would love to see more videos from you on this topic, or at least get some recommendations on where I can learn more.
Thank you, that’s wonderful to hear!
Exactly what I wanted to say. This is the first time I actually got to learn about the math behind machine learning, rather than just working with some ML library, and got to know the true essence behind it.
Great videos as always
The story of not liking doing something just for the sake of doing that very thing, without a thorough explanation, is literally my school story with trigonometry and derivatives :D I couldn't understand it back then, but no one cared to explain; they just told me to focus on the process itself.
Exact same situation here. Coding is fine. Maths... not so much. Certainly a bit of a lack of talent, but school wasn't too helpful with the usual attempt of explaining the what without the why. @Sebastian, your videos are by far the best source for many of the topics I'm engaging with.
My exact sentiment also!
I’ve never had someone explain calculus so intuitively.
The quality of this content is absolutely incredible.
He's not underrated, his content is stellar and known in the community for quite some time. He has almost a million subscribers, and everybody knowledgeable in Unity forums knows about his channel. Besides, in my opinion, after years of seeing all kinds of videos on game dev, Sebastian's are still one of the best, even old ones. And I'm not talking about just the style of presentation, I mean this strictly technically, his advice and his code are next level. Also I'm not young (as a human or as a game developer, the age is similar), and if I can learn a thing or two from him, and occasionally I do, that's instant five stars from me. Maybe he could become even more popular (though popularity comes with its own pressures and costs), but he's definitely not underrated.
It's pretty surprising that they don't teach differentiation from first principles in some places.
@@milanstevic8424 I am very sorry for abusing "underrated" in this manner. It's just that, in my personal opinion (which is often wrong), the concepts in his videos are explained so thoroughly, yet he makes them so easy to understand, that I don't see why more people don't love his content. And I know that CS isn't for everyone, but *just* (I'm using "just" very loosely here; this is no small accomplishment) 1 million subscribers for videos unrivalled in this level of quality is, as I said earlier, in my opinion, underrated.
Sorry, and I hope you understand; I might still be wrong.
Edit: to avoid confusion, I have removed "underrated" from the original comment.
@@rienkthegamer5422 Maybe I'm just picky with the words. This channel is definitely not as popular as, for example, Brackeys (although that channel has been officially dead for a year or so). But to say it's underrated is something else. Maybe you meant underappreciated by the general population / not recommended enough by the YouTube algorithm, and I can agree to an extent.
But let's ponder instead whether such front-page popularity is a good thing.
Those who do get to see his content would definitely recommend it, and in that sense it is not underrated. His channel is constantly growing, regardless of the algorithm. I've seen people recommend Seb's videos to people who cannot grasp even the most fundamental concepts, so that's why I've said that popularity isn't always the best thing. His videos and experiments are very smart and sometimes beautiful, but you still want the audience to understand the beauty and the effort behind them.
If you cultivate the wrong culture here (for example, children screaming that they get errors for the most mundane things and nobody understanding a word of what he says), his videos and his enthusiasm would certainly be affected by it.
So, with all that in mind, maybe "underrated" is a good thing? I definitely would not call 1M a bad subscriber count. That's gold on YouTube.
Btw, now that you've changed your message, I agree 100%. I'm probably just picky with the words.
Well then you probably don't watch enough maths-related videos; there are other talented maths teachers on YouTube, and saying that is an injustice to them.
The production quality is off the charts. As a software developer I can't even imagine the amount of hard work, research, technical knowledge, expertise, patience and determination this must have required. Hats off to you :)
Thank you!
Everything else aside, can we all appreciate the incredible visuals all of Sebastian's videos have? The animated graphs, the visualizations, the explanations... it's so pleasing :)
Yeah, Its so smoooooth 😍
I certainly appreciate it! Making good visuals is not easy!
@@jitin4179 He shared a link to the source code on his community page.
I'm guessing it's a Unity3D framework he made for doing visualizations in the engine.
@@To-mos Only Losers call it unity3d.
I wish somebody had explained calculus like this in school. Intuitive, descriptive, visual, simple, elegant. This content is marvellous.
Schools advertise it as such a complex topic, but that's simply because they don't teach it well. I was able to learn it in eighth grade while I was struggling with 10th-grade math. Schools are bad at teaching, but don't let that hold you back from accomplishing what you want.
Mfw Sebastian tricked me into learning calculus
We did get taught almost exactly like this in school. Most people didn't get it anyway.
You're probably not gonna see this message, but I want you to know you give me inspiration and motivation to learn, do, and achieve more, as I believe you do for many others.
I’m very happy to hear that, thank you!
True. His is the only channel I turn the bell 🔔 on for. The videos are always worth watching.
@@zeddoes Same here, I'm subbed to close to 100 channels and this is the only one I have the bell notification on.
My life would be very different if I hadn't come across his Coding Adventure playlist. I got into game dev and computer graphics because of Sebastian, and I fall in love every day with the beauty of things in this realm.
I love how this channel focuses on quality over quantity; shame the YouTube algorithm doesn't promote more of that.
I am working full time as an ML Engineer.... I wish I could go back in time and watch this video when I was in college.... Top quality video!
"I'm bad at naming things"
There are only 2 hard problems in computer science: cache invalidation, naming variables, and off-by-one errors
Segmentation faults 💀
I see what you did there XD
dang
Truth.
0, 1, 2. Thank goodness, I thought there was an off-by-one error there
Sebastian: "The End"
Neural Network: "Oh, that's for sure a tractor"
25:18 OMG, and at this point I've completely realized the true nature of the derivative - why it becomes a slope function, why x^2 turns into 2x and so on.
It was one of those mind-blowing moments of insight, which most of us have experienced at least once in our life.
Thank you, Sebastian!
I haven't even learned calculus and this was my first introduction to it. To my surprise, I understood it perfectly and I'm probably gonna start self studying calculus before I study it in school.
same
The word you are looking for is epiphany.
@@To-mos Ah yes, thank you.
I had the same when he was going on about adding and multiplying many values together, when it struck me: addition (and thus multiplication) is computationally a lot less expensive than division, so adding/multiplying tens of numbers can be as fast as, if not faster than, a division. That's probably a reason behind all the ridiculous math: to speed everything up by doing what computers are best at (addition), so it can be optimized further (in terms of algorithms) without major performance hits.
This is the most intuitive explanation of machine learning I've watched. I hope you return back to it soon!
It’s such a bittersweet moment watching your videos as they’ve recently come out because I know such great content with such level of detail takes so long to produce and it’s going to be a long and sad time until your next video comes out. I just love your videos man, everything you touch becomes gold, you make so many topics that are so boringly taught at uni seem sooooo exciting!
I cannot believe how lucky I am (how ALL of us are) that you sir exist. This quality, effort and precision put into these videos... Just wow... Thank you!
Sebastian Lague, your videos might go down in history as the most well-produced educational videos on YouTube of all time. And it's hard to even say that something beats 3B1B's videos.
yes but there is another insane, underrated guy who, imho, exceeds Seb in terms of production value: braintruffle
@@SamirPatnaik What videos does he create?
@@jaideepshekhar4621 From what I got from a quick search, it seems to be focused on simulating stuff
@@jaideepshekhar4621 He creates videos that almost no one fully understands; even PhD-level viewers struggle with his explanations, so a lot of it comes down to the production.
His videos are very, very high-level physics.
I love every single video that you make, you make such amazing stuff! I wish you could bring back the "How computers work" series which made me discover my passion ★
Thanks, I'm happy you enjoy them! Might be a little while before the next computers video, but I do still have plans for that series!
@@SebastianLague 😄
I also really liked the How Computers Work series. It is definitely one of the fundamentals of computing I knew the least about, along with many other people, I assume.
In the meanwhile, if you haven't already, check out Ben Eater's videos! He has some great explanations, from transistor-level logic gates to how these can be used to create all the building blocks of a basic processor and how it handles assembly code.
I began loving this video the second I realized you had explained derivatives without actually mentioning them. I love practical approaches!
It's impressive what Sebastian can achieve with enough time and dedication
It's amazing what time and dedication can achieve... with enough Sebastian.
?
Don't underestimate yourself; you too can do these kinds of things with enough time and dedication. The dedication is the impressive part.
Anyone*
hello, V
Whoever you are random stranger on the interwebs, I'm telling you: You can do the same! All you need is dedication, which helps you find the time.
Holy crap that car in the beginning even found like a line of best fit. Crazy.
Having worked extensively with neural networks some 10 years ago, I must say this is hands-down the best explanation I have seen for people who are new to it. Excellent visualisations and explanations. It is so great for someone to start from the absolute basics (a simple perceptron) and work up, instead of going directly to PyTorch or TensorFlow.
Best on YouTube, the vibe is insane! Love you and your videos
I'm in awe of how good this content is. Production values are superb: voice is really easy to listen to with clear diction and pleasing accent. Graphics clear, smooth, and helpful. Content is out of this world. The explanations are great. I've not seen someone cover the whole thing virtually from scratch and yet at no stage does it feel like we're getting bogged down.
bros mouse movement is so majestic
Speak on that
You're such a gifted educator. I really appreciate the quality and the amount of work that goes into these videos; inspiring stuff!
This is stunningly good. I love the visualisations.
Ok, the way you explained derivatives at 25:00 makes soooo much more sense. It perfectly shows why the first derivative of x^2 + ax + b is 2x + a. I'm now so angry this wasn't explained to me properly when I was in university. I also believe you are secretly a maths professor.
I would argue your high school teacher should've taught you derivatives. Unless, however, you took basic math in high school (which isn't a bad thing).
@@jamesmnguyen not everyone went to school in the same system you did friend
@@KnowledgePerformance7 I understand that.
You can keep at most 4 new things in your brain at a time. Perhaps your lecturer explained tangents, secants, slopes, limits, the limit definition of the derivative, specific derivative calculations, dx notation, prime notation, etc. in too few lessons. Now that you are familiar with these things, it is much easier to follow.
I don't think he is a math professor, because he can actually explain it xD
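The "nudge x and measure the change" idea being praised above can be checked numerically: the slope estimate of x^2 + ax + b really does approach 2x + a as the nudge shrinks. A tiny sketch (coefficients a = 3, b = 5 are made up for illustration):

```python
# Numerically verifying that d/dx (x^2 + a*x + b) = 2x + a,
# using the same finite-nudge idea from the video.
def f(x):
    a, b = 3.0, 5.0  # example coefficients
    return x * x + a * x + b

def slope_estimate(x, h=1e-6):
    """Rise over run for a tiny nudge h; approaches 2x + a as h shrinks."""
    return (f(x + h) - f(x)) / h

for x in [0.0, 1.0, 2.5]:
    print(x, slope_estimate(x), 2 * x + 3.0)
```

The +b term vanishes from the slope because shifting a curve up or down doesn't change its steepness.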
This is probably the best (99.5% confident, 0.02% confident that it is second best) video/tutorial on neural networks I have seen. Your intuitive explanations coupled with showing the naive to optimal approaches is very effective. Definitely showing this at my school's computer science club in September!
I love the visual explanations and the hand-setting of the weights; it's a really intuitive explanation of the networks.
Easily the best video on neural networks. I have been looking for a good explanation of how neural networks work, but none of them explained it in such an understandable way. THANK YOU SO MUCH.
I couldn't use any word other than "perfect" to describe this video. The visuals, explanations, and overall progression were exceptional and on point; every concept contributed to the next one, and nothing was rushed or under- or over-explained.
I wish more people had the opportunity to find your amazing channel, because these videos are truly special.
New to the channel, I see :) This guy is great.
This is the most comprehensive neural network video on YouTube. I have been tinkering with toy networks just for fun, and I am interested mostly in the whys and hows of these systems. In this single video you gave me all the answers I had been looking for. Thank you very much.
You're such a great educator/communicator. You explained gradient descent in a way that makes simple intuitive sense to me in about 2 minutes, whereas it's something I've only understood abstractly in over 15 years of trying to understand this stuff!
Just found this video again. This is the video my team and I watched before going to our first coding competition. We have come so far. This is where we started, and now we're programming networks such as YOLO, R-CNN, … Damn.
thank you 🙏
I've always wanted a great demonstration of programming in a similar way someone shows progress of machining something on a lathe/mill/forge/woodworking. Most other videos skip over "all the boring bits" when they get to some coding, but you have absolutely nailed it ❤
Might you be talking about "Stuff Made Here"? It's my only complaint about that channel. I mean, yeah, I'm a programmer so maybe I'm more interested in the programming bits than the average person, and I can also understand it can be difficult to coherently show code in an edited video. But a lot of his projects are something like 50% or more code, and he's a wicked smart programmer, so I wish he included more of it.
On the extreme end of "demonstrating programming," Handmade Hero is a (very) long running series of originally-streamed videos where a professional systems game dev builds a modern, professional game/engine from scratch in C. (Really it's C++ but it's a very C-like subset of features.) Even if you don't watch the whole thing, the first ~20-30 videos are enormously enlightening w/r/t the fundamentals of lower level application programming. (e.g., I now know how you can use the Win32 API to open windows, handle keyboard/controller input, play sound, and so on.) Near the end of that first streak of Windows-specific videos he shows off some truly remarkable techniques for custom hot reloading and input recording/playback.
@@tz4601 661 videos, each nearly an hour long, starting from 7 years ago. Just respect.
I had the intention to build a game engine in C, but then I realized it is much harder to write in pure C, and I didn't want to learn C++. I also realized that I was mostly motivated by all the YouTube videos where people code their games/engines from scratch, and I had little idea how one actually works. So I dropped the engine idea for now (planning it for later) and started to build a terminal text editor in C. Even that simple project has gotten me so far (far in my own terms) that I've become comfortable with libraries, compilers, linking them, and searching through old documentation, etc. I don't know where I'm going with explaining all this lol.
As for the recommendation you gave, thanks. I probably won't watch it because it is hella long and I like to learn hands-on, with trial and error, searching for problems on my own. But if I ever start to code an engine and have a problem that I can't solve no matter what, this will probably be my go-to source.
The series by 3Blue1Brown on neural networks is kinda similar and dives even deeper into the calculus.
This video is so beautiful. I'm working on a college project with Neural Networks, and you, sir, have given me such deep insight into how one must approach the same. Thank you and may God bless you
You have never made a bad video. My favourite video in the notification box updates every time you upload, from your last video to this one. Anyway, point being, you make my favourite videos on this platform.
One of the best videos about neural networks I have ever seen!
Sebastian Lague is the only YouTuber who can make a 55-minute-long video feel like no time at all.
Your videos are just like air shows. I never quite know what is happening, and almost everything goes over my head, but I still have a lot of fun.
This Channel has been like Bob Ross for me. I may have very little idea how to code like him, but I love watching how excited he gets over his projects.
Great video! I'm a senior college student studying computer science at my local university. I've taken a few machine learning classes, and this kind of stuff really fascinates me. I really enjoyed the explanation of derivatives and how they fit into the design of a neural network.
There are a couple of things you could possibly do to further improve performance on the datasets. One is known as cross-validation. The idea is that you take the training data and further split it into some number of "folds", train on every fold except one, validate the score on the held-out fold, and repeat this for each choice of held-out fold. This in turn makes the predictions much more robust and helps prevent overfitting.
Another thing you could do is known as grid search, where you test various parameters and functions (known as hyperparameters in ML speak), training the model for each combination of parameters and picking the best-performing model. This is very brute force (and as far as I know there is no easy way to optimize it), so bear that in mind.
You can also combine these two techniques, doing cross-validation on each version of the model in the grid search. This, however, can be very computationally taxing (especially for a neural network), but in some cases you can improve the test scores.
One last thing you can do is apply a scaling function to the inputs before you train the model. This is because neural networks tend to perform better when the input values are close to each other in magnitude. (Don't ask me why, because frankly I don't know.) For example, with the handwritten digits, all of the pixel values are between 0 and 255, so you could simply divide all the pixel values by 255 to get them between 0 and 1. This will be a bit more complicated for color images, though.
I haven't had too much experience working with neural networks, so I'm not completely sure if these techniques will work for you. I did, however, do a final project one semester where we trained a neural network to recognize the handwritten digits you showcased. We did something similar to what I described above and got some good results. Our best test score ended up being 98.2%.
But honestly, the neural network you designed is doing really good already!
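To make the two suggestions above concrete, here's a minimal Python sketch of pixel scaling and a k-fold index split (the fold logic is a hand-rolled illustration; real projects would typically use a library helper, and it ignores the leftover samples when n isn't divisible by k):

```python
# Two ideas from the comment above: scale 0..255 pixels into 0..1,
# and split n samples into k folds for cross-validation.

def scale_pixels(image):
    """Map raw 0..255 pixel values to the 0..1 range."""
    return [p / 255.0 for p in image]

def k_fold_indices(n_samples, k):
    """Yield (train_indices, val_indices) pairs, one per fold."""
    fold = n_samples // k
    idx = list(range(n_samples))
    for i in range(k):
        val = idx[i * fold:(i + 1) * fold]
        train = idx[:i * fold] + idx[(i + 1) * fold:]
        yield train, val

for train, val in k_fold_indices(10, 5):
    print("train:", train, "validate:", val)
```

Each fold gets one turn as the validation set, so every sample is validated on exactly once, and the averaged score is a more robust estimate than a single train/test split.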
This is actually an amazing explanation of backpropagation, good job!
I would recommend that anyone who wants to learn backpropagation just take a piece of paper and a pencil and derive those formulas themselves.
Trust me, it helped me immensely when I was learning backpropagation a few months ago.
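The pencil-and-paper exercise suggested above can even be checked in code. For a single sigmoid neuron with squared-error cost, the chain rule gives dC/dw = 2(out - target) * out(1 - out) * x, and a finite difference confirms it (all the numbers below are made up for illustration):

```python
import math

# One sigmoid neuron: out = sigmoid(w*x + b), cost = (out - target)^2.
def forward(w, b, x):
    return 1.0 / (1.0 + math.exp(-(w * x + b)))

def grad_w(w, b, x, target):
    """Chain rule: dC/dout * dout/dz * dz/dw."""
    out = forward(w, b, x)
    return 2.0 * (out - target) * out * (1.0 - out) * x

# Sanity check against nudging w by a tiny amount:
w, b, x, t = 0.8, 0.1, 1.5, 1.0
h = 1e-6
numeric = ((forward(w + h, b, x) - t) ** 2
           - (forward(w, b, x) - t) ** 2) / h
print(grad_w(w, b, x, t), numeric)
```

If your hand-derived formula disagrees with the numeric nudge, you've made an algebra slip somewhere, which is exactly what makes this such a useful exercise.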
6 minutes in and I got one of the most intuitive explanations of weights and biases
This is by far the best explanation I have ever seen of neural networks, not because it is particularly precise, but because it does a fantastic job of making the concepts of nodes, weights, biases, activation functions, and gradient descent seem very intuitive, which is rare for such a theoretical subject. Outstanding job!
You are a freaking genius. My professor failed to explain that stuff in half a year; you did it in a one-hour video. You are just brilliant.
I never understood maths until I started attempting basic code. These videos are the first ones that actually make sense because I can properly visualize the maths.
Dude just explained multivariable calculus + machine learning and genetic algorithms very comprehensively, extremely well written script.
Keep up the great work
Please continue to make hour long videos, it's so relaxing to watch !
I am astonished by the cavalier way that Sebastian charges through Calculus and Differential Equations and somehow manages to make it clear and understandable. It's very impressive to see someone take topics as deep and difficult as these and be able to extract exactly the amount of information needed to illustrate the point clearly without getting bogged down. Pretty cool to watch for that reason alone.
This video is looking awesome. Do you have your own animation library, like manim or the ones from aarthificial or Freya Holmér?
Thanks! I have my own library, although that’s a generous term to be honest, it’s more like a loose collection of scripts scattered across several projects (which I always have to hunt down and repurpose for whatever I’m working on). Really need to improve that some day! I have recently been experimenting with Freya’s (brilliant) Shapes plugin for rendering lines and points though, so some of the graphs and a few other things in this video are using that.
@@SebastianLague Been a huge fan for a long time! Would you be able to share more about your library? I've used manim a lot but I'd be very interested to see how you create your animations, even if you only show a small bit of it.
This video has so much animation. Did it take longer to code the project itself or make the video?
It's awesome how, after even having classes in college about neural networks, I finally understood how they work *in practice*. I studied the theory and saw a lot of "for dummies" explanations about NNs, but people usually abstract the actual code from their explanations and this used to frustrate me a lot. Thank you so much for the explanations, Sebastian; your content is gold.
I've been working on a basic neural network just because I wanted to learn this. I already knew the basics, but it's so cool to see you working on these. This really helped cement the knowledge I'd learned.
This is far and away the best explanation of NNs I've ever seen, and I took college courses on NNs. The included explanation of the calculus behind it was amazing as well.
Any reasonable person: “I will use one of the many existing Python libraries that implement backpropagation for me in an efficient and easy-to-use way”
Sebastian: “I like writing C#”
Jokes aside, very cool and impressive project as always. And of course presented in a stunning and intuitive fashion, keep ‘em coming!
“I will use one of the many existing Python libraries that implement backpropagation for me in an efficient and easy-to-use way”
I hate that it's like 99% of online courses
The reasonable person learned nothing.
@@weckar The reasonable person learned you don't have to reinvent the wheel every time you want to build a wagon.
@@notahotshot It's pretty good to know how a wheel is made for your first wagon, though.
@@notahotshot That's great until you need to heavily customize a wagon for a new kind of task, but those old wheels somehow don't work well with it and you have no idea why, because you never learned how those wheels even work. This is a very common problem in programming in recent years. There are so many programmers now who never learned the basics (e.g. they often don't even consider that they operate on physical hardware with actual memory); new apps with the same UI and functionality we were using 20+ years ago start lagging horribly on a computer 10,000x faster than what we had back then. Oh, and you'd better have a few GB of memory for those nice fonts...
A one-hour-long Sebastian video about neural networks just gives me so much hype to enjoy and learn.
I've spent many hours learning about ML algorithms, with a lot of that time spent on MLPs. I've worked with PyTorch and TensorFlow before. I thought I understood everything pretty thoroughly, but just that first example of manually tweaking weights and biases and seeing how they affect the output graph showed me something new. Your videos are incredible and inspire a whole generation of programmers.
And then there's me, who's been stuck on regressions for the past few days.
This is gold. I've seen many videos in my quest to fully understand this stuff, and I have never seen such an intuitive video on this topic, ever. And in just 54 minutes, you clearly walked us through weights and biases, the purpose of non-linearity, gradient descent, and calculus intuition! Yeah, I've just subscribed; keep being a legend.
I've had trouble learning derivatives for 5 years; I dropped out of university because no one would explain to me why we use them, just how. Your channel is a blessing.
Did you take the optional math course?
It was probably taught there
@@polygontower Did you go to the same educational institute as them? What's that? You don't know? Then who are you to make such a comment?
@@tams805 No. An educated guess.
An educated guess. An educated guess.
@@polygontower someone has learned something from grade 11 chapter 1 mathematics.
Always take an educated guess.
@@polygontower i just want to know why you repeated that 3 times
That's a really nice way of introducing the neural networks. Explaining the different parameters and fiddling with them by hand + the visualization, before starting to explain how we can make our computer do the fiddling. Very cool idea!
As someone who watched a lot of these videos while writing my bachelor's thesis on NNUEs (a specific kind of neural network for chess), I can safely say this is the best introduction to neural networks I've seen. I absolutely love all the visualization, and how you start from the ground up but still include the calculus, etc. Fun fact: my thesis was somewhat inspired by your chess engine video as well. I love your content, becoming a patron now!
Thank you!
I wanted to write my bachelor's thesis on chess algorithms, but my professor told me it was too complicated… Would you mind sharing your work? I would love to check it out!!
THANK YOU, and I'm impressed. I'm an old-school career "systems analyst", AKA "software engineer". My coding skills are extensive. BUT THIS - I've been working on this on my own (and considering going back to college for refresher courses targeting this subject matter).
I JUST needed to TELL YOU: this video was a big help for the holes in my research... especially the explanations of "how" and "when" to adjust "what" weight.
THE BEST PARTS:
(1) Nowadays, the software I work on is written in C++ (coming from the Assembly, COBOL, Modula-2, etc. days). I read and understand code better than I read and understand explanations in books 😂 LOL! Thank you for the explanations in C# code.
(2) Your demonstrations of the LESS efficient way explain MORE, prior to your demonstration of the more efficient one (e.g. the cost). THANK you for including things like that in this video.
I know you're not gonna see this message, but I hope you know that what you do is amazing. I am still in high school and haven't learned calculus yet, and even so, your explanation of calculus just made so much sense to me. Every time I tried to learn calculus I could not wrap my head around what derivatives were or what they were meant to do, but I understood it in a few minutes just from your example! Thank you, Sebastian.
One of the best tutorials on ANNs I have come across in the 2-3 years I've been working with them. Very intuitive, bravo. You must really know your stuff.
Props to you for not just using sklearn or PyTorch! You actually built the NN from nothing. That’s a dream of mine.
This video was so entertaining. I actually come from Syria and my English is pretty bad, but this video was so well explained that I didn't even have to rewind it. Thank you very much for these videos, and I hope you keep making such amazing and educative content. PLEASE keep making these videos; I never get bored of this, even though I don't use it for anything practical.
I’m happy you enjoyed it, thank you!
Sebastian :"Just open your favorite code editor, type in a few lines and there is your multiverse simulator with special effects"
Always a treat to watch your "magic" man :)
This video is legitimately incredible. I genuinely believe this is one of the best explanation videos I've seen, across this whole site.
From just the first couple of minutes of your video, I was able to code my own working classifier. I love how you build up everything from first principles and also show your first "naive" implementations before moving on to the more optimized versions. It really makes all the moving parts easy to understand. Excellent work!
The caliber, quality, and attention to detail in your videos is outstanding. I gained a better and more fundamental understanding of neural networks from watching this video than from studying them for a whole semester at university. Thank you!
I wish I had seen this video when I was in an AI class back then. It would have been so much easier.
Sebastian... what an absolutely wonderfully calm and soothing voice you have. If coding doesn't work out for you, I'm sure you could get a job narrating stuff.
Best video on machine learning that I have ever seen. How is this even free? This is what the internet & YouTube were made for.
Adding on to the pile: most intuitive video about the underlying concepts of neural networks EVER!
Sometimes I forget this guy is real and not just a voice in my head teaching me everything
As per usual your videos always fall in to one of two categories for me: 1. A calming exploration of programming explained with intuitive examples and metaphors or 2. Black Magic explained with intuitive examples and metaphors. Either way, it has never been an easier choice to support someone's Patreon. You're a gem.
How people are this smart just boggles me, I love how well you simplify things though, the little simulations are just perfect once again :)
@Sebastian Lague
Thanks for another masterpiece of clear illustrations and explanations.
And just in case that it somehow is not absolutely painfully obvious, we all massively enjoy Your "ridiculously long" videos. Not to mention the absolutely crazy amount of work You put into making them both interesting and clear.
Kudos and best regards.
I'm happy you liked it, thank you!
Been learning a lot from you these last two weeks. You are amazing.
I wrapped up my MS in CS focusing on ML and CV last year, and I could have replaced hours of boring lectures with just this one video... especially when digging into neural networks. I'd still have needed to know all the activation functions and other details, but this is a great introduction to the concept done in an entertaining way.
Convolutional NNs are the key to creating good image-recognition networks (at least at the moment). It's pretty amazing how a couple of straightforward image transformations add so much information and allow CNNs to gain such an edge over traditional feed-forward networks. In fact, convolution-like processes are even used by humans, for example in line detection.
Your videos and explanations are incredible, I would love to see you dive deeper into the amazing field of ML!
I wonder if a simple luminance/chrominance encoding of images could already significantly improve the performance of a simple network such as this one for the last exercise.
@Flobyby Unlikely. Maybe for really small networks it might, but in general feature engineering (manually preprocessing the data) is almost always useless for neural networks, because the network can learn the best transformations for each particular task on its own. Actually, that's kind of the whole point of deep learning: avoiding having to manually hard-code the features.
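For anyone curious what the convolution step mentioned above actually looks like, here is a minimal, self-contained sketch (my own illustration, not from the video; the kernel and toy image are invented for the example). It slides a small vertical-edge kernel over a tiny image, which is the kind of transformation a CNN applies at each layer:

```python
def convolve2d(image, kernel):
    """Valid-mode 2D convolution (really cross-correlation, as in most CNN libraries)."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = [[0] * out_w for _ in range(out_h)]
    for y in range(out_h):
        for x in range(out_w):
            out[y][x] = sum(
                image[y + i][x + j] * kernel[i][j]
                for i in range(kh) for j in range(kw)
            )
    return out

# A classic Sobel-style vertical-edge kernel: it responds strongly where
# pixel intensity changes from left to right.
vertical_edge = [[-1, 0, 1],
                 [-2, 0, 2],
                 [-1, 0, 1]]

# Tiny 5x5 "image" with a bright vertical stripe on the right half.
image = [[0, 0, 0, 1, 1],
         [0, 0, 0, 1, 1],
         [0, 0, 0, 1, 1],
         [0, 0, 0, 1, 1],
         [0, 0, 0, 1, 1]]

response = convolve2d(image, vertical_edge)
for row in response:
    print(row)  # large values exactly where the vertical edge sits
```

The output map peaks along the stripe's left edge and is zero in the flat regions, which is the extra information a CNN's later layers get to work with.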
One of the best videos for explaining neural networks internal mechanisms without presenting them overly complex. Well done!
This video is probably the best video about neural network and machine learning I have ever seen.
Everything from the calculus to the neural network itself was really well explained.
It's a great video and I really appreciate the effort you put into visualizing those concepts :) I made a little note
At 6:50, you said that it doesn't make sense to change the size of the input or output. For neural networks that would be impractical, yeah. However, for many other classifiers you can increase the number of inputs, and that includes making a logistic model have a "bendy" decision boundary. You may, for example, add an input and make it a nonlinear function of another input, like input one squared, the sine of input two, or the Euclidean distance between a data point and (0,0). Then you can train a linear model on this augmented data and it will be able to have a bendy decision boundary in input space. That's what e.g. SVMs use, and it's called the kernel trick: making a nonlinear problem a linear problem in a nonlinear space.
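To make the augmented-input idea above concrete, here is a small experiment of my own (not from the video; all names and numbers are illustrative). Plain logistic regression is trained twice on the same toy dataset whose true boundary is a circle: once on the raw (x1, x2) inputs, and once with a hand-added squared-radius feature, which lets the linear model bend its boundary:

```python
import math
import random

random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(points, labels, features, steps=2000, lr=0.5):
    """Plain logistic regression trained by batch gradient descent."""
    n = len(features(points[0]))
    w, b = [0.0] * n, 0.0
    for _ in range(steps):
        gw, gb = [0.0] * n, 0.0
        for p, y in zip(points, labels):
            f = features(p)
            err = sigmoid(sum(wi * fi for wi, fi in zip(w, f)) + b) - y
            for i in range(n):
                gw[i] += err * f[i]
            gb += err
        w = [wi - lr * g / len(points) for wi, g in zip(w, gw)]
        b -= lr * gb / len(points)
    return w, b

def accuracy(points, labels, features, w, b):
    hits = sum(
        (sigmoid(sum(wi * fi for wi, fi in zip(w, features(p))) + b) > 0.5) == (y == 1)
        for p, y in zip(points, labels)
    )
    return hits / len(points)

# Toy data: label 1 if the point lies inside the unit circle, a boundary
# no straight line can capture.
points = [(random.uniform(-2, 2), random.uniform(-2, 2)) for _ in range(200)]
labels = [1 if x * x + y * y < 1 else 0 for x, y in points]

raw = lambda p: [p[0], p[1]]                               # straight-line boundary only
augmented = lambda p: [p[0], p[1], p[0] ** 2 + p[1] ** 2]  # adds squared radius

acc_raw = accuracy(points, labels, raw, *train(points, labels, raw))
acc_aug = accuracy(points, labels, augmented, *train(points, labels, augmented))
print(f"raw features:       {acc_raw:.2f}")
print(f"augmented features: {acc_aug:.2f}")
```

The raw model can do no better than guessing the majority class, while the augmented model can place its linear boundary at a fixed squared radius, which is exactly a circle in the original input space.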
Good point, thank you!
Amazingly well-formatted and highly informational video. I went from near-zero experience in Neural Networks to understanding them and making my first Network in MINUTES!!!
Did I just watch an hour-long calculus course, and not only do I think I understood everything and am going to try it on my own, but I wanted it to last 2 hours longer? My god, this channel never fails to impress me with its quality.
The best and most comprehensive introduction to neural networks I have ever seen.
Lovely video as always! In your doodle of the helicopter problem, you might be testing it with a thinner (or less opaque) line than the training set. Thus, if the intensity of the line is affecting the activation of the neurons, you might not get the correct result. This could be the case for the numbers as well: the training set of numbers seems to be drawn with a thicker line than the one you use to test with.
So, in theory, vectorization of the input would help?
As a data scientist, this might be the best tutorial on the fundamentals of machine learning I've seen.
That graph renderer you made looks almost exactly like Desmos with a dark theme enforced by the Dark Reader extension.
This is perhaps the best introduction to the basics of neural networks that I've come across.
The learn rate part reminded me of damped harmonic oscillation. For a damping coefficient that's too low it will start to oscillate around the steady state, if it's too high it will decrease but not very fast, but if it's critically damped it will return as fast as possible to the bottom.
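The damping analogy above can be made concrete with a tiny experiment (my own sketch, not from the video). For gradient descent on f(x) = x², whose gradient is 2x, each update multiplies x by (1 - 2·learn rate), so the three damping regimes show up directly:

```python
def descend(learn_rate, x=1.0, steps=8):
    """Run gradient descent on f(x) = x^2 and record the path of x."""
    path = [x]
    for _ in range(steps):
        x -= learn_rate * 2 * x  # gradient of x^2 is 2x
        path.append(x)
    return path

overshoot = descend(0.9)   # too high: overshoots and oscillates around the minimum
crawl = descend(0.05)      # too low: decays toward the minimum, but slowly
critical = descend(0.5)    # "critically damped": lands on the minimum in one step

print("lr=0.9 :", [round(v, 3) for v in overshoot])
print("lr=0.05:", [round(v, 3) for v in crawl])
print("lr=0.5 :", [round(v, 3) for v in critical])
```

With lr = 0.9 the sign of x flips every step (underdamped oscillation), with lr = 0.05 it creeps down monotonically (overdamped), and for this particular quadratic lr = 0.5 reaches the bottom immediately. Anything above lr = 1.0 would diverge.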
Thank you so much for this! I think this is one of the first neural network videos that are fun and game-oriented, yet don't just gloss over the hidden layer and go "yeah it's machine magic it works" and then only focuses on the output and input. Very nice, helped a lot in grasping the concept. Please make more about other AI algorithms :D
I like that you condensed a whole semester course of artificial intelligence at university into 55 minutes :D
Hi Sebastian Lague, I have just started calculus in high school and was confused about the application of the math I was learning, but I genuinely think this is one of the best explanations of calculus I have ever seen, as it actually gives a way to use it. Thank you so much, I really appreciate it.
Hey man I think you would really find plate tectonics simulations interesting. It kind of fits well with your procedural terrain generation series and is extremely interesting to read about.
As someone who recently studied a Data Science course, I guarantee you that you taught it way better than the professor. The visualizations, actual code, and your general calm state of mind and the pace you keep is amazing. Really looking forward to seeing more videos about advanced computer science concepts!
This is exactly what I learned in my first year calc class. Without the fancy visualization. I feel like I understand it better now 😂
I feel like you have a natural talent for explaining things as optimally as possible. There's just enough information to not overload, without skipping too much important stuff... and it's not just this video, it's all your videos.
The way he slowly breaks the problems into smaller chunks meant even my small brain could digest it tho :D