The full Neural Networks playlist, from the basics to deep learning, is here: ua-cam.com/video/CqOfi41LfDw/v-deo.html
Support StatQuest by buying my book The StatQuest Illustrated Guide to Machine Learning or a Study Guide or Merch!!! statquest.org/statquest-store/
NOTE: A lot of people ask where the values on each connection come from that we use to multiply and add to the inputs. This question is answered at 3:48 - in short, they are found using a method called Backpropagation. If you want to learn more about backpropagation, check out this video: ua-cam.com/video/IN2XmBhILt4/v-deo.html
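In case a rough picture helps alongside the linked video: below is a minimal, purely illustrative Python sketch of the gradient-descent idea that backpropagation is built on. The toy data, starting values, learning rate, and single-softplus model are made-up assumptions for this sketch, not the video's actual example or exact method.

```python
# Hypothetical minimal sketch: estimating one weight and one bias of a tiny
# model y = softplus(w*x + b) by gradient descent, the core idea behind
# backpropagation. The data, starting values, and learning rate are made up.
import math

data = [(0.0, 0.0), (0.5, 1.0), (1.0, 0.0)]  # toy (dose, efficacy)-style points
w, b = 0.1, 0.0     # initial guesses for the weight and bias
lr = 0.1            # learning rate

def softplus(z):
    return math.log(1.0 + math.exp(z))

def d_softplus(z):  # the derivative of softplus is the sigmoid
    return 1.0 / (1.0 + math.exp(-z))

for step in range(1000):
    dw, db = 0.0, 0.0
    for x, target in data:
        z = w * x + b
        err = softplus(z) - target     # d(squared error)/d(prediction), up to a factor of 2
        dw += err * d_softplus(z) * x  # chain rule back through softplus, then through w
        db += err * d_softplus(z)      # and through b
    w -= lr * dw                       # nudge each parameter downhill
    b -= lr * db

print(f"estimated w = {w:.3f}, b = {b:.3f}")
```

A single softplus cannot fit a bump-shaped dataset well, of course; the point of the sketch is only the update rule, where each parameter is repeatedly nudged in the direction that reduces the error.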
Bamm
Calling it a squiggle fitting machine is genius
I can't believe I've only just found out you published a book! I just bought it, can't wait for it to arrive! ❤
@@ThisGuy0903 Hooray!!! Thank you so much for your support! :)
just wooow...never thought someone could explain things in this way.. GREAT WORK
Someone give this guy an elixir of eternal life so he never stops making videos like this
Thanks! :)
How can you understand what the heck he is talking about? It's an extreme amount of math equations; can you solve those equations?
@@yashaswikulshreshtha1588 what is wrong with you? The math presented is NOT difficult, it is literally basic calculus. Yes, we do understand the math. If you are struggling with this, I recommend that you retake calculus and algebra then come back here once you have a good foundation. None of this will make sense if your math knowledge is weak.
@@navjotsingh2251 Yeah, honestly, sometimes I like maths, but I don't know if it's me or if my teacher just never explained it to me properly.
@@navjotsingh2251 Exactly. When I posted this comment I honestly hadn't taken any of those courses, not even calculus, so it was overwhelming for me.
I am a professor in computer science, and did my dissertation on Bayesian methods in AI, and I will say that this is the best explanation of neural networks I have ever seen. It does no hand-waving, and makes no assumptions about the viewer's math skills. Coming late to the CS party in my 20s, and not being from a strong math background, I can really appreciate this approach. I will be using this video series in my Intelligent Systems class. Thank you.
Wow! Thank you very much! Since you like the video, I'll be shameless and say that you might also like...the book! I cover this and a bunch of other ML topics in The StatQuest Illustrated Guide to Machine Learning here: www.amazon.com/dp/B0BLM4TLPY
@@statquest I will check it out!!
The tutorials are amazing. Love the simplicity you bring to the allegedly complicated concepts.
WOW! Thank you so much for supporting StatQuest! BAM! :)
*As a UA-camr myself, I really appreciate the amount of work that must have been put into this. Amazing video! :)*
Thank you very much! :)
All the best for your channel ..
@@abhishekm4996 Thank you! :)
100% Agree
As a non-UA-camr, I can also appreciate the amount of work that has gone into this! ❣️
For Christ sake, this guy is brilliant! Taking something very intimidating and showing how it works is worth gold. I am embarrassed to say that I am a scientist, have been for over 20 years. We over complicate and write papers to sound smart, but really never really say anything. And good luck ever trying to reproduce those results in the paper. This is how technology and science should be taught and discussed with our peers.
Wow, thank you!
People like you are shaping the future of students and people who are willing to learn. May God bless you and fulfill all your dreams. Josh you are such an amazing human being.
Thank you very much! :)
Thank you very much! :)
Triple BAM congratulations! I bled through this six years ago, and that took me months. Here, you condensed the concepts into 18 minutes, and I still learned a lot! You are a blessing!! Please keep posting more videos!
Wow, thank you!
I had tried to learn Neural Networks multiple times, only to find that it gets more and more complicated due to the equations and terminology. This is the first explanation I have come across that is so clear, concise, and to the point, without drowning in equations and jargon. Thank you so much.
Thank you! I'm glad to hear that the video was helpful!
Master's student here. I went to my first NLP class and didn't understand a word. After class, I started watching this series, and now I can understand what my prof says in class. :)
bam! :)
Taking machine learning right now and my professor sucks; these videos are a huge help and make things make sense
I'm glad they are helpful! :)
I just can't express how grateful I am for these videos, you have a stunning ability to make concepts stick to my and other people's mind. You're the greatest in this field!
Wow, thank you!
Man, thank you for this. This is the most simple and concise way to explain NN I've ever seen online.
Glad you liked it!
These videos have literally carried my Master's thesis. Please never stop doing God's work
Thanks! and good luck!
I'd been struggling to understand neural networks until I stumbled upon this video. This is the best explanation with the best presentation (I fully agree with using easy-to-understand visualizations instead of fancy ones). I don't usually write comments, but I feel the need to thank you for this. Thank you so much!
Thank you very much! :)
You know you are contributing massively to the society when I look forward to your video releases just as much as I look forward to some of Marvel's blockbuster films!! :D
BAM! :)
@Jody Jones BAM! :)
Can I just say statquest videos >>>>> marvel films
@@unknownpleasures100 DOUBLE BAM! :)
I think the novel value of this video (for me) is that it helps bridge the gap left by 3Blue1Brown's video, where he intuits what the NN may be doing but then reveals that it actually appears random and chaotic, without the sense of order we initially assumed in order to make sense of it. This video explains that although the particular weights and biases may appear random at first glance, it is through their summation/cancellation that an order emerges. Also, kudos for introducing a new beginner example; I genuinely did not realise that there were other applications outside of image recognition, lol (I am a beginner to this!). As in, yes, it's easy for us to look at the simple data and say 'hurr durr, why not just model it with a quadratic equation', but here is a different way to derive values that fit the data on the graph. Which was new to me, so, thanks!
BAM! I'm glad you liked the video and the example. :)
This is the best explanation of neural networks that I've ever seen. Thank you!
Wow, thank you!
Each time I've wondered why I took a several-year hiatus in my master's, I was a bit sad because I never reached a satisfying conclusion. Now I see it clearly: I had to wait until StatQuest covered the whole material of my ML course to properly get it, haha! Thanks again for making it that easy!
bam! :)
Honestly I was so filled with joy when the notification came. I really missed your explanation in this topic. Thank you so much!!
@Ritik Yadav Yes, I follow that channel too. His animations are also awesome! 😍 But at the time I couldn't understand it well, because I didn't understand the prerequisites needed to understand it. That is where StatQuest came in as a savior! ❤️
Thank you very much! :)
@@statquest Sir one heart pls🥺
@@GeekyMan lol!
@@hugolousa what's so funny?
The best Neural Networks explanation I've ever seen!! I went through many, and all of them were pretty hard for me to understand. Keep up the good work, man, you're doing great.
Wow, thanks!
Your uploads are so timely for me! When you uploaded the XGB video, I was studying XGB. Now I'm studying neural networks! Thanks for this josh!
Double BAM! :)
The same for me... and I was surprised not to find a neural network video on your channel before. Thank you, StatQuest
@@statquest Triple BAM! :)
same!!
Most people have the unfortunate habit of mystifying what is simple, while you have the superpower of demystifying what is complex! Thank you for these amazing videos!
Thank you very much! :)
I'm a new data scientist who recently changed careers from clinical psychology and by God, I wish I knew about your channel and your book much sooner! You sir are a godsend.
Thanks!
Oh my, you just explained neural networks in a way that a kid would understand. Amazing!
Thanks!
Actually, you are one of the greatest teachers of all time!
You make the most complicated subjects look sooooo easy that I was having breakfast while watching this... and still understood the dreaded Neural Networks 🤣🤣
Thanks a lot, Josh, I wish you the best ❤
Wow, thank you!
This is the only video on the internet that has gone so deep into neural networks.
BAM! :)
@@statquest double that :)
By far my favourite UA-cam channel
Hooray! Thank you so much!!! :)
You are a true hero. I had watched more than 10 useless clips before I learned the intuition completely from your video. Thanks a lot
Thank you!
Love the humor! The calculation narration part was epic. Learned a lot!
bam! :)
That Helloooooooooooooo at the start of the vid is so cheerful :3
Thanks! :)
If I ever get a job after my graduation, I will donate money here... You are a GOD
bam! Good luck!
Math, graphics, and the progression shown combining the two are priceless. I couldn't get this from college professors; this video makes it clear. ...and I see how this now connects to maths: maxima, minima, calculus... I appreciate the amount of effort put into making this visually motivating video.
Thank you!
I am an Ivy League school grad student, and I can find so many people watching your videos in the library, including myself. You saved us!
BAM!
These videos are just incredible. I can't imagine the effort behind each video. I'm just starting out in this and don't doubt that your channel will be one of my first options whenever I have any doubts. The simplicity and especially the way you explain it, is totally a 10. Thank you very much.
Wow, thank you!
I have a quiz tomorrow and Zoom classes suck, so thank you. You're a lifesaver
Good luck and let me know how it goes. :)
You have been spotted
You once fitted a line
Now you fit a squiggle
"Softplus" made me laugh
Self-promotion made me giggle
STATQUEESTT!
TRIPLE BAM!!! I love the poem. :)
@@statquest Thanks!! It was actually a rendition of the logistic regression video's "awesome song and introduction" 😅
@@J10098 :)
I'm so glad I found this gem. You actually provided the explanation that I had been seeking. The internet is flooded with people trying to explain neural networks who have no idea themselves and just parrot other people's PowerPoint slides.
Thanks!
one of the best videos on neural nets. thank you
Hooray!!! Thank you so much for supporting StatQuest!!! TRIPLE BAM!!! :)
Please don't stop posting videos like this. I'm planning to become a machine learning engineer with the help of your videos, which are more than enough. What a man you are! Oh my god! Hats off!!!!!!!!!!!!!
Thanks!
To be awarded the Nobel Prize in psychology. Reason: "not allowing people to go mad learning neural networks from classical textbooks"
BAM!
Perfect timing for me, I was just starting with neural networks. DOUBLE BAM!!!
Awesome! :)
You literally saved my life, and other human beings' too *crying inside*. Thank you indeed
Thanks!
To those of you who are still considering whether to watch this tutorial or not: it is well explained, and there is nothing but benefit in watching it. As the lecturer says at the beginning of the video, he simplifies the "hard" terminology and mathematical notation. He even emphasises the important parts by saying BAM, double BAM, and triple BAM, which is actually a kind of energy boost since it is funny.
This is my third time watching this tutorial: the first time was to get a general understanding of the topic, the second time to apply the mathematical knowledge. This time I'll apply the mathematical knowledge and code the algorithm from scratch. Wish me luck.
Lastly, thanks for the tutorials. I've used many of them, especially the statistical ones, which helped me get a high grade at university
TRIPLE BAM! :)
Wow! This tutorial was an amazing asset for revising my concepts.
Also, I haven't seen such an amazing teacher, one who clarifies these messy concepts and makes them easy for us to understand.
Thanks a lot, Sir Josh Starmer 🙂
Wow, thanks!
These videos have explained the ideas so clearly in such a simple way. They literally helped me pass my actuarial exams. A big thank you for your time and amazing work!!
Wow!! Congratulations!!! TRIPLE BAM! :)
We just enjoy learning and playing with knowledge with Josh. This is all because of his efforts, we're learning so well. Hats off to this man, spending his time spreading knowledge :D
Thank you very much! :)
I always had some doubts about the mathematics behind neural networks, and this video helped me so much with those! Next-tier content!!
Glad it helped!
Just wow. I have never seen Neural networks being explained so neatly. You have my respects.
Thank you!
Good afternoon, Josh! I am another student who came here to express immense gratitude for the explanations and the work you put in here. Thank you!!
Thank you very much! :)
Yes Josh!!! Finally on Neural Network! You're awesome!!
Hooray!!! Thanks!
The intro is what gives me joy. Everything feels so easy when Josh the God explains.
Thanks! :)
Please release a full album of your neural net calculation montage singing.
Can’t wait for the next video.
You made me laugh. Can you imagine a full 45 minutes of me beep booping? :)
@@statquest Here's hoping
When somebody says "learning is fun", these videos seriously are the best proof. In lectures I've always felt overwhelmed by the maths, and watching your videos breaks it down with such simplicity that you gain the necessary foundation to understand the complexity that follows. Thank you so much, and happy new year 2024!!!
Thank you!
Just started a PhD in astronomy, and I know your channel is going to be a pillar of my success. Cheers, mate!
Thanks and good luck with your PhD!
Been waiting for this topic ever since I discovered your channel a year ago. Can't wait to see the next parts!!!! Thank you, an amazing job as always :D
Thank you very much! :)
StatQuest with Josh Starmer Me too, I've been longing for this topic
I won't mind if there are more ads as long as I get such quality content. Please try to upload all the DL videos so I can prepare for my interview.
Thanks
I'm working as quickly as I can and hope to have part 2 out in a few weeks.
@@statquest Also please create a playlist section for them so we can find them easily. thank you!
@@deathkillertech Will do!
What a perfect time to upload a perfect topic. Thanks a lot, Josh
Thank you! :)
I clicked on your video so fast once I saw you covered neural networks... it is so reassuring to know all the answers will be in one place.
bam!
Wow, finally someone decoded the black box. I did a Master's in Data Science and still could never truly understand neural networks. Thank you Josh
Thanks!
Thanks, Josh, for helping remove the gatekeeping around machine learning! It is often held up as this thing only prodigies or math wizards can do, but you are helping to make it accessible for everyone! Keep up the amazing work.
Thank you very much! :)
I wish I could thumbs up your comment 10 times!
The usage of "gatekeeping" just gets dumber and dumber over time.
Superbly explained; this man is just amazing.
Thank you!
I'm going to present my master's degree defence next month and this video, and the ones about PCA and SVM, helped me *a lot* to break down these complicated processes into something that I can actually try and explain to the professors that will be evaluating me. Thank you SO MUCH for that!
Awesome and good luck with your defense! Let me know how it goes. :)
@@statquest Thank you, I will! :D
@@statquest Just thought I'd let you know that I was approved and now have a MSc degree in biochemistry and molecular biology! Thank you very much for your help! 😃
@@lodjr TRIPLE BAM!!! Congratulations!!! That is awesome. Raise a glass for me when you celebrate! BAM!
@@statquest Thank you! I will! 😃
By far, and I mean by far, this is the best, most complete, and most accurate explanation of what a neural network is. No one else explains it like that. Even seasoned data scientists don't know exactly, in detail, what the deal is with ANNs
Thank you!
You sir, are a gift to many confused students in this field..!!
Thanks!
Thank You so much for making this series, Love from India.❤️❤️
Thank you! :)
These videos are so helpful. I’m happy I found your channel. INFINITE BAM 💥
Hooray! :)
No matter where you come from... you will definitely get something new, something interesting to learn from him...
BAM! :)
Best series on machine learning topics I have ever seen. Clearly and precisely explained. Just awesome!
Wow, thanks!
Each StatQuest video is better than the one before it. Easiest sub of my life right here
Bam! :)
Great energy, willingness to spread concepts and knowledge!
Thanks!
The best explanation for deep learning ever, can't wait for the part 2 video. Thank you for your wonderful job =)
Thank you! 😃
That was a great explanation. I hope you make videos more frequently and expand the NN content to advanced topics.
That's the plan! :)
Possibly the best explanation of neural networks I've ever seen. You condensed the concept into simplicity so effortlessly! Thanks a lot, Josh sir!
Thank you!
Man, if only I had discovered this channel a while back when I was taking my stats and data analysis classes, I wouldn't have performed so terribly x) Good thing I found out about it now, so I can still re-learn everything I missed, now that I need that knowledge the most. Thank you so much, Mr Starmer, you and your videos are wholesome beyond belief
Thank you very much! :)
15:49 - 16:18 -- someone's trying to break into the hyperpop/electro-pop scene with a brand new single :D
:)
I love your videos, thank you so much for getting us non experts closer to knowledge :)
Glad you like them!
Now that was clarifying. And there are more parts in this playlist :) The amount of work you've put into these tutorials alone is worth a subscription. Thanks a lot
Thank you!
My friend recommended this channel. You are so brilliant! The singing was a perfect touch! I am just learning these concepts, and all the professors assume we are supposed to understand what's going on from the get-go, even if it's a concept I'm encountering for the first time in my life! Thank you!
Happy to help!
Amazing, Josh! I wish you and your channel had existed in my life when I was younger! I'd definitely have made different career choices with this way of looking at mathematics and statistics that you present in your videos! 🇧🇷🇧🇷🇧🇷🇧🇷
Muito obrigado! :)
Boa man
I have a pretty chill PhD supervisor, so I'm wondering if I can just get away with calling them 'big fancy squiggle fitting machines' for the entirety of my thesis...
That would be awesome! BAM! :)
Just a simple comment. Need heart asap.
BAM!
Hello, Josh. Your videos are really helpful for my research and I like them so much. Btw, I really want to see you explain the mathematical theories and terminology behind Time Series Data Analysis, such as ARIMA, SARIMA, or some other machine learning algorithms! BAM me! Thank you! (also from a model family fan, I guess you are too :)
I hope to cover those topics sometime soon.
A neural network consists of nodes, weights (like slopes), and biases (like y-intercepts), and fitting one is like finding a non-straight curve that fits your data.
You start with unknown w's and b's and estimate their values through backpropagation.
There are input, hidden, and output layers. The hidden layers' nodes use activation functions like sigmoid, ReLU, and softplus, which are curves themselves, and the network 'adds' them together to get a final curve as the output used to predict the data.
The input layer has x. x becomes wx + b, and then you apply the activation function to get a y value, f(wx + b). [For ReLU it's max(0, x); for sigmoid it's exp(x) / (exp(x) + 1); and for softplus it's ln(1 + exp(x)).]
You do this for all x values in the range, and you get a range of y values that actually correspond to a very small slice of the activation function [e.g., x goes from 0 to 1, but wx + b maps onto a much smaller portion of the activation function's input, so different slices of the activation function can give rise to different-looking curves].
Each node in the hidden layers gives you one kind of curve, and the calculation keeps going for however many nodes are necessary. Skipping to the last hidden layer: the curves have been calculated, and the intermediate y values are ready. Now add them all together to get a different-looking final curve; the y values corresponding to this final curve are your output (a quick sketch of this idea follows below).
Binary classification: if that y value is close to 1, predict success; if it is close to 0, predict failure.
bam
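To make the summary above concrete, here is a minimal Python sketch of the "add the bent curves together" idea. The two-node hidden layer, the softplus activation, and the specific weights and biases are illustrative assumptions for this sketch, not values quoted from the comment or claimed to be the video's exact numbers.

```python
# Minimal sketch of one forward pass through a tiny network: each hidden node
# produces its own bent softplus curve, and the output is a weighted sum of
# those curves plus a final bias, i.e. the "squiggle" that fits the data.
# All parameter values below are made up for illustration.
import math

def softplus(z):
    return math.log(1.0 + math.exp(z))

hidden = [(-34.4, 2.14), (-2.52, 1.29)]  # (weight, bias) into each hidden node
out_w  = [-1.30, 2.28]                   # weights from hidden nodes to the output node
out_b  = -0.58                           # final bias

def squiggle(x):
    y = out_b
    for (w, b), v in zip(hidden, out_w):
        y += v * softplus(w * x + b)     # each node contributes one bent curve
    return y

# Evaluate the fitted curve over the input range 0..1; values near 1 would be
# read as "success" and values near 0 as "failure" in a binary classification.
for i in range(11):
    x = i / 10.0
    print(f"x = {x:.1f}  ->  y = {squiggle(x):.3f}")
```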
Josh, that was simply fantastic. I've watched a lot of videos and taken a course in NNs, and I've never seen an explanation like this. This was exactly what I was looking for. I don't know how you know what you know, or why no one else seems to be able to teach like you do, but THANK YOU!!!
Awesome, thank you!
Great Josh.
May Almighty Allah reward you.
Thank you!
Your voice could be used for Siri or Google Assistant when played at 1.5x. Good luck starting a new career as an artist.
:)
I have done tons of coursework on NNs, worked with them for years but you taught me something I never even thought to ask. Amazing job Josh, well done and thank you!
Thank you!
This is the best way to visualize the theory of a machine learning algorithm. Thank you Sir.
Thank you!
Whenever I go out with friends, in some way Starmer's name comes up hahaha. Thank you for this
BAM! :)
same!
@@saraaltamirano BAM! And thank you for your support!!! BAM! :)
Josh Starmer could become the Bill Nye, the Science Guy of Statistics
Bam!
Facts!!
I was waiting for this topic Sir, Thanks a lot.....
:)
WTH. That's all maths. I am quitting neural networks and ML. I guess I gotta open a general store instead of being an AI researcher
@@yashaswikulshreshtha1588 Bro, if you don't like maths then maybe you are not made for AI/ML (but if you work hard you can learn... you just need focus)
Life is sorted when we have a tutor like you. Love your videos
Thanks!
I cannot say it enough how much I feel grateful towards you for making these videos. God bless you
Wow, thank you!
Thank you Josh for such a wonderful and detailed explanation of NN, very helpful. Can't wait for Part 2 !! 🙂
I'm working on it right now and hope to have it out soon.
Just what I needed, epic
BAM! :)
I'm gonna use that in my curriculum:
Proficient in Big Fancy Squiggle Fitting Machines
Bam! :)
Green
I have a coursework deadline coming up and this video has SAVED ME
Good luck with your coursework! :)
You made me laugh, got me interested, and BAAMM! I'm studying a Machine Learning course and have a solid grounding in stats, so I needed a couple of in-depth videos, and I found you. This channel is amazing, I'm gonna watch everything. You're awesome.
Thank you very much! All of my videos are organized on this page: statquest.org/video-index/
I wish you a lot of fame and money.
Thank you!