Machine Learning / Deep Learning Tutorials for Programmers playlist: ua-cam.com/play/PLZbbT5o_s2xq7LwI2y8_QtvuXZedL6tQU.html
Keras Machine Learning / Deep Learning Tutorial playlist: ua-cam.com/play/PLZbbT5o_s2xrwRnXk_yCPtnqqo4_u2YGL.html
The explanation couldn't have been better! You nailed it! Thank you.
Best video regarding this topic I have found so far - thanks! 🙂
Thank you very much for this video! The explanation of what overfitting is, as well as the possible methods to tackle it, really helps!
Thank you so so much! Your videos are so short, concise and relatable.
Thanks to you I graduated! 🎓 @deeplizard
Fantastic! I finally understand the fundamentals of DL.
So comprehensive, yet short and to the point. Thanks so much for taking the time to make these videos.
Thank you so much, that was really clear, and very helpful!
You're welcome, Zakaria! Glad the video was helpful!
Thank you so much for sharing your knowledge and making these videos! This is the best course I've found so far for learning the fundamentals! (I haven't tried advanced courses, but now I have high expectations.) Always 10000% to the point, and no words wasted.
You're so welcome, Amogh! After completing the fundamentals course, you'll be set to move on to the various other deep learning courses available on deeplizard.com. Browse the site home page to get an overview of the courses and see the recommended order for which to take them.
Wow, the explanation is so clear! Thank you :)
Beautiful video! Thanks!
Great explanation, ma'am, thank you; it really helped me out. I was not able to answer a question in the Nvidia DLI course. Thank you, ma'am, keep going. Whenever possible I'll contribute to your Patreon, it's that good.
thank you very much for this clear and helpful explanation.
Nice explanation, thanks for your video.
Awesome teaching
Love this, very useful!
best channel ever
Thanks! Very helpful.
Hi, thanks! What about adding noise? Does this help with overfitting?
Perhaps, but this would only be useful if the data that you would be using in production with your model were inherently noisy.
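For anyone curious, a minimal sketch of what noise injection could look like in Keras (the input and layer sizes here are placeholders, not from the video). The GaussianNoise layer is only active during training:

    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(16,)),    # placeholder input size
        tf.keras.layers.GaussianNoise(0.1),    # adds zero-mean Gaussian noise during training only
        tf.keras.layers.Dense(32, activation='relu'),
        tf.keras.layers.Dense(2, activation='softmax'),
    ])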
just thanks a lot
Awesome explanation.
Well explained. Thanks a lot :)
I was working on a text classification problem with an ANN where I ran into an overfitting problem. My model had 99 percent accuracy on the training set and just 75 percent on the test set. After performing parameter tuning (setting dropout, number of epochs, weight transforms, etc.), training accuracy was reduced to 94-95 percent and testing accuracy rose to 78. I was able to reduce overfitting to a small extent, but what more can I do to improve it? Please help.
Really well explained!
Thanks, Ahmed!
If the validation metrics are considerably worse than the training metrics, then that is an indication that our model is overfitting.
How do we know this from looking at the metrics? How do you know in this video?
You can compare the validation accuracy to the training accuracy. If the validation accuracy is considerably lower than the training accuracy, then that indicates overfitting.
@@deeplizard I have a similar question. Looking at the model.fit output above, how can we say it is actually overfitting, or is it OK to have a certain amount of up and down? If we have 85% accuracy in training and 82% in validation, do you consider that overfitting or normal?
Can you give some more suggestions on how to analyze model performance based on the training and validation metrics? What should we look at, and by how much?
Waiting for your reply :)
@@deeplizard And how do you estimate whether it is "considerably" lower? Is there a criterion or test? Because it sounds like it would just be a personal judgment. Thanks in advance!
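For anyone who wants to check this in code, a minimal sketch (assuming a compiled Keras model with metrics=['accuracy']; model, x_train, and y_train are placeholders) of comparing the training and validation accuracy discussed in this thread:

    history = model.fit(x_train, y_train, validation_split=0.1, epochs=20, verbose=0)

    # TF2 Keras logs 'accuracy'/'val_accuracy'; older Keras versions use 'acc'/'val_acc'
    train_acc = history.history['accuracy'][-1]
    val_acc = history.history['val_accuracy'][-1]

    # There is no formal test; a gap of more than a few percentage points is a
    # common rule of thumb, but "considerably lower" is ultimately a judgment call.
    print(f'train: {train_acc:.3f}  val: {val_acc:.3f}  gap: {train_acc - val_acc:.3f}')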
Early stopping also helps! :)
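A minimal sketch (assuming TensorFlow/Keras; model, x_train, and y_train are placeholders) of the early stopping mentioned above, which halts training once the validation loss stops improving:

    import tensorflow as tf

    early_stop = tf.keras.callbacks.EarlyStopping(
        monitor='val_loss',         # watch validation loss each epoch
        patience=5,                 # stop after 5 epochs with no improvement
        restore_best_weights=True,  # roll back to the best epoch's weights
    )

    model.fit(x_train, y_train, validation_split=0.1, epochs=100, callbacks=[early_stop])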
Thank you for making these videos! But is dropout still popular nowadays?
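For reference, a minimal sketch of dropout in Keras (layer sizes are placeholders); each Dropout layer randomly zeroes a fraction of activations during training only:

    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(16,)),
        tf.keras.layers.Dense(64, activation='relu'),
        tf.keras.layers.Dropout(0.5),   # drops 50% of activations during training
        tf.keras.layers.Dense(2, activation='softmax'),
    ])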
Thanks a lot
Sooooooooooooooooo great video
How can reducing the number of layers or the number of neurons help reduce overfitting of the model?
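A minimal sketch (assuming Keras; all layer sizes are placeholders) contrasting model sizes. Fewer layers and neurons mean fewer trainable parameters, so the model has less capacity to memorize the training set and is pushed toward more general patterns:

    import tensorflow as tf

    # Larger model: many parameters, enough capacity to memorize training noise
    big = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(16,)),
        tf.keras.layers.Dense(256, activation='relu'),
        tf.keras.layers.Dense(256, activation='relu'),
        tf.keras.layers.Dense(2, activation='softmax'),
    ])

    # Smaller model: far fewer parameters to fit the same data
    small = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(16,)),
        tf.keras.layers.Dense(32, activation='relu'),
        tf.keras.layers.Dense(2, activation='softmax'),
    ])

    print(big.count_params(), small.count_params())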
How do we use data augmentation for text data?
I have the same question.
Thanks very much. Love you :)
Your videos are so good. Thank you very much ❤️ ...I want to donate.
You're welcome, Indranil! And thank you!
We have a Patreon set up for anyone who'd like to contribute:
deeplizard.com/hivemind
❤
{
"question": "All of the following can help reduce overfitting EXCEPT:",
"choices": [
"Greater availability of labels",
"Increased data diversity",
"Data augmentation",
"Reduced model complexity"
],
"answer": "Greater availability of labels",
"creator": "Chris",
"creationDate": "2019-12-11T04:46:40.163Z"
}
Thanks, Chris! Just added your question to deeplizard.com
Well, it's not mentioned, but maybe greater availability of labels doesn't reduce overfitting? Even though it would most likely reduce the accuracy either way...
@@-dialecticsforkids2978 Looking back, I'm not totally sure what I meant when I said, "Greater availability of labels" haha. In this case, as is explored in "Supervised Learning Explained," we're examining supervised learning, so labels are available in every instance of the training data. Really, "Greater availability of labels" is just a nonsensical answer because availability cannot increase. If we think of it as an increased _number_ of labels, however, which is what I gather from your answer, then yes, it would help reduce overfitting because that is a subset of increased data diversity.
Nice
ECE 449 UofA
Nice video, but you need to decrease the speech speed a little bit.
I have in later videos :)
Also, each video has a corresponding written blog on deeplizard.com that you can check out for a slower pace :D
you can push SHIFT + < to slow it down
@@carlosmontesparra8548 Thanks but it didn't work
for the alg
Hello all,
Would appreciate some eyes on this:
stackoverflow.com/questions/59129348/how-to-deal-with-overfitting-with-simple-x-y-data-in-mlpregressor
Just trying to figure out what I'm doing wrong with a simple X,Y neural net; it should be no problem for you guys. Thanks everyone, and thank you deeplizard for the video.
Subbed to channel
Still not getting the essence of overfitting
Overfitting: it's when you get used to hearing "like this video and subscribe to the channel" at the end of all UA-cam videos, and then when you don't hear it, you think it's not a UA-cam video :)
Lol